diff --git a/.claude/commands/post-release-testing.md b/.claude/commands/post-release-testing.md index 10cf6ff..09e2b6a 100644 --- a/.claude/commands/post-release-testing.md +++ b/.claude/commands/post-release-testing.md @@ -43,7 +43,7 @@ Manually verify the install script works in a fresh environment: ```bash docker run --rm alpine:latest sh -c " apk add --no-cache curl ca-certificates libstdc++ libgcc bash && - curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && + curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh && sandbox-agent --version " ``` diff --git a/.env.development.example b/.env.development.example index 0ae0f58..c4132f4 100644 --- a/.env.development.example +++ b/.env.development.example @@ -23,9 +23,6 @@ GITHUB_APP_PRIVATE_KEY= # Webhook secret for verifying GitHub webhook payloads. # Use smee.io for local development: https://smee.io/new GITHUB_WEBHOOK_SECRET= -# Required for local GitHub webhook forwarding in compose.dev. -SMEE_URL= -SMEE_TARGET=http://backend:7741/v1/webhooks/github # Fill these in when enabling live Stripe billing. STRIPE_SECRET_KEY= diff --git a/.gitignore b/.gitignore index de4d863..7b6c859 100644 --- a/.gitignore +++ b/.gitignore @@ -59,4 +59,3 @@ sdks/cli/platforms/*/bin/ # Foundry desktop app build artifacts foundry/packages/desktop/frontend-dist/ foundry/packages/desktop/src-tauri/sidecars/ -.context/ diff --git a/CLAUDE.md b/CLAUDE.md index 248f075..26dfa28 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -1,5 +1,40 @@ # Instructions +## ACP v1 Baseline + +- v1 is ACP-native. +- `/v1/*` is removed and returns `410 Gone` (`application/problem+json`). +- `/opencode/*` is disabled during ACP core phases and returns `503`. 
+- Prompt/session traffic is ACP JSON-RPC over streamable HTTP on `/v1/rpc`: + - `POST /v1/rpc` + - `GET /v1/rpc` (SSE) + - `DELETE /v1/rpc` +- Control-plane endpoints: + - `GET /v1/health` + - `GET /v1/agents` + - `POST /v1/agents/{agent}/install` +- Binary filesystem transfer endpoints (intentionally HTTP, not ACP extension methods): + - `GET /v1/fs/file` + - `PUT /v1/fs/file` + - `POST /v1/fs/upload-batch` +- Sandbox Agent ACP extension method naming: + - Custom ACP methods use `_sandboxagent/...` (not `_sandboxagent/v1/...`). + - Session detach method is `_sandboxagent/session/detach`. + +## API Scope + +- ACP is the primary protocol for agent/session behavior and all functionality that talks directly to the agent. +- ACP extensions may be used for gaps (for example `skills`, `models`, and related metadata), but the default is that agent-facing behavior is implemented by the agent through ACP. +- Custom HTTP APIs are for non-agent/session platform services (for example filesystem, terminals, and other host/runtime capabilities). +- Filesystem and terminal APIs remain Sandbox Agent-specific HTTP contracts and are not ACP. + - Do not make Sandbox Agent core flows depend on ACP client implementations of `fs/*` or `terminal/*`; in practice those client-side capabilities are often incomplete or inconsistent. + - ACP-native filesystem and terminal methods are also too limited for Sandbox Agent host/runtime needs, so prefer the native HTTP APIs for richer behavior. +- Keep `GET /v1/fs/file`, `PUT /v1/fs/file`, and `POST /v1/fs/upload-batch` on HTTP: + - These are Sandbox Agent host/runtime operations with cross-agent-consistent behavior. + - They may involve very large binary transfers that ACP JSON-RPC envelopes are not suited to stream. + - This is intentionally separate from ACP native `fs/read_text_file` and `fs/write_text_file`. + - ACP extension variants may exist in parallel, but SDK defaults should prefer HTTP for these binary transfer operations. 
+ ## Naming and Ownership - This repository/product is **Sandbox Agent**. @@ -13,14 +48,66 @@ - Never mention "ACP" in user-facing docs (`docs/**/*.mdx`) except in docs that are specifically about ACP itself (e.g. `docs/acp-http-client.mdx`). - Never expose underlying protocol method names (e.g. `session/request_permission`, `session/create`, `_sandboxagent/session/detach`) in non-ACP docs. Describe the behavior in user-facing terms instead. - Do not describe the underlying protocol implementation in docs. Only document the SDK surface (methods, types, options). ACP protocol details belong exclusively in ACP-specific pages. -- Do not use em dashes (`—`) in docs. Use commas, periods, or parentheses instead. -### Docs Source Of Truth (HTTP/CLI) +## Architecture (Brief) +- HTTP contract and problem/error mapping: `server/packages/sandbox-agent/src/router.rs` +- ACP client runtime and agent process bridge: `server/packages/sandbox-agent/src/acp_runtime/mod.rs` +- Agent/native + ACP agent process install and lazy install: `server/packages/agent-management/` +- Inspector UI served at `/ui/` and bound to ACP over HTTP from `frontend/packages/inspector/` + +## TypeScript SDK Architecture + +- TypeScript clients are split into: + - `acp-http-client`: protocol-pure ACP-over-HTTP (`/v1/acp`) with no Sandbox-specific HTTP helpers. + - `sandbox-agent`: `SandboxAgent` SDK wrapper that combines ACP session operations with Sandbox control-plane and filesystem helpers. +- `SandboxAgent` entry points are `SandboxAgent.connect(...)` and `SandboxAgent.start(...)`. +- Stable Sandbox session methods are `createSession`, `resumeSession`, `resumeOrCreateSession`, `destroySession`, `rawSendSessionMethod`, `onSessionEvent`, `setSessionMode`, `setSessionModel`, `setSessionThoughtLevel`, `setSessionConfigOption`, `getSessionConfigOptions`, `getSessionModes`, `respondPermission`, `rawRespondPermission`, and `onPermissionRequest`. 
+- `Session` helpers are `prompt(...)`, `rawSend(...)`, `onEvent(...)`, `setMode(...)`, `setModel(...)`, `setThoughtLevel(...)`, `setConfigOption(...)`, `getConfigOptions()`, `getModes()`, `respondPermission(...)`, `rawRespondPermission(...)`, and `onPermissionRequest(...)`. +- Cleanup is `sdk.dispose()`. + +### React Component Methodology + +- Shared React UI belongs in `sdks/react` only when it is reusable outside the Inspector. +- If the same UI pattern is shared between the Sandbox Agent Inspector and Foundry, prefer extracting it into `sdks/react` instead of maintaining parallel implementations. +- Keep shared components unstyled by default: behavior in the package, styling in the consumer via `className`, slot-level `classNames`, render overrides, and `data-*` hooks. +- Prefer extracting reusable pieces such as transcript, composer, and conversation surfaces. Keep Inspector-specific shells such as session selection, session headers, and control-plane actions in `frontend/packages/inspector/`. +- Document all shared React components in `docs/react-components.mdx`, and keep that page aligned with the exported surface in `sdks/react/src/index.ts`. + +### TypeScript SDK Naming Conventions + +- Use `respond(id, reply)` for SDK methods that reply to an agent-initiated request (e.g. `respondPermission`). This is the standard pattern for answering any inbound JSON-RPC request from the agent. +- Prefix raw/low-level escape hatches with `raw` (e.g. `rawRespondPermission`, `rawSend`). These accept protocol-level types directly and bypass SDK abstractions. + +### Docs Source Of Truth + +- For TypeScript docs/examples, source of truth is implementation in: + - `sdks/typescript/src/client.ts` + - `sdks/typescript/src/index.ts` + - `sdks/acp-http-client/src/index.ts` +- Do not document TypeScript APIs unless they are exported and implemented in those files. 
- For HTTP/CLI docs/examples, source of truth is: - `server/packages/sandbox-agent/src/router.rs` - `server/packages/sandbox-agent/src/cli.rs` -- Keep docs aligned to implemented endpoints/commands only (for example ACP under `/v1/acp`, not legacy session REST APIs). +- Keep docs aligned to implemented endpoints/commands only (for example ACP under `/v1/acp`, not legacy `/v1/sessions` APIs). + +## ACP Protocol Compliance + +- Before adding any new ACP method, property, or config option category to the SDK, verify it exists in the ACP spec at `https://agentclientprotocol.com/llms-full.txt`. +- Valid `SessionConfigOptionCategory` values are: `mode`, `model`, `thought_level`, `other`, or custom categories prefixed with `_` (e.g. `_permission_mode`). +- Do not invent ACP properties or categories (e.g. `permission_mode` is not a valid ACP category — use `_permission_mode` if it's a custom extension, or use existing ACP mechanisms like `session/set_mode`). +- `NewSessionRequest` only has `_meta`, `cwd`, and `mcpServers`. Do not add non-ACP fields to it. +- Sandbox Agent SDK abstractions (like `SessionCreateRequest`) may add convenience properties, but must clearly map to real ACP methods internally and not send fabricated fields over the wire. + +## Source Documents + +- ACP protocol specification (full LLM-readable reference): `https://agentclientprotocol.com/llms-full.txt` +- `~/misc/acp-docs/schema/schema.json` +- `~/misc/acp-docs/schema/meta.json` +- `research/acp/spec.md` +- `research/acp/v1-schema-to-acp-mapping.md` +- `research/acp/friction.md` +- `research/acp/todo.md` ## Change Tracking @@ -32,20 +119,14 @@ - Append blockers/decisions to `research/acp/friction.md` during ACP work. - `docs/agent-capabilities.mdx` lists models/modes/thought levels per agent. Update it when adding a new agent or changing `fallback_config_options`. If its "Last updated" date is >2 weeks old, re-run `cd scripts/agent-configs && npx tsx dump.ts` and update the doc to match. 
Source data: `scripts/agent-configs/resources/*.json` and hardcoded entries in `server/packages/sandbox-agent/src/router/support.rs` (`fallback_config_options`). - Some agent models are gated by subscription (e.g. Claude `opus`). The live report only shows models available to the current credentials. The static doc and JSON resource files should list all known models regardless of subscription tier. +- TypeScript SDK tests should run against a real running server/runtime over real `/v1` HTTP APIs, typically using the real `mock` agent for deterministic behavior. +- Do not use Vitest fetch/transport mocks to simulate server functionality in TypeScript SDK tests. -## Docker Test Image +## Docker Examples (Dev Testing) -- Docker-backed Rust and TypeScript tests build `docker/test-agent/Dockerfile` directly in-process and cache the image tag only in memory (`OnceLock` in Rust, module-level variable in TypeScript). -- Do not add cross-process image-build scripts unless there is a concrete need for them. - -## Common Software Sync - -- These three files must stay in sync: - - `docs/common-software.mdx` (user-facing documentation) - - `docker/test-common-software/Dockerfile` (packages installed in the test image) - - `server/packages/sandbox-agent/tests/common_software.rs` (test assertions) -- When adding or removing software from `docs/common-software.mdx`, also add/remove the corresponding `apt-get install` line in the Dockerfile and add/remove the test in `common_software.rs`. -- Run `cargo test -p sandbox-agent --test common_software` to verify. +- When manually testing bleeding-edge (unreleased) versions of sandbox-agent in `examples/`, use `SANDBOX_AGENT_DEV=1` with the Docker-based examples. +- This triggers a local build of `docker/runtime/Dockerfile.full` which builds the server binary from local source and packages it into the Docker image. 
+- Example: `SANDBOX_AGENT_DEV=1 pnpm --filter @sandbox-agent/example-mcp start` ## Install Version References @@ -78,3 +159,4 @@ - `scripts/release/main.ts` - `scripts/release/promote-artifacts.ts` - `scripts/release/sdk.ts` + - `scripts/sandbox-testing/test-sandbox.ts` diff --git a/Cargo.toml b/Cargo.toml index 0fc4dc8..c353c2c 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -4,7 +4,7 @@ members = ["server/packages/*", "gigacode"] exclude = ["factory/packages/desktop/src-tauri", "foundry/packages/desktop/src-tauri"] [workspace.package] -version = "0.4.2" +version = "0.3.2" edition = "2021" authors = [ "Rivet Gaming, LLC " ] license = "Apache-2.0" @@ -13,13 +13,13 @@ description = "Universal API for automatic coding agents in sandboxes. Supports [workspace.dependencies] # Internal crates -sandbox-agent = { version = "0.4.2", path = "server/packages/sandbox-agent" } -sandbox-agent-error = { version = "0.4.2", path = "server/packages/error" } -sandbox-agent-agent-management = { version = "0.4.2", path = "server/packages/agent-management" } -sandbox-agent-agent-credentials = { version = "0.4.2", path = "server/packages/agent-credentials" } -sandbox-agent-opencode-adapter = { version = "0.4.2", path = "server/packages/opencode-adapter" } -sandbox-agent-opencode-server-manager = { version = "0.4.2", path = "server/packages/opencode-server-manager" } -acp-http-adapter = { version = "0.4.2", path = "server/packages/acp-http-adapter" } +sandbox-agent = { version = "0.3.2", path = "server/packages/sandbox-agent" } +sandbox-agent-error = { version = "0.3.2", path = "server/packages/error" } +sandbox-agent-agent-management = { version = "0.3.2", path = "server/packages/agent-management" } +sandbox-agent-agent-credentials = { version = "0.3.2", path = "server/packages/agent-credentials" } +sandbox-agent-opencode-adapter = { version = "0.3.2", path = "server/packages/opencode-adapter" } +sandbox-agent-opencode-server-manager = { version = "0.3.2", path = 
"server/packages/opencode-server-manager" } +acp-http-adapter = { version = "0.3.2", path = "server/packages/acp-http-adapter" } # Serialization serde = { version = "1.0", features = ["derive"] } diff --git a/README.md b/README.md index cf9b933..d4bfc61 100644 --- a/README.md +++ b/README.md @@ -80,11 +80,11 @@ Import the SDK directly into your Node or browser application. Full type safety **Install** ```bash -npm install sandbox-agent@0.4.x +npm install sandbox-agent@0.3.x ``` ```bash -bun add sandbox-agent@0.4.x +bun add sandbox-agent@0.3.x # Optional: allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -135,7 +135,7 @@ Run as an HTTP server and connect from any language. Deploy to E2B, Daytona, Ver ```bash # Install it -curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh +curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh # Run it sandbox-agent server --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 2468 ``` @@ -159,12 +159,12 @@ sandbox-agent server --no-token --host 127.0.0.1 --port 2468 Install the CLI wrapper (optional but convenient): ```bash -npm install -g @sandbox-agent/cli@0.4.x +npm install -g @sandbox-agent/cli@0.3.x ``` ```bash # Allow Bun to run postinstall scripts for native binaries. 
-bun add -g @sandbox-agent/cli@0.4.x +bun add -g @sandbox-agent/cli@0.3.x bun pm -g trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -179,11 +179,11 @@ sandbox-agent api sessions send-message-stream my-session --message "Hello" --en You can also use npx like: ```bash -npx @sandbox-agent/cli@0.4.x --help +npx @sandbox-agent/cli@0.3.x --help ``` ```bash -bunx @sandbox-agent/cli@0.4.x --help +bunx @sandbox-agent/cli@0.3.x --help ``` [CLI documentation](https://sandboxagent.dev/docs/cli) @@ -277,7 +277,7 @@ Coding agents expect interactive terminals with proper TTY handling. SSH with pi - **Storage of sessions on disk**: Sessions are already stored by the respective coding agents on disk. It's assumed that the consumer is streaming data from this machine to an external storage, such as Postgres, ClickHouse, or Rivet. - **Direct LLM wrappers**: Use the [Vercel AI SDK](https://ai-sdk.dev/docs/introduction) if you want to implement your own agent from scratch. - **Git Repo Management**: Just use git commands or the features provided by your sandbox provider of choice. -- **Sandbox Provider API**: Sandbox providers have many nuanced differences in their API, it does not make sense for us to try to provide a custom layer. Instead, we opt to provide guides that let you integrate this repository with sandbox providers. +- **Sandbox Provider API**: Sandbox providers have many nuanced differences in their APIs, so it does not make sense for us to try to provide a custom layer. Instead, we opt to provide guides that let you integrate this project with sandbox providers.
## Roadmap diff --git a/docker/inspector-dev/Dockerfile b/docker/inspector-dev/Dockerfile deleted file mode 100644 index b55923f..0000000 --- a/docker/inspector-dev/Dockerfile +++ /dev/null @@ -1,7 +0,0 @@ -FROM node:22-bookworm-slim - -RUN npm install -g pnpm@10.28.2 - -WORKDIR /app - -CMD ["bash", "-lc", "pnpm install --filter @sandbox-agent/inspector... && cd frontend/packages/inspector && exec pnpm vite --host 0.0.0.0 --port 5173"] diff --git a/docker/release/linux-aarch64.Dockerfile b/docker/release/linux-aarch64.Dockerfile index d5ff208..412e6c0 100644 --- a/docker/release/linux-aarch64.Dockerfile +++ b/docker/release/linux-aarch64.Dockerfile @@ -10,6 +10,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -20,13 +21,15 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript -# Build cli-shared, acp-http-client, SDK, then react (depends on SDK) +# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK) RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build diff --git a/docker/release/linux-x86_64.Dockerfile b/docker/release/linux-x86_64.Dockerfile index 1c41711..323e471 100644 --- a/docker/release/linux-x86_64.Dockerfile +++ b/docker/release/linux-x86_64.Dockerfile @@ -10,6 +10,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -20,13 +21,15 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript -# Build cli-shared, acp-http-client, SDK, then react (depends on SDK) +# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK) RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build diff --git a/docker/release/macos-aarch64.Dockerfile b/docker/release/macos-aarch64.Dockerfile index 5d918b2..000157e 100644 --- a/docker/release/macos-aarch64.Dockerfile +++ b/docker/release/macos-aarch64.Dockerfile @@ -10,6 +10,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -20,13 +21,15 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript -# Build cli-shared, acp-http-client, SDK, then react (depends on SDK) +# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK) RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build diff --git a/docker/release/macos-x86_64.Dockerfile b/docker/release/macos-x86_64.Dockerfile index 9b52aa6..9082018 100644 --- a/docker/release/macos-x86_64.Dockerfile +++ b/docker/release/macos-x86_64.Dockerfile @@ -10,6 +10,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -20,13 +21,15 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript -# Build cli-shared, acp-http-client, SDK, then react (depends on SDK) +# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK) RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build diff --git a/docker/release/windows.Dockerfile b/docker/release/windows.Dockerfile index 92067db..9c7694d 100644 --- a/docker/release/windows.Dockerfile +++ b/docker/release/windows.Dockerfile @@ -10,6 +10,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -20,13 +21,15 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript -# Build cli-shared, acp-http-client, SDK, then react (depends on SDK) +# Build cli-shared, acp-http-client, SDK, then persist-indexeddb and react (depends on SDK) RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build diff --git a/docker/runtime/Dockerfile b/docker/runtime/Dockerfile index 85473be..bdd1a16 100644 --- a/docker/runtime/Dockerfile +++ b/docker/runtime/Dockerfile @@ -12,6 +12,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -22,6 +23,7 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript @@ -29,6 +31,7 @@ COPY sdks/typescript ./sdks/typescript RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup # Copy inspector source and build @@ -149,8 +152,7 @@ FROM debian:bookworm-slim RUN apt-get update && apt-get install -y \ ca-certificates \ curl \ - git \ - ffmpeg && \ + git && \ rm -rf /var/lib/apt/lists/* # Copy the binary from builder diff --git a/docker/runtime/Dockerfile.full b/docker/runtime/Dockerfile.full index 9ab4c0d..beb1664 100644 --- a/docker/runtime/Dockerfile.full +++ b/docker/runtime/Dockerfile.full @@ -11,6 +11,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ COPY sdks/cli-shared/package.json ./sdks/cli-shared/ COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/persist-indexeddb/package.json ./sdks/persist-indexeddb/ COPY sdks/react/package.json ./sdks/react/ COPY sdks/typescript/package.json ./sdks/typescript/ @@ -19,12 +20,14 @@ RUN pnpm install --filter @sandbox-agent/inspector... 
COPY docs/openapi.json ./docs/ COPY sdks/cli-shared ./sdks/cli-shared COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/persist-indexeddb ./sdks/persist-indexeddb COPY sdks/react ./sdks/react COPY sdks/typescript ./sdks/typescript RUN cd sdks/cli-shared && pnpm exec tsup RUN cd sdks/acp-http-client && pnpm exec tsup RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/persist-indexeddb && pnpm exec tsup RUN cd sdks/react && pnpm exec tsup COPY frontend/packages/inspector ./frontend/packages/inspector diff --git a/docker/test-agent/Dockerfile b/docker/test-agent/Dockerfile deleted file mode 100644 index 67888b3..0000000 --- a/docker/test-agent/Dockerfile +++ /dev/null @@ -1,61 +0,0 @@ -FROM rust:1.88.0-bookworm AS builder -WORKDIR /build - -COPY Cargo.toml Cargo.lock ./ -COPY server/ ./server/ -COPY gigacode/ ./gigacode/ -COPY resources/agent-schemas/artifacts/ ./resources/agent-schemas/artifacts/ -COPY scripts/agent-configs/ ./scripts/agent-configs/ -COPY scripts/audit-acp-deps/ ./scripts/audit-acp-deps/ - -ENV SANDBOX_AGENT_SKIP_INSPECTOR=1 - -RUN --mount=type=cache,target=/usr/local/cargo/registry \ - --mount=type=cache,target=/usr/local/cargo/git \ - --mount=type=cache,target=/build/target \ - cargo build -p sandbox-agent --release && \ - cp target/release/sandbox-agent /sandbox-agent - -# Extract neko binary from the official image for WebRTC desktop streaming. -# Using neko v3 base image from GHCR which provides multi-arch support (amd64, arm64). -# Pinned by digest to prevent breaking changes from upstream. 
-# Reference client: https://github.com/demodesk/neko-client/blob/37f93eae6bd55b333c94bd009d7f2b079075a026/src/component/internal/webrtc.ts -FROM ghcr.io/m1k1o/neko/base@sha256:0c384afa56268aaa2d5570211d284763d0840dcdd1a7d9a24be3081d94d3dfce AS neko-base - -FROM node:22-bookworm-slim -RUN apt-get update -qq && \ - apt-get install -y -qq --no-install-recommends \ - ca-certificates \ - bash \ - libstdc++6 \ - xvfb \ - openbox \ - xdotool \ - imagemagick \ - ffmpeg \ - gstreamer1.0-tools \ - gstreamer1.0-plugins-base \ - gstreamer1.0-plugins-good \ - gstreamer1.0-plugins-bad \ - gstreamer1.0-plugins-ugly \ - gstreamer1.0-nice \ - gstreamer1.0-x \ - gstreamer1.0-pulseaudio \ - libxcvt0 \ - x11-xserver-utils \ - dbus-x11 \ - xauth \ - fonts-dejavu-core \ - xterm \ - > /dev/null 2>&1 && \ - rm -rf /var/lib/apt/lists/* - -COPY --from=builder /sandbox-agent /usr/local/bin/sandbox-agent -COPY --from=neko-base /usr/bin/neko /usr/local/bin/neko - -EXPOSE 3000 -# Expose UDP port range for WebRTC media transport -EXPOSE 59050-59070/udp - -ENTRYPOINT ["/usr/local/bin/sandbox-agent"] -CMD ["server", "--host", "0.0.0.0", "--port", "3000", "--no-token"] diff --git a/docker/test-common-software/Dockerfile b/docker/test-common-software/Dockerfile deleted file mode 100644 index 7a03abc..0000000 --- a/docker/test-common-software/Dockerfile +++ /dev/null @@ -1,37 +0,0 @@ -# Extends the base test-agent image with common software pre-installed. -# Used by the common_software integration test to verify that all documented -# software in docs/common-software.mdx works correctly inside the sandbox. 
-# -# KEEP IN SYNC with docs/common-software.mdx - -ARG BASE_IMAGE=sandbox-agent-test:dev -FROM ${BASE_IMAGE} - -USER root - -RUN apt-get update -qq && \ - apt-get install -y -qq --no-install-recommends \ - # Browsers - chromium \ - firefox-esr \ - # Languages - python3 python3-pip python3-venv \ - default-jdk \ - ruby-full \ - # Databases - sqlite3 \ - redis-server \ - # Build tools - build-essential cmake pkg-config \ - # CLI tools - git jq tmux \ - # Media and graphics - imagemagick \ - poppler-utils \ - # Desktop apps - gimp \ - > /dev/null 2>&1 && \ - rm -rf /var/lib/apt/lists/* - -ENTRYPOINT ["/usr/local/bin/sandbox-agent"] -CMD ["server", "--host", "0.0.0.0", "--port", "3000", "--no-token"] diff --git a/docs/agent-capabilities.mdx b/docs/agent-capabilities.mdx new file mode 100644 index 0000000..13f2723 --- /dev/null +++ b/docs/agent-capabilities.mdx @@ -0,0 +1,127 @@ +--- +title: "Agent Capabilities" +description: "Models, modes, and thought levels supported by each agent." +--- + +Capabilities are subject to change as the agents are updated. See [Agent Sessions](/agent-sessions) for full session configuration API details. + + + + _Last updated: March 5th, 2026. See [Generating a live report](#generating-a-live-report) for up-to-date reference._ + + +## Claude + +| Category | Values | +|----------|--------| +| **Models** | `default`, `sonnet`, `opus`, `haiku` | +| **Modes** | `default`, `acceptEdits`, `plan`, `dontAsk`, `bypassPermissions` | +| **Thought levels** | Unsupported | + +### Configuring Effort Level For Claude + +Claude does not natively support changing effort level after a session starts, so configure it in the filesystem before creating the session. 
+ +```ts +import { mkdir, writeFile } from "node:fs/promises"; +import path from "node:path"; +import { SandboxAgent } from "sandbox-agent"; + +const cwd = "/path/to/workspace"; +await mkdir(path.join(cwd, ".claude"), { recursive: true }); +await writeFile( + path.join(cwd, ".claude", "settings.json"), + JSON.stringify({ effortLevel: "high" }, null, 2), +); + +const sdk = await SandboxAgent.connect({ baseUrl: "http://127.0.0.1:2468" }); +await sdk.createSession({ + agent: "claude", + sessionInit: { cwd, mcpServers: [] }, +}); +``` + + + +1. `~/.claude/settings.json` +2. `/.claude/settings.json` +3. `/.claude/settings.local.json` + + + +## Codex + +| Category | Values | +|----------|--------| +| **Models** | `gpt-5.3-codex` (default), `gpt-5.3-codex-spark`, `gpt-5.2-codex`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1-codex-mini` | +| **Modes** | `read-only` (default), `auto`, `full-access` | +| **Thought levels** | `low`, `medium`, `high` (default), `xhigh` | + +## OpenCode + +| Category | Values | +|----------|--------| +| **Models** | See below | +| **Modes** | `build` (default), `plan` | +| **Thought levels** | Unsupported | + + + +| Provider | Models | +|----------|--------| +| **Anthropic** | `anthropic/claude-3-5-haiku-20241022`, `anthropic/claude-3-5-haiku-latest`, `anthropic/claude-3-5-sonnet-20240620`, `anthropic/claude-3-5-sonnet-20241022`, `anthropic/claude-3-7-sonnet-20250219`, `anthropic/claude-3-7-sonnet-latest`, `anthropic/claude-3-haiku-20240307`, `anthropic/claude-3-opus-20240229`, `anthropic/claude-3-sonnet-20240229`, `anthropic/claude-haiku-4-5`, `anthropic/claude-haiku-4-5-20251001`, `anthropic/claude-opus-4-0`, `anthropic/claude-opus-4-1`, `anthropic/claude-opus-4-1-20250805`, `anthropic/claude-opus-4-20250514`, `anthropic/claude-opus-4-5`, `anthropic/claude-opus-4-5-20251101`, `anthropic/claude-opus-4-6`, `anthropic/claude-sonnet-4-0`, `anthropic/claude-sonnet-4-20250514`, `anthropic/claude-sonnet-4-5`, `anthropic/claude-sonnet-4-5-20250929` | +| 
**OpenAI** | `openai/gpt-5.1-codex`, `openai/gpt-5.1-codex-max`, `openai/gpt-5.1-codex-mini`, `openai/gpt-5.2`, `openai/gpt-5.2-codex`, `openai/gpt-5.3-codex` | +| **Cerebras** | `cerebras/gpt-oss-120b`, `cerebras/qwen-3-235b-a22b-instruct-2507`, `cerebras/zai-glm-4.7` | +| **OpenCode Zen** | `opencode/big-pickle`, `opencode/claude-3-5-haiku`, `opencode/claude-haiku-4-5`, `opencode/claude-opus-4-1`, `opencode/claude-opus-4-5`, `opencode/claude-opus-4-6`, `opencode/claude-sonnet-4`, `opencode/claude-sonnet-4-5`, `opencode/gemini-3-flash`, `opencode/gemini-3-pro` (default), `opencode/glm-4.6`, `opencode/glm-4.7`, `opencode/gpt-5`, `opencode/gpt-5-codex`, `opencode/gpt-5-nano`, `opencode/gpt-5.1`, `opencode/gpt-5.1-codex`, `opencode/gpt-5.1-codex-max`, `opencode/gpt-5.1-codex-mini`, `opencode/gpt-5.2`, `opencode/gpt-5.2-codex`, `opencode/kimi-k2`, `opencode/kimi-k2-thinking`, `opencode/kimi-k2.5`, `opencode/kimi-k2.5-free`, `opencode/minimax-m2.1`, `opencode/minimax-m2.1-free`, `opencode/trinity-large-preview-free` | + + + +## Cursor + +| Category | Values | +|----------|--------| +| **Models** | See below | +| **Modes** | Unsupported | +| **Thought levels** | Unsupported | + + + +| Group | Models | +|-------|--------| +| **Auto** | `auto` | +| **Composer** | `composer-1.5`, `composer-1` | +| **GPT-5.3 Codex** | `gpt-5.3-codex`, `gpt-5.3-codex-low`, `gpt-5.3-codex-high`, `gpt-5.3-codex-xhigh`, `gpt-5.3-codex-fast`, `gpt-5.3-codex-low-fast`, `gpt-5.3-codex-high-fast`, `gpt-5.3-codex-xhigh-fast` | +| **GPT-5.2** | `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-codex`, `gpt-5.2-codex-low`, `gpt-5.2-codex-high`, `gpt-5.2-codex-xhigh`, `gpt-5.2-codex-fast`, `gpt-5.2-codex-low-fast`, `gpt-5.2-codex-high-fast`, `gpt-5.2-codex-xhigh-fast` | +| **GPT-5.1** | `gpt-5.1-high`, `gpt-5.1-codex-max`, `gpt-5.1-codex-max-high` | +| **Claude** | `opus-4.6-thinking` (default), `opus-4.6`, `opus-4.5`, `opus-4.5-thinking`, `sonnet-4.5`, `sonnet-4.5-thinking` | +| **Other** | `gemini-3-pro`, 
`gemini-3-flash`, `grok` | + + + +## Amp + +| Category | Values | +|----------|--------| +| **Models** | `amp-default` | +| **Modes** | `default`, `bypass` | +| **Thought levels** | Unsupported | + +## Pi + +| Category | Values | +|----------|--------| +| **Models** | `default` | +| **Modes** | Unsupported | +| **Thought levels** | Unsupported | + +## Generating a live report + +Requires a running Sandbox Agent server. `--endpoint` defaults to `http://127.0.0.1:2468`. + +```bash +sandbox-agent api agents report +``` + + + The live report reflects what the agent adapter returns for the current credentials. Some models may be gated by subscription (e.g. Claude's `opus` requires a paid plan) and will not appear in the report if the credentials don't have access. + diff --git a/docs/agent-sessions.mdx b/docs/agent-sessions.mdx index 0154537..cf56e9c 100644 --- a/docs/agent-sessions.mdx +++ b/docs/agent-sessions.mdx @@ -21,7 +21,10 @@ const sdk = await SandboxAgent.connect({ const session = await sdk.createSession({ agent: "codex", - cwd: "/", + sessionInit: { + cwd: "/", + mcpServers: [], + }, }); console.log(session.id, session.agentSessionId); @@ -51,108 +54,6 @@ await session.prompt([ unsubscribe(); ``` -### Event types - -Each event's `payload` contains a session update. The `sessionUpdate` field identifies the type. - - - -Streamed text or content from the agent's response. - -```json -{ - "sessionUpdate": "agent_message_chunk", - "content": { "type": "text", "text": "Here's how the repository is structured..." } -} -``` - - - -Internal reasoning from the agent (chain-of-thought / extended thinking). - -```json -{ - "sessionUpdate": "agent_thought_chunk", - "content": { "type": "text", "text": "I should start by looking at the project structure..." } -} -``` - - - -Echo of the user's prompt being processed. - -```json -{ - "sessionUpdate": "user_message_chunk", - "content": { "type": "text", "text": "Summarize the repository structure." 
} -} -``` - - - -The agent invoked a tool (file edit, terminal command, etc.). - -```json -{ - "sessionUpdate": "tool_call", - "toolCallId": "tc_abc123", - "title": "Read file", - "status": "in_progress", - "rawInput": { "path": "/src/index.ts" } -} -``` - - - -Progress or result update for an in-progress tool call. - -```json -{ - "sessionUpdate": "tool_call_update", - "toolCallId": "tc_abc123", - "status": "completed", - "content": [{ "type": "text", "text": "import express from 'express';\n..." }] -} -``` - - - -The agent's execution plan for the current task. - -```json -{ - "sessionUpdate": "plan", - "entries": [ - { "content": "Read the project structure", "status": "completed" }, - { "content": "Identify main entrypoints", "status": "in_progress" }, - { "content": "Write summary", "status": "pending" } - ] -} -``` - - - -Token usage metrics for the current turn. - -```json -{ - "sessionUpdate": "usage_update" -} -``` - - - -Session metadata changed (e.g. agent-generated title). - -```json -{ - "sessionUpdate": "session_info_update", - "title": "Repository structure analysis" -} -``` - - - ## Fetch persisted event history ```ts diff --git a/docs/agents/amp.mdx b/docs/agents/amp.mdx deleted file mode 100644 index f94e97d..0000000 --- a/docs/agents/amp.mdx +++ /dev/null @@ -1,20 +0,0 @@ ---- -title: "Amp" -description: "Use Amp as a sandbox agent." ---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "amp", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | `amp-default` | -| **Modes** | `default`, `bypass` | -| **Thought levels** | Unsupported | diff --git a/docs/agents/claude.mdx b/docs/agents/claude.mdx deleted file mode 100644 index 2e4fd43..0000000 --- a/docs/agents/claude.mdx +++ /dev/null @@ -1,49 +0,0 @@ ---- -title: "Claude" -description: "Use Claude Code as a sandbox agent." 
---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "claude", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | `default`, `sonnet`, `opus`, `haiku` | -| **Modes** | `default`, `acceptEdits`, `plan`, `dontAsk`, `bypassPermissions` | -| **Thought levels** | Unsupported | - -## Configuring effort level - -Claude does not support changing effort level after a session starts. Configure it in the filesystem before creating the session. - -```ts -import { mkdir, writeFile } from "node:fs/promises"; -import path from "node:path"; - -const cwd = "/path/to/workspace"; -await mkdir(path.join(cwd, ".claude"), { recursive: true }); -await writeFile( - path.join(cwd, ".claude", "settings.json"), - JSON.stringify({ effortLevel: "high" }, null, 2), -); - -const session = await client.createSession({ - agent: "claude", - cwd, -}); -``` - - - -1. `~/.claude/settings.json` -2. `/.claude/settings.json` -3. `/.claude/settings.local.json` - - diff --git a/docs/agents/codex.mdx b/docs/agents/codex.mdx deleted file mode 100644 index d359beb..0000000 --- a/docs/agents/codex.mdx +++ /dev/null @@ -1,20 +0,0 @@ ---- -title: "Codex" -description: "Use OpenAI Codex as a sandbox agent." ---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "codex", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | `gpt-5.3-codex` (default), `gpt-5.3-codex-spark`, `gpt-5.2-codex`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1-codex-mini` | -| **Modes** | `read-only` (default), `auto`, `full-access` | -| **Thought levels** | `low`, `medium`, `high` (default), `xhigh` | diff --git a/docs/agents/cursor.mdx b/docs/agents/cursor.mdx deleted file mode 100644 index 0905baa..0000000 --- a/docs/agents/cursor.mdx +++ /dev/null @@ -1,34 +0,0 @@ ---- -title: "Cursor" -description: "Use Cursor as a sandbox agent." 
---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "cursor", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | See below | -| **Modes** | Unsupported | -| **Thought levels** | Unsupported | - - - -| Group | Models | -|-------|--------| -| **Auto** | `auto` | -| **Composer** | `composer-1.5`, `composer-1` | -| **GPT-5.3 Codex** | `gpt-5.3-codex`, `gpt-5.3-codex-low`, `gpt-5.3-codex-high`, `gpt-5.3-codex-xhigh`, `gpt-5.3-codex-fast`, `gpt-5.3-codex-low-fast`, `gpt-5.3-codex-high-fast`, `gpt-5.3-codex-xhigh-fast` | -| **GPT-5.2** | `gpt-5.2`, `gpt-5.2-high`, `gpt-5.2-codex`, `gpt-5.2-codex-low`, `gpt-5.2-codex-high`, `gpt-5.2-codex-xhigh`, `gpt-5.2-codex-fast`, `gpt-5.2-codex-low-fast`, `gpt-5.2-codex-high-fast`, `gpt-5.2-codex-xhigh-fast` | -| **GPT-5.1** | `gpt-5.1-high`, `gpt-5.1-codex-max`, `gpt-5.1-codex-max-high` | -| **Claude** | `opus-4.6-thinking` (default), `opus-4.6`, `opus-4.5`, `opus-4.5-thinking`, `sonnet-4.5`, `sonnet-4.5-thinking` | -| **Other** | `gemini-3-pro`, `gemini-3-flash`, `grok` | - - diff --git a/docs/agents/opencode.mdx b/docs/agents/opencode.mdx deleted file mode 100644 index db7b640..0000000 --- a/docs/agents/opencode.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: "OpenCode" -description: "Use OpenCode as a sandbox agent." 
---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "opencode", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | See below | -| **Modes** | `build` (default), `plan` | -| **Thought levels** | Unsupported | - - - -| Provider | Models | -|----------|--------| -| **Anthropic** | `anthropic/claude-3-5-haiku-20241022`, `anthropic/claude-3-5-haiku-latest`, `anthropic/claude-3-5-sonnet-20240620`, `anthropic/claude-3-5-sonnet-20241022`, `anthropic/claude-3-7-sonnet-20250219`, `anthropic/claude-3-7-sonnet-latest`, `anthropic/claude-3-haiku-20240307`, `anthropic/claude-3-opus-20240229`, `anthropic/claude-3-sonnet-20240229`, `anthropic/claude-haiku-4-5`, `anthropic/claude-haiku-4-5-20251001`, `anthropic/claude-opus-4-0`, `anthropic/claude-opus-4-1`, `anthropic/claude-opus-4-1-20250805`, `anthropic/claude-opus-4-20250514`, `anthropic/claude-opus-4-5`, `anthropic/claude-opus-4-5-20251101`, `anthropic/claude-opus-4-6`, `anthropic/claude-sonnet-4-0`, `anthropic/claude-sonnet-4-20250514`, `anthropic/claude-sonnet-4-5`, `anthropic/claude-sonnet-4-5-20250929` | -| **OpenAI** | `openai/gpt-5.1-codex`, `openai/gpt-5.1-codex-max`, `openai/gpt-5.1-codex-mini`, `openai/gpt-5.2`, `openai/gpt-5.2-codex`, `openai/gpt-5.3-codex` | -| **Cerebras** | `cerebras/gpt-oss-120b`, `cerebras/qwen-3-235b-a22b-instruct-2507`, `cerebras/zai-glm-4.7` | -| **OpenCode Zen** | `opencode/big-pickle`, `opencode/claude-3-5-haiku`, `opencode/claude-haiku-4-5`, `opencode/claude-opus-4-1`, `opencode/claude-opus-4-5`, `opencode/claude-opus-4-6`, `opencode/claude-sonnet-4`, `opencode/claude-sonnet-4-5`, `opencode/gemini-3-flash`, `opencode/gemini-3-pro` (default), `opencode/glm-4.6`, `opencode/glm-4.7`, `opencode/gpt-5`, `opencode/gpt-5-codex`, `opencode/gpt-5-nano`, `opencode/gpt-5.1`, `opencode/gpt-5.1-codex`, `opencode/gpt-5.1-codex-max`, `opencode/gpt-5.1-codex-mini`, `opencode/gpt-5.2`, `opencode/gpt-5.2-codex`, 
`opencode/kimi-k2`, `opencode/kimi-k2-thinking`, `opencode/kimi-k2.5`, `opencode/kimi-k2.5-free`, `opencode/minimax-m2.1`, `opencode/minimax-m2.1-free`, `opencode/trinity-large-preview-free` | - - diff --git a/docs/agents/pi.mdx b/docs/agents/pi.mdx deleted file mode 100644 index 1d56370..0000000 --- a/docs/agents/pi.mdx +++ /dev/null @@ -1,20 +0,0 @@ ---- -title: "Pi" -description: "Use Pi as a sandbox agent." ---- - -## Usage - -```typescript -const session = await client.createSession({ - agent: "pi", -}); -``` - -## Capabilities - -| Category | Values | -|----------|--------| -| **Models** | `default` | -| **Modes** | Unsupported | -| **Thought levels** | Unsupported | diff --git a/docs/architecture.mdx b/docs/architecture.mdx index 61b4689..78585a2 100644 --- a/docs/architecture.mdx +++ b/docs/architecture.mdx @@ -1,63 +1,64 @@ --- title: "Architecture" -description: "How the Sandbox Agent server, SDK, and agent processes fit together." +description: "How the client, sandbox, server, and agent fit together." +icon: "microchip" --- -Sandbox Agent is a lightweight HTTP server that runs **inside** a sandbox. It: - -- **Agent management**: Installs, spawns, and stops coding agent processes -- **Sessions**: Routes prompts to agents and streams events back in real time -- **Sandbox APIs**: Filesystem, process, and terminal access for the sandbox environment +Sandbox Agent runs as an HTTP server inside your sandbox. Your app talks to it remotely. ## Components -```mermaid -flowchart LR - CLIENT["Your App"] +- `Your client`: your app code using the `sandbox-agent` SDK. +- `Sandbox`: isolated runtime (E2B, Daytona, Docker, etc.). +- `Sandbox Agent server`: process inside the sandbox exposing HTTP transport. +- `Agent`: Claude/Codex/OpenCode/Amp process managed by Sandbox Agent. 
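Concretely, the client-to-server hop in this picture is plain HTTP. A minimal sketch of the paths the SDK ultimately talks to (the `baseUrl` here is an assumption for a locally running server; the paths come from the v1 endpoint list in this repo's instructions):

```typescript
// Sketch only: no server is contacted here. baseUrl assumes a local
// Sandbox Agent server on its default port.
const baseUrl = "http://127.0.0.1:2468";

// Under the hood the SDK is an HTTP client against endpoints like these:
const endpoints = {
  health: new URL("/v1/health", baseUrl).href, // readiness check
  rpc: new URL("/v1/rpc", baseUrl).href,       // ACP JSON-RPC (POST) + SSE stream (GET)
  agents: new URL("/v1/agents", baseUrl).href, // list available agents
};

console.log(endpoints.rpc); // http://127.0.0.1:2468/v1/rpc
```

Everything session-related flows through the `/v1/rpc` transport; the other paths are control-plane calls the SDK makes around it.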
+ +```mermaid placement="top-right" + flowchart LR + CLIENT["Sandbox Agent SDK"] + SERVER["Sandbox Agent server"] + AGENT["Agent process"] subgraph SANDBOX["Sandbox"] - direction TB - SERVER["Sandbox Agent Server"] - AGENT["Agent Process
(Claude, Codex, etc.)"] - SERVER --> AGENT + direction TB + SERVER --> AGENT end - CLIENT -->|"SDK (HTTP)"| SERVER + CLIENT -->|HTTP| SERVER ``` -- **Your app**: Uses the `sandbox-agent` TypeScript SDK to talk to the server over HTTP. -- **Sandbox**: An isolated runtime (local process, Docker, E2B, Daytona, Vercel, Cloudflare). -- **Sandbox Agent server**: A single binary inside the sandbox that manages agent lifecycles, routes prompts, streams events, and exposes filesystem/process/terminal APIs. -- **Agent process**: A coding agent (Claude Code, Codex, etc.) spawned by the server. Each session maps to one agent process. +## Suggested Topology -## What `SandboxAgent.start()` does +Run the SDK on your backend, then call it from your frontend. -1. **Provision**: The provider creates a sandbox (starts a container, creates a VM, etc.) -2. **Install**: The Sandbox Agent binary is installed inside the sandbox -3. **Boot**: The server starts listening on an HTTP port -4. **Health check**: The SDK waits for `/v1/health` to respond -5. **Ready**: The SDK returns a connected client +This extra hop is recommended because it keeps auth/token logic on the backend and makes persistence simpler. -For the `local` provider, provisioning is a no-op and the server runs as a local subprocess. +```mermaid placement="top-right" + flowchart LR + BROWSER["Browser"] + subgraph BACKEND["Your backend"] + direction TB + SDK["Sandbox Agent SDK"] + end + subgraph SANDBOX_SIMPLE["Sandbox"] + SERVER_SIMPLE["Sandbox Agent server"] + end -### Server recovery - -If the server process stops, the SDK automatically calls the provider's `ensureServer()` after 3 consecutive health-check failures. Most built-in providers implement this. Custom providers can add `ensureServer(sandboxId)` to their `SandboxProvider` object. - -## Server HTTP API - -See the [HTTP API reference](/api-reference) for the full list of server endpoints. - -## Agent installation - -Agents are installed lazily on first use. 
To avoid the cold-start delay, pre-install them: - -```bash -sandbox-agent install-agent --all + BROWSER --> BACKEND + BACKEND --> SDK --> SERVER_SIMPLE ``` -The `rivetdev/sandbox-agent:0.4.2-full` Docker image ships with all agents pre-installed. +### Backend requirements -## Production-ready agent orchestration +Your backend layer needs to handle: -For production deployments, see [Orchestration Architecture](/orchestration-architecture) for recommended topology, backend requirements, and session persistence patterns. +- **Long-running connections**: prompts can take minutes. +- **Session affinity**: follow-up messages must reach the same session. +- **State between requests**: session metadata and event history must persist across requests. +- **Graceful recovery**: sessions should resume after backend restarts. + +We recommend [Rivet](https://rivet.dev) over serverless because actors natively support the long-lived connections, session routing, and state persistence that agent workloads require. + +## Session persistence + +For storage driver options and replay behavior, see [Persisting Sessions](/session-persistence). diff --git a/docs/cli.mdx b/docs/cli.mdx index 362de49..a3cd839 100644 --- a/docs/cli.mdx +++ b/docs/cli.mdx @@ -37,36 +37,6 @@ Notes: - Set `SANDBOX_AGENT_LOG_STDOUT=1` to force stdout/stderr logging. - Use `SANDBOX_AGENT_LOG_DIR` to override log directory. -## install - -Install first-party runtime dependencies. - -### install desktop - -Install the Linux desktop runtime packages required by `/v1/desktop/*`. 
- -```bash -sandbox-agent install desktop [OPTIONS] -``` - -| Option | Description | -|--------|-------------| -| `--yes` | Skip the confirmation prompt | -| `--print-only` | Print the package-manager command without executing it | -| `--package-manager ` | Override package-manager detection | -| `--no-fonts` | Skip the default DejaVu font package | - -```bash -sandbox-agent install desktop --yes -sandbox-agent install desktop --print-only -``` - -Notes: - -- Supported on Linux only. -- The command detects `apt`, `dnf`, or `apk`. -- If the host is not already running as root, the command requires `sudo`. - ## install-agent Install or reinstall a single agent, or every supported agent with `--all`. @@ -89,39 +59,6 @@ sandbox-agent install-agent claude --reinstall sandbox-agent install-agent --all ``` -### Custom Pi implementation path - -If you use a forked/custom `pi` binary with `pi-acp`, you can override what executable gets launched. - -#### Option 1: explicit command override (recommended) - -Set `PI_ACP_PI_COMMAND` in the environment where `sandbox-agent` runs: - -```bash -PI_ACP_PI_COMMAND=/absolute/path/to/your/pi-fork sandbox-agent server -``` - -This is forwarded to `pi-acp`, which uses it instead of looking up `pi` on `PATH`. - -#### Option 2: PATH override - -Put your custom `pi` first on `PATH` before starting `sandbox-agent`: - -```bash -export PATH="/path/to/custom-pi-dir:$PATH" -sandbox-agent server -``` - -#### Option 3: symlink override - -Point `pi` to your custom binary via symlink in a directory that is early on `PATH`: - -```bash -ln -sf /absolute/path/to/your/pi-fork /usr/local/bin/pi -``` - -Then start `sandbox-agent` normally. - ## opencode (experimental) Start/reuse daemon and run `opencode attach` against `/opencode`. @@ -289,7 +226,7 @@ Example output: } ``` -See individual agent pages (e.g. [Claude](/agents/claude), [Codex](/agents/codex)) for supported models, modes, and thought levels. 
+See [Agent Capabilities](/agent-capabilities) for a full reference of supported models, modes, and thought levels per agent. #### api agents install diff --git a/docs/common-software.mdx b/docs/common-software.mdx deleted file mode 100644 index 7997a92..0000000 --- a/docs/common-software.mdx +++ /dev/null @@ -1,560 +0,0 @@ ---- -title: "Common Software" -description: "Install browsers, languages, databases, and other tools inside the sandbox." -sidebarTitle: "Common Software" -icon: "box-open" ---- - -The sandbox runs a Debian/Ubuntu base image. You can install software with `apt-get` via the [Process API](/processes) or by customizing your Docker image. This page covers commonly needed packages and how to install them. - -## Browsers - -### Chromium - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "chromium", "chromium-sandbox"], -}); - -// Launch headless -await sdk.runProcess({ - command: "chromium", - args: ["--headless", "--no-sandbox", "--disable-gpu", "https://example.com"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","chromium","chromium-sandbox"]}' -``` - - - -Use `--no-sandbox` when running Chromium inside a container. The container itself provides isolation. - - -### Firefox - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "firefox-esr"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","firefox-esr"]}' -``` - - -### Playwright browsers - -Playwright bundles its own browser binaries. Install the Playwright CLI and let it download browsers for you. 
- - -```ts TypeScript -await sdk.runProcess({ - command: "npx", - args: ["playwright", "install", "--with-deps", "chromium"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"npx","args":["playwright","install","--with-deps","chromium"]}' -``` - - ---- - -## Languages and runtimes - -### Node.js - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "nodejs", "npm"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","nodejs","npm"]}' -``` - - -For a specific version, use [nvm](https://github.com/nvm-sh/nvm): - -```ts TypeScript -await sdk.runProcess({ - command: "bash", - args: ["-c", "curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash && . ~/.nvm/nvm.sh && nvm install 22"], -}); -``` - -### Python - -Python 3 is typically pre-installed. 
To add pip and common packages: - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "python3", "python3-pip", "python3-venv"], -}); - -await sdk.runProcess({ - command: "pip3", - args: ["install", "numpy", "pandas", "matplotlib"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","python3","python3-pip","python3-venv"]}' - -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"pip3","args":["install","numpy","pandas","matplotlib"]}' -``` - - -### Go - - -```ts TypeScript -await sdk.runProcess({ - command: "bash", - args: ["-c", "curl -fsSL https://go.dev/dl/go1.23.6.linux-amd64.tar.gz | tar -C /usr/local -xz"], -}); - -// Add to PATH for subsequent commands -await sdk.runProcess({ - command: "bash", - args: ["-c", "export PATH=$PATH:/usr/local/go/bin && go version"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"bash","args":["-c","curl -fsSL https://go.dev/dl/go1.23.6.linux-amd64.tar.gz | tar -C /usr/local -xz"]}' -``` - - -### Rust - - -```ts TypeScript -await sdk.runProcess({ - command: "bash", - args: ["-c", "curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"bash","args":["-c","curl --proto =https --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"]}' -``` - - -### Java (OpenJDK) - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "default-jdk"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","default-jdk"]}' -``` - - 
-### Ruby - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "ruby-full"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","ruby-full"]}' -``` - - ---- - -## Databases - -### PostgreSQL - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "postgresql", "postgresql-client"], -}); - -// Start the service -const proc = await sdk.createProcess({ - command: "bash", - args: ["-c", "su - postgres -c 'pg_ctlcluster 15 main start'"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","postgresql","postgresql-client"]}' -``` - - -### SQLite - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "sqlite3"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","sqlite3"]}' -``` - - -### Redis - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "redis-server"], -}); - -const proc = await sdk.createProcess({ - command: "redis-server", - args: ["--daemonize", "no"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","redis-server"]}' - -curl -X POST "http://127.0.0.1:2468/v1/processes" \ - -H "Content-Type: application/json" \ - -d '{"command":"redis-server","args":["--daemonize","no"]}' -``` - - -### MySQL / MariaDB - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "mariadb-server", "mariadb-client"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H 
"Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","mariadb-server","mariadb-client"]}' -``` - - ---- - -## Build tools - -### Essential build toolchain - -Most compiled software needs the standard build toolchain: - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "build-essential", "cmake", "pkg-config"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","build-essential","cmake","pkg-config"]}' -``` - - -This installs `gcc`, `g++`, `make`, `cmake`, and related tools. - ---- - -## Desktop applications - -These require the [Computer Use](/computer-use) desktop to be started first. - -### LibreOffice - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "libreoffice"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","libreoffice"]}' -``` - - -### GIMP - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "gimp"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","gimp"]}' -``` - - -### VLC - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "vlc"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","vlc"]}' -``` - - -### VS Code (code-server) - - -```ts TypeScript -await sdk.runProcess({ - command: "bash", - args: ["-c", "curl -fsSL https://code-server.dev/install.sh | sh"], -}); - -const proc = await sdk.createProcess({ - command: "code-server", - args: ["--bind-addr", "0.0.0.0:8080", 
"--auth", "none"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"bash","args":["-c","curl -fsSL https://code-server.dev/install.sh | sh"]}' - -curl -X POST "http://127.0.0.1:2468/v1/processes" \ - -H "Content-Type: application/json" \ - -d '{"command":"code-server","args":["--bind-addr","0.0.0.0:8080","--auth","none"]}' -``` - - ---- - -## CLI tools - -### Git - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "git"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","git"]}' -``` - - -### Docker - - -```ts TypeScript -await sdk.runProcess({ - command: "bash", - args: ["-c", "curl -fsSL https://get.docker.com | sh"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"bash","args":["-c","curl -fsSL https://get.docker.com | sh"]}' -``` - - -### jq - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "jq"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","jq"]}' -``` - - -### tmux - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "tmux"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","tmux"]}' -``` - - ---- - -## Media and graphics - -### FFmpeg - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "ffmpeg"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d 
'{"command":"apt-get","args":["install","-y","ffmpeg"]}' -``` - - -### ImageMagick - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "imagemagick"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","imagemagick"]}' -``` - - -### Poppler (PDF utilities) - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "poppler-utils"], -}); - -// Convert PDF to images -await sdk.runProcess({ - command: "pdftoppm", - args: ["-png", "document.pdf", "output"], -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","poppler-utils"]}' -``` - - ---- - -## Pre-installing in a Docker image - -For production use, install software in your Dockerfile instead of at runtime. This avoids repeated downloads and makes startup faster. - -```dockerfile -FROM ubuntu:22.04 - -RUN apt-get update && apt-get install -y \ - chromium \ - firefox-esr \ - nodejs npm \ - python3 python3-pip \ - git curl wget \ - build-essential \ - sqlite3 \ - ffmpeg \ - imagemagick \ - jq \ - && rm -rf /var/lib/apt/lists/* - -RUN pip3 install numpy pandas matplotlib -``` - -See [Docker deployment](/deploy/docker) for how to use custom images with Sandbox Agent. diff --git a/docs/computer-use.mdx b/docs/computer-use.mdx deleted file mode 100644 index fc6b7d0..0000000 --- a/docs/computer-use.mdx +++ /dev/null @@ -1,859 +0,0 @@ ---- -title: "Computer Use" -description: "Control a virtual desktop inside the sandbox with mouse, keyboard, screenshots, recordings, and live streaming." -sidebarTitle: "Computer Use" -icon: "desktop" ---- - -Sandbox Agent provides a managed virtual desktop (Xvfb + openbox) that you can control programmatically. 
This is useful for browser automation, GUI testing, and AI computer-use workflows. - -## Start and stop - - -```ts TypeScript -import { SandboxAgent } from "sandbox-agent"; - -const sdk = await SandboxAgent.connect({ - baseUrl: "http://127.0.0.1:2468", -}); - -const status = await sdk.startDesktop({ - width: 1920, - height: 1080, - dpi: 96, -}); - -console.log(status.state); // "active" -console.log(status.display); // ":99" - -// When done -await sdk.stopDesktop(); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/start" \ - -H "Content-Type: application/json" \ - -d '{"width":1920,"height":1080,"dpi":96}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/stop" -``` - - -All fields in the start request are optional. Defaults are 1440x900 at 96 DPI. - -### Start request options - -| Field | Type | Default | Description | -|-------|------|---------|-------------| -| `width` | number | 1440 | Desktop width in pixels | -| `height` | number | 900 | Desktop height in pixels | -| `dpi` | number | 96 | Display DPI | -| `displayNum` | number | 99 | Starting X display number. The runtime probes from this number upward to find an available display. | -| `stateDir` | string | (auto) | Desktop state directory for home, logs, recordings | -| `streamVideoCodec` | string | `"vp8"` | WebRTC video codec (`vp8`, `vp9`, `h264`) | -| `streamAudioCodec` | string | `"opus"` | WebRTC audio codec (`opus`, `g722`) | -| `streamFrameRate` | number | 30 | Streaming frame rate (1-60) | -| `webrtcPortRange` | string | `"59050-59070"` | UDP port range for WebRTC media | -| `recordingFps` | number | 30 | Default recording FPS when not specified in `startDesktopRecording` (1-60) | - -The streaming and recording options configure defaults for the desktop session. They take effect when streaming or recording is started later. 
- - -```ts TypeScript -const status = await sdk.startDesktop({ - width: 1920, - height: 1080, - streamVideoCodec: "h264", - streamFrameRate: 60, - webrtcPortRange: "59100-59120", - recordingFps: 15, -}); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/start" \ - -H "Content-Type: application/json" \ - -d '{ - "width": 1920, - "height": 1080, - "streamVideoCodec": "h264", - "streamFrameRate": 60, - "webrtcPortRange": "59100-59120", - "recordingFps": 15 - }' -``` - - -## Status - - -```ts TypeScript -const status = await sdk.getDesktopStatus(); -console.log(status.state); // "inactive" | "active" | "failed" | ... -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/status" -``` - - -## Screenshots - -Capture the full desktop or a specific region. Optionally include the cursor position. - - -```ts TypeScript -// Full screenshot (PNG by default) -const png = await sdk.takeDesktopScreenshot(); - -// JPEG at 70% quality, half scale -const jpeg = await sdk.takeDesktopScreenshot({ - format: "jpeg", - quality: 70, - scale: 0.5, -}); - -// Include cursor overlay -const withCursor = await sdk.takeDesktopScreenshot({ - showCursor: true, -}); - -// Region screenshot -const region = await sdk.takeDesktopRegionScreenshot({ - x: 100, - y: 100, - width: 400, - height: 300, -}); -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/screenshot" --output screenshot.png - -curl "http://127.0.0.1:2468/v1/desktop/screenshot?format=jpeg&quality=70&scale=0.5" \ - --output screenshot.jpg - -# Include cursor overlay -curl "http://127.0.0.1:2468/v1/desktop/screenshot?show_cursor=true" \ - --output with_cursor.png - -curl "http://127.0.0.1:2468/v1/desktop/screenshot/region?x=100&y=100&width=400&height=300" \ - --output region.png -``` - - -### Screenshot options - -| Param | Type | Default | Description | -|-------|------|---------|-------------| -| `format` | string | `"png"` | Output format: `png`, `jpeg`, or `webp` | -| `quality` | number | 85 | 
Compression quality (1-100, JPEG/WebP only) | -| `scale` | number | 1.0 | Scale factor (0.1-1.0) | -| `showCursor` | boolean | `false` | Composite a crosshair at the cursor position | - -When `showCursor` is enabled, the cursor position is captured at the moment of the screenshot and a red crosshair is drawn at that location. This is useful for AI agents that need to see where the cursor is in the screenshot. - -## Mouse - - -```ts TypeScript -// Get current position -const pos = await sdk.getDesktopMousePosition(); -console.log(pos.x, pos.y); - -// Move -await sdk.moveDesktopMouse({ x: 500, y: 300 }); - -// Click (left by default) -await sdk.clickDesktop({ x: 500, y: 300 }); - -// Right click -await sdk.clickDesktop({ x: 500, y: 300, button: "right" }); - -// Double click -await sdk.clickDesktop({ x: 500, y: 300, clickCount: 2 }); - -// Drag -await sdk.dragDesktopMouse({ - startX: 100, startY: 100, - endX: 400, endY: 400, -}); - -// Scroll -await sdk.scrollDesktop({ x: 500, y: 300, deltaY: -3 }); -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/mouse/position" - -curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/click" \ - -H "Content-Type: application/json" \ - -d '{"x":500,"y":300}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/drag" \ - -H "Content-Type: application/json" \ - -d '{"startX":100,"startY":100,"endX":400,"endY":400}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/scroll" \ - -H "Content-Type: application/json" \ - -d '{"x":500,"y":300,"deltaY":-3}' -``` - - -## Keyboard - - -```ts TypeScript -// Type text -await sdk.typeDesktopText({ text: "Hello, world!" 
}); - -// Press a key with modifiers -await sdk.pressDesktopKey({ - key: "c", - modifiers: { ctrl: true }, -}); - -// Low-level key down/up -await sdk.keyDownDesktop({ key: "Shift_L" }); -await sdk.keyUpDesktop({ key: "Shift_L" }); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/keyboard/type" \ - -H "Content-Type: application/json" \ - -d '{"text":"Hello, world!"}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/keyboard/press" \ - -H "Content-Type: application/json" \ - -d '{"key":"c","modifiers":{"ctrl":true}}' -``` - - -## Clipboard - -Read and write the X11 clipboard programmatically. - - -```ts TypeScript -// Read clipboard -const clipboard = await sdk.getDesktopClipboard(); -console.log(clipboard.text); - -// Read primary selection (mouse-selected text) -const primary = await sdk.getDesktopClipboard({ selection: "primary" }); - -// Write to clipboard -await sdk.setDesktopClipboard({ text: "Pasted via API" }); - -// Write to both clipboard and primary selection -await sdk.setDesktopClipboard({ - text: "Synced text", - selection: "both", -}); -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/clipboard" - -curl "http://127.0.0.1:2468/v1/desktop/clipboard?selection=primary" - -curl -X POST "http://127.0.0.1:2468/v1/desktop/clipboard" \ - -H "Content-Type: application/json" \ - -d '{"text":"Pasted via API"}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/clipboard" \ - -H "Content-Type: application/json" \ - -d '{"text":"Synced text","selection":"both"}' -``` - - -The `selection` parameter controls which X11 selection to read or write: - -| Value | Description | -|-------|-------------| -| `clipboard` (default) | The standard clipboard (Ctrl+C / Ctrl+V) | -| `primary` | The primary selection (text selected with the mouse) | -| `both` | Write to both clipboard and primary selection (write only) | - -## Display and windows - - -```ts TypeScript -const display = await sdk.getDesktopDisplayInfo(); -console.log(display.resolution); 
// { width: 1920, height: 1080, dpi: 96 } - -const { windows } = await sdk.listDesktopWindows(); -for (const win of windows) { - console.log(win.title, win.x, win.y, win.width, win.height); -} -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/display/info" - -curl "http://127.0.0.1:2468/v1/desktop/windows" -``` - - -The windows endpoint filters out noise automatically: window manager internals (Openbox), windows with empty titles, and tiny helper windows (under 120x80) are excluded. The currently active/focused window is always included regardless of filters. - -### Focused window - -Get the currently focused window without listing all windows. - - -```ts TypeScript -const focused = await sdk.getDesktopFocusedWindow(); -console.log(focused.title, focused.id); -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/windows/focused" -``` - - -Returns 404 if no window currently has focus. - -### Window management - -Focus, move, and resize windows by their X11 window ID. - - -```ts TypeScript -const { windows } = await sdk.listDesktopWindows(); -const win = windows[0]; - -// Bring window to foreground -await sdk.focusDesktopWindow(win.id); - -// Move window -await sdk.moveDesktopWindow(win.id, { x: 100, y: 50 }); - -// Resize window -await sdk.resizeDesktopWindow(win.id, { width: 1280, height: 720 }); -``` - -```bash cURL -# Focus a window -curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/focus" - -# Move a window -curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/move" \ - -H "Content-Type: application/json" \ - -d '{"x":100,"y":50}' - -# Resize a window -curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/resize" \ - -H "Content-Type: application/json" \ - -d '{"width":1280,"height":720}' -``` - - -All three endpoints return the updated window info so you can verify the operation took effect. The window manager may adjust the requested position or size. 
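Because the window manager may adjust a requested position or size, it is worth comparing the requested geometry against the window info the endpoint returns. A minimal sketch (the `resizeApplied` helper and its tolerance are hypothetical, not part of the SDK):

```typescript
// Hypothetical helper: decide whether a resize request was honored,
// allowing a small tolerance for window-manager frame snapping.
interface Geometry {
  width: number;
  height: number;
}

function resizeApplied(requested: Geometry, actual: Geometry, tolerance = 2): boolean {
  return (
    Math.abs(requested.width - actual.width) <= tolerance &&
    Math.abs(requested.height - actual.height) <= tolerance
  );
}

// WM snapped the height slightly: still counts as applied.
console.log(resizeApplied({ width: 1280, height: 720 }, { width: 1280, height: 718 })); // true
// WM ignored the width change entirely: not applied.
console.log(resizeApplied({ width: 1280, height: 720 }, { width: 1024, height: 720 })); // false
```

In practice you would pass the object returned by `resizeDesktopWindow` as `actual` and retry or adjust if the check fails.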
- -## App launching - -Launch applications or open files/URLs on the desktop without needing to shell out. - - -```ts TypeScript -// Launch an app by name -const result = await sdk.launchDesktopApp({ - app: "firefox", - args: ["--private"], -}); -console.log(result.processId); // "proc_7" - -// Launch and wait for the window to appear -const withWindow = await sdk.launchDesktopApp({ - app: "xterm", - wait: true, -}); -console.log(withWindow.windowId); // "12345" or null if timed out - -// Open a URL with the default handler -const opened = await sdk.openDesktopTarget({ - target: "https://example.com", -}); -console.log(opened.processId); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/launch" \ - -H "Content-Type: application/json" \ - -d '{"app":"firefox","args":["--private"]}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/launch" \ - -H "Content-Type: application/json" \ - -d '{"app":"xterm","wait":true}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/open" \ - -H "Content-Type: application/json" \ - -d '{"target":"https://example.com"}' -``` - - -The returned `processId` can be used with the [Process API](/processes) to read logs (`GET /v1/processes/{id}/logs`) or stop the application (`POST /v1/processes/{id}/stop`). - -When `wait` is `true`, the API polls for up to 5 seconds for a window to appear. If the window appears, its ID is returned in `windowId`. If it times out, `windowId` is `null` but the process is still running. - - -**Launch/Open vs the Process API:** Both `launch` and `open` are convenience wrappers around the [Process API](/processes). They create managed processes (with `owner: "desktop"`) that you can inspect, log, and stop through the same Process endpoints. The difference is that `launch` validates the binary exists in PATH first and can optionally wait for a window to appear, while `open` delegates to the system default handler (`xdg-open`). 
Use the Process API directly when you need full control over command, environment, working directory, or restart policies. - - -## Recording - -Record the desktop to MP4. - - -```ts TypeScript -const recording = await sdk.startDesktopRecording({ fps: 30 }); -console.log(recording.id); - -// ... do things ... - -const stopped = await sdk.stopDesktopRecording(); - -// List all recordings -const { recordings } = await sdk.listDesktopRecordings(); - -// Download -const mp4 = await sdk.downloadDesktopRecording(recording.id); - -// Clean up -await sdk.deleteDesktopRecording(recording.id); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/recording/start" \ - -H "Content-Type: application/json" \ - -d '{"fps":30}' - -curl -X POST "http://127.0.0.1:2468/v1/desktop/recording/stop" - -curl "http://127.0.0.1:2468/v1/desktop/recordings" - -curl "http://127.0.0.1:2468/v1/desktop/recordings/rec_1/download" --output recording.mp4 - -curl -X DELETE "http://127.0.0.1:2468/v1/desktop/recordings/rec_1" -``` - - -## Desktop processes - -The desktop runtime manages several background processes (Xvfb, openbox, neko, ffmpeg). These are all registered with the general [Process API](/processes) under the `desktop` owner, so you can inspect logs, check status, and troubleshoot using the same tools you use for any other managed process. 
- - -```ts TypeScript -// List all processes, including desktop-owned ones -const { processes } = await sdk.listProcesses(); - -const desktopProcs = processes.filter((p) => p.owner === "desktop"); -for (const p of desktopProcs) { - console.log(p.id, p.command, p.status); -} - -// Read logs from a specific desktop process -const logs = await sdk.getProcessLogs(desktopProcs[0].id, { tail: 50 }); -for (const entry of logs.entries) { - console.log(entry.stream, atob(entry.data)); -} -``` - -```bash cURL -# List all processes (desktop processes have owner: "desktop") -curl "http://127.0.0.1:2468/v1/processes" - -# Get logs from a specific desktop process -curl "http://127.0.0.1:2468/v1/processes/proc_1/logs?tail=50" -``` - - -The desktop status endpoint also includes a summary of running processes: - - -```ts TypeScript -const status = await sdk.getDesktopStatus(); -for (const proc of status.processes) { - console.log(proc.name, proc.pid, proc.running); -} -``` - -```bash cURL -curl "http://127.0.0.1:2468/v1/desktop/status" -# Response includes: processes: [{ name: "Xvfb", pid: 123, running: true }, ...] -``` - - -| Process | Role | Restart policy | -|---------|------|---------------| -| Xvfb | Virtual X11 framebuffer | Auto-restart while desktop is active | -| openbox | Window manager | Auto-restart while desktop is active | -| neko | WebRTC streaming server (started by `startDesktopStream`) | No auto-restart | -| ffmpeg | Screen recorder (started by `startDesktopRecording`) | No auto-restart | - -## Live streaming - -Start a WebRTC stream for real-time desktop viewing in a browser. 
- - -```ts TypeScript -await sdk.startDesktopStream(); - -// Check stream status -const status = await sdk.getDesktopStreamStatus(); -console.log(status.active); // true -console.log(status.processId); // "proc_5" - -// Connect via the React DesktopViewer component or -// use the WebSocket signaling endpoint directly -// at ws://127.0.0.1:2468/v1/desktop/stream/signaling - -await sdk.stopDesktopStream(); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/desktop/stream/start" - -# Check stream status -curl "http://127.0.0.1:2468/v1/desktop/stream/status" - -# Connect to ws://127.0.0.1:2468/v1/desktop/stream/signaling for WebRTC signaling - -curl -X POST "http://127.0.0.1:2468/v1/desktop/stream/stop" -``` - - -For a drop-in React component, see [React Components](/react-components). - -## API reference - -### Endpoints - -| Method | Path | Description | -|--------|------|-------------| -| `POST` | `/v1/desktop/start` | Start the desktop runtime | -| `POST` | `/v1/desktop/stop` | Stop the desktop runtime | -| `GET` | `/v1/desktop/status` | Get desktop runtime status | -| `GET` | `/v1/desktop/screenshot` | Capture full desktop screenshot | -| `GET` | `/v1/desktop/screenshot/region` | Capture a region screenshot | -| `GET` | `/v1/desktop/mouse/position` | Get current mouse position | -| `POST` | `/v1/desktop/mouse/move` | Move the mouse | -| `POST` | `/v1/desktop/mouse/click` | Click the mouse | -| `POST` | `/v1/desktop/mouse/down` | Press mouse button down | -| `POST` | `/v1/desktop/mouse/up` | Release mouse button | -| `POST` | `/v1/desktop/mouse/drag` | Drag from one point to another | -| `POST` | `/v1/desktop/mouse/scroll` | Scroll at a position | -| `POST` | `/v1/desktop/keyboard/type` | Type text | -| `POST` | `/v1/desktop/keyboard/press` | Press a key with optional modifiers | -| `POST` | `/v1/desktop/keyboard/down` | Press a key down (hold) | -| `POST` | `/v1/desktop/keyboard/up` | Release a key | -| `GET` | `/v1/desktop/display/info` | Get display 
info | -| `GET` | `/v1/desktop/windows` | List visible windows | -| `GET` | `/v1/desktop/windows/focused` | Get focused window info | -| `POST` | `/v1/desktop/windows/{id}/focus` | Focus a window | -| `POST` | `/v1/desktop/windows/{id}/move` | Move a window | -| `POST` | `/v1/desktop/windows/{id}/resize` | Resize a window | -| `GET` | `/v1/desktop/clipboard` | Read clipboard contents | -| `POST` | `/v1/desktop/clipboard` | Write to clipboard | -| `POST` | `/v1/desktop/launch` | Launch an application | -| `POST` | `/v1/desktop/open` | Open a file or URL | -| `POST` | `/v1/desktop/recording/start` | Start recording | -| `POST` | `/v1/desktop/recording/stop` | Stop recording | -| `GET` | `/v1/desktop/recordings` | List recordings | -| `GET` | `/v1/desktop/recordings/{id}` | Get recording metadata | -| `GET` | `/v1/desktop/recordings/{id}/download` | Download recording | -| `DELETE` | `/v1/desktop/recordings/{id}` | Delete recording | -| `POST` | `/v1/desktop/stream/start` | Start WebRTC streaming | -| `POST` | `/v1/desktop/stream/stop` | Stop WebRTC streaming | -| `GET` | `/v1/desktop/stream/status` | Get stream status | -| `GET` | `/v1/desktop/stream/signaling` | WebSocket for WebRTC signaling | - -### TypeScript SDK methods - -| Method | Returns | Description | -|--------|---------|-------------| -| `startDesktop(request?)` | `DesktopStatusResponse` | Start the desktop | -| `stopDesktop()` | `DesktopStatusResponse` | Stop the desktop | -| `getDesktopStatus()` | `DesktopStatusResponse` | Get desktop status | -| `takeDesktopScreenshot(query?)` | `Uint8Array` | Capture screenshot | -| `takeDesktopRegionScreenshot(query)` | `Uint8Array` | Capture region screenshot | -| `getDesktopMousePosition()` | `DesktopMousePositionResponse` | Get mouse position | -| `moveDesktopMouse(request)` | `DesktopMousePositionResponse` | Move mouse | -| `clickDesktop(request)` | `DesktopMousePositionResponse` | Click mouse | -| `mouseDownDesktop(request)` | `DesktopMousePositionResponse` | 
Mouse button down | -| `mouseUpDesktop(request)` | `DesktopMousePositionResponse` | Mouse button up | -| `dragDesktopMouse(request)` | `DesktopMousePositionResponse` | Drag mouse | -| `scrollDesktop(request)` | `DesktopMousePositionResponse` | Scroll | -| `typeDesktopText(request)` | `DesktopActionResponse` | Type text | -| `pressDesktopKey(request)` | `DesktopActionResponse` | Press key | -| `keyDownDesktop(request)` | `DesktopActionResponse` | Key down | -| `keyUpDesktop(request)` | `DesktopActionResponse` | Key up | -| `getDesktopDisplayInfo()` | `DesktopDisplayInfoResponse` | Get display info | -| `listDesktopWindows()` | `DesktopWindowListResponse` | List windows | -| `getDesktopFocusedWindow()` | `DesktopWindowInfo` | Get focused window | -| `focusDesktopWindow(id)` | `DesktopWindowInfo` | Focus a window | -| `moveDesktopWindow(id, request)` | `DesktopWindowInfo` | Move a window | -| `resizeDesktopWindow(id, request)` | `DesktopWindowInfo` | Resize a window | -| `getDesktopClipboard(query?)` | `DesktopClipboardResponse` | Read clipboard | -| `setDesktopClipboard(request)` | `DesktopActionResponse` | Write clipboard | -| `launchDesktopApp(request)` | `DesktopLaunchResponse` | Launch an app | -| `openDesktopTarget(request)` | `DesktopOpenResponse` | Open file/URL | -| `startDesktopRecording(request?)` | `DesktopRecordingInfo` | Start recording | -| `stopDesktopRecording()` | `DesktopRecordingInfo` | Stop recording | -| `listDesktopRecordings()` | `DesktopRecordingListResponse` | List recordings | -| `getDesktopRecording(id)` | `DesktopRecordingInfo` | Get recording | -| `downloadDesktopRecording(id)` | `Uint8Array` | Download recording | -| `deleteDesktopRecording(id)` | `void` | Delete recording | -| `startDesktopStream()` | `DesktopStreamStatusResponse` | Start streaming | -| `stopDesktopStream()` | `DesktopStreamStatusResponse` | Stop streaming | -| `getDesktopStreamStatus()` | `DesktopStreamStatusResponse` | Stream status | - -## Customizing the desktop 
environment - -The desktop runs inside the sandbox filesystem, so you can customize it using the [File System](/file-system) API before or after starting the desktop. The desktop HOME directory is located at `~/.local/state/sandbox-agent/desktop/home` (or `$XDG_STATE_HOME/sandbox-agent/desktop/home` if `XDG_STATE_HOME` is set). - -All configuration files below are written to paths relative to this HOME directory. - -### Window manager (openbox) - -The desktop uses [openbox](http://openbox.org/) as its window manager. You can customize its behavior, theme, and keyboard shortcuts by writing an `rc.xml` config file. - - -```ts TypeScript -const openboxConfig = `<?xml version="1.0" encoding="UTF-8"?> -<openbox_config xmlns="http://openbox.org/3.4/rc"> -  <theme> -    <name>Clearlooks</name> -    <titleLayout>NLIMC</titleLayout> -    <font place="ActiveWindow"> -      <name>DejaVu Sans</name> -      <size>10</size> -    </font> -  </theme> -  <desktops> -    <number>1</number> -  </desktops> -</openbox_config> -`; - -await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" }); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/rc.xml" }, - openboxConfig, -); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox" - -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/rc.xml" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @rc.xml -``` - - -### Autostart programs - -Openbox runs scripts in `~/.config/openbox/autostart` on startup. Use this to launch applications, set the background, or configure the environment.
- - -```ts TypeScript -const autostart = `#!/bin/sh -# Set a solid background color -xsetroot -solid "#1e1e2e" & - -# Launch a terminal -xterm -geometry 120x40+50+50 & - -# Launch a browser -firefox --no-remote & -`; - -await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" }); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" }, - autostart, -); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox" - -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @autostart.sh -``` - - - -The autostart script runs when openbox starts, which happens during `startDesktop()`. Write the autostart file before calling `startDesktop()` for it to take effect. - - -### Background - -There is no wallpaper set by default (the background is the X root window default). 
You can set it using `xsetroot` in the autostart script (as shown above), or use `feh` if you need an image: - - -```ts TypeScript -// Upload a wallpaper image -import fs from "node:fs"; - -const wallpaper = await fs.promises.readFile("./wallpaper.png"); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/wallpaper.png" }, - wallpaper, -); - -// Set the autostart to apply it -const autostart = `#!/bin/sh -feh --bg-fill ~/wallpaper.png & -`; - -await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" }); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" }, - autostart, -); -``` - -```bash cURL -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/wallpaper.png" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @wallpaper.png - -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @autostart.sh -``` - - - -`feh` is not installed by default. Install it via the [Process API](/processes) before starting the desktop: `await sdk.runProcess({ command: "apt-get", args: ["install", "-y", "feh"] })`. - - -### Fonts - -Only `fonts-dejavu-core` is installed by default. 
To add more fonts, install them with your system package manager or copy font files into the sandbox: - - -```ts TypeScript -// Install a font package -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "fonts-noto", "fonts-liberation"], -}); - -// Or copy a custom font file -import fs from "node:fs"; - -const font = await fs.promises.readFile("./CustomFont.ttf"); -await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.local/share/fonts" }); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.local/share/fonts/CustomFont.ttf" }, - font, -); - -// Rebuild the font cache -await sdk.runProcess({ command: "fc-cache", args: ["-fv"] }); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","fonts-noto","fonts-liberation"]}' - -curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.local/share/fonts" - -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.local/share/fonts/CustomFont.ttf" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @CustomFont.ttf - -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"fc-cache","args":["-fv"]}' -``` - - -### Cursor theme - - -```ts TypeScript -await sdk.runProcess({ - command: "apt-get", - args: ["install", "-y", "dmz-cursor-theme"], -}); - -const xresources = `Xcursor.theme: DMZ-White\nXcursor.size: 24\n`; -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.Xresources" }, - xresources, -); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ - -H "Content-Type: application/json" \ - -d '{"command":"apt-get","args":["install","-y","dmz-cursor-theme"]}' - -curl -X PUT 
"http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.Xresources" \ - -H "Content-Type: application/octet-stream" \ - --data-binary 'Xcursor.theme: DMZ-White\nXcursor.size: 24' -``` - - - -Run `xrdb -merge ~/.Xresources` (via the autostart or process API) after writing the file for changes to take effect. - - -### Shell and terminal - -No terminal emulator or shell is launched by default. Add one to the openbox autostart: - -```sh -# In ~/.config/openbox/autostart -xterm -geometry 120x40+50+50 & -``` - -To use a different shell, set the `SHELL` environment variable in your Dockerfile or install your preferred shell and configure the terminal to use it. - -### GTK theme - -Applications using GTK will pick up settings from `~/.config/gtk-3.0/settings.ini`: - - -```ts TypeScript -const gtkSettings = `[Settings] -gtk-theme-name=Adwaita -gtk-icon-theme-name=Adwaita -gtk-font-name=DejaVu Sans 10 -gtk-cursor-theme-name=DMZ-White -gtk-cursor-theme-size=24 -`; - -await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0" }); -await sdk.writeFsFile( - { path: "~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0/settings.ini" }, - gtkSettings, -); -``` - -```bash cURL -curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0" - -curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0/settings.ini" \ - -H "Content-Type: application/octet-stream" \ - --data-binary @settings.ini -``` - - -### Summary of configuration paths - -All paths are relative to the desktop HOME directory (`~/.local/state/sandbox-agent/desktop/home`). 
- -| What | Path | Notes | -|------|------|-------| -| Openbox config | `.config/openbox/rc.xml` | Window manager theme, keybindings, behavior | -| Autostart | `.config/openbox/autostart` | Shell script run on desktop start | -| Custom fonts | `.local/share/fonts/` | TTF/OTF files, run `fc-cache -fv` after | -| Cursor theme | `.Xresources` | Requires `xrdb -merge` to apply | -| GTK 3 settings | `.config/gtk-3.0/settings.ini` | Theme, icons, fonts for GTK apps | -| Wallpaper | Any path, referenced from autostart | Requires `feh` or similar tool | diff --git a/docs/credentials.mdx b/docs/credentials.mdx new file mode 100644 index 0000000..38bc7c4 --- /dev/null +++ b/docs/credentials.mdx @@ -0,0 +1,115 @@ +--- +title: "Credentials" +description: "How Sandbox Agent discovers and uses provider credentials." +--- + +Sandbox Agent discovers API credentials from environment variables and local agent config files. +These credentials are passed through to underlying agent runtimes. + +## Credential sources + +Credentials are discovered in priority order. 
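First-match-wins resolution over the sources listed below can be sketched as follows (the `resolveAnthropicKey` helper is illustrative only; the real implementation also consults agent config files):

```typescript
// Hypothetical sketch of first-match-wins credential discovery.
// Variable names mirror the environment-variable table in this section.
function resolveAnthropicKey(env: Record<string, string | undefined>): string | undefined {
  // Highest priority first: primary variable, then its fallback.
  for (const name of ["ANTHROPIC_API_KEY", "CLAUDE_API_KEY"]) {
    const value = env[name];
    if (value) return value;
  }
  // Lower-priority sources (agent config files) would be consulted here.
  return undefined;
}

console.log(resolveAnthropicKey({ CLAUDE_API_KEY: "sk-ant-fallback" })); // "sk-ant-fallback"
```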
+ +### Environment variables (highest priority) + +API keys first: + +| Variable | Provider | +|----------|----------| +| `ANTHROPIC_API_KEY` | Anthropic | +| `CLAUDE_API_KEY` | Anthropic fallback | +| `OPENAI_API_KEY` | OpenAI | +| `CODEX_API_KEY` | OpenAI fallback | + +OAuth tokens (used when OAuth extraction is enabled): + +| Variable | Provider | +|----------|----------| +| `CLAUDE_CODE_OAUTH_TOKEN` | Anthropic | +| `ANTHROPIC_AUTH_TOKEN` | Anthropic fallback | + +### Agent config files + +| Agent | Config path | Provider | +|-------|-------------|----------| +| Amp | `~/.amp/config.json` | Anthropic | +| Claude Code | `~/.claude.json`, `~/.claude/.credentials.json` | Anthropic | +| Codex | `~/.codex/auth.json` | OpenAI | +| OpenCode | `~/.local/share/opencode/auth.json` | Anthropic/OpenAI | + +## Provider requirements by agent + +| Agent | Required provider | +|-------|-------------------| +| Claude Code | Anthropic | +| Amp | Anthropic | +| Codex | OpenAI | +| OpenCode | Anthropic or OpenAI | +| Mock | None | + +## Error handling behavior + +Credential extraction is best-effort: + +- Missing or malformed files are skipped. +- Discovery continues to later sources. +- Missing credentials mark providers unavailable instead of failing server startup. + +When prompting, Sandbox Agent does not pre-validate provider credentials. Agent-native authentication errors surface through session events/output. + +## Checking credential status + +### API + +`sdk.listAgents()` includes `credentialsAvailable` per agent. + +```json +{ + "agents": [ + { + "id": "claude", + "installed": true, + "credentialsAvailable": true + }, + { + "id": "codex", + "installed": true, + "credentialsAvailable": false + } + ] +} +``` + +### TypeScript SDK + +```typescript +const result = await sdk.listAgents(); + +for (const agent of result.agents) { + console.log(`${agent.id}: ${agent.credentialsAvailable ? 
"authenticated" : "no credentials"}`); +} +``` + +## Passing credentials explicitly + +Set environment variables before starting Sandbox Agent: + +```bash +export ANTHROPIC_API_KEY=sk-ant-... +export OPENAI_API_KEY=sk-... +sandbox-agent daemon start +``` + +Or with SDK-managed local spawn: + +```typescript +import { SandboxAgent } from "sandbox-agent"; + +const sdk = await SandboxAgent.start({ + spawn: { + env: { + ANTHROPIC_API_KEY: process.env.MY_ANTHROPIC_KEY, + }, + }, +}); +``` diff --git a/docs/custom-tools.mdx b/docs/custom-tools.mdx index 2fb3e15..727fb02 100644 --- a/docs/custom-tools.mdx +++ b/docs/custom-tools.mdx @@ -80,7 +80,9 @@ await sdk.setMcpConfig( const session = await sdk.createSession({ agent: "claude", - cwd: "/workspace", + sessionInit: { + cwd: "/workspace", + }, }); await session.prompt([ @@ -143,7 +145,9 @@ await sdk.writeFsFile({ path: "/opt/skills/random-number/SKILL.md" }, skill); ```ts const session = await sdk.createSession({ agent: "claude", - cwd: "/workspace", + sessionInit: { + cwd: "/workspace", + }, }); await session.prompt([ diff --git a/docs/deploy/boxlite.mdx b/docs/deploy/boxlite.mdx index 8c02bb4..115d8b8 100644 --- a/docs/deploy/boxlite.mdx +++ b/docs/deploy/boxlite.mdx @@ -20,7 +20,7 @@ that BoxLite can load directly (BoxLite has its own image store separate from Do ```dockerfile FROM node:22-bookworm-slim RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/* -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh RUN sandbox-agent install-agent claude RUN sandbox-agent install-agent codex ``` diff --git a/docs/deploy/cloudflare.mdx b/docs/deploy/cloudflare.mdx index c0370e4..deca490 100644 --- a/docs/deploy/cloudflare.mdx +++ b/docs/deploy/cloudflare.mdx @@ -25,44 +25,13 @@ cd my-sandbox ```dockerfile FROM cloudflare/sandbox:0.7.0 -RUN curl -fsSL 
https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh RUN sandbox-agent install-agent claude && sandbox-agent install-agent codex EXPOSE 8000 ``` -## TypeScript example (with provider) - -For standalone scripts, use the `cloudflare` provider: - -```bash -npm install sandbox-agent@0.4.x @cloudflare/sandbox -``` - -```typescript -import { SandboxAgent } from "sandbox-agent"; -import { cloudflare } from "sandbox-agent/cloudflare"; - -const sdk = await SandboxAgent.start({ - sandbox: cloudflare(), -}); - -try { - const session = await sdk.createSession({ agent: "codex" }); - const response = await session.prompt([ - { type: "text", text: "Summarize this repository" }, - ]); - console.log(response.stopReason); -} finally { - await sdk.destroySandbox(); -} -``` - -The `cloudflare` provider uses `containerFetch` under the hood, automatically stripping `AbortSignal` to avoid dropped streaming updates. - -## TypeScript example (Durable Objects) - -For Workers with Durable Objects, use `SandboxAgent.connect(...)` with a custom `fetch` backed by `sandbox.containerFetch(...)`: +## TypeScript example ```typescript import { getSandbox, type Sandbox } from "@cloudflare/sandbox"; @@ -140,6 +109,7 @@ app.all("*", (c) => c.env.ASSETS.fetch(c.req.raw)); export default app; ``` +Create the SDK client inside the Worker using custom `fetch` backed by `sandbox.containerFetch(...)`. This keeps all Sandbox Agent calls inside the Cloudflare sandbox routing path and does not require a `baseUrl`. ## Troubleshooting streaming updates diff --git a/docs/deploy/computesdk.mdx b/docs/deploy/computesdk.mdx index 601d9c7..5e07da0 100644 --- a/docs/deploy/computesdk.mdx +++ b/docs/deploy/computesdk.mdx @@ -1,66 +1,160 @@ --- title: "ComputeSDK" -description: "Deploy Sandbox Agent using ComputeSDK's provider-agnostic sandbox API." +description: "Deploy the daemon using ComputeSDK's provider-agnostic sandbox API." 
--- -[ComputeSDK](https://computesdk.com) provides a unified interface for managing sandboxes across multiple providers. Write once, deploy anywhere by changing environment variables. +[ComputeSDK](https://computesdk.com) provides a unified interface for managing sandboxes across multiple providers. Write once, deploy anywhere—switch providers by changing environment variables. ## Prerequisites - `COMPUTESDK_API_KEY` from [console.computesdk.com](https://console.computesdk.com) - Provider API key (one of: `E2B_API_KEY`, `DAYTONA_API_KEY`, `VERCEL_TOKEN`, `MODAL_TOKEN_ID` + `MODAL_TOKEN_SECRET`, `BLAXEL_API_KEY`, `CSB_API_KEY`) -- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` +- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` for the coding agents -## TypeScript example - -```bash -npm install sandbox-agent@0.4.x computesdk -``` +## TypeScript Example ```typescript +import { + compute, + detectProvider, + getMissingEnvVars, + getProviderConfigFromEnv, + isProviderAuthComplete, + isValidProvider, + PROVIDER_NAMES, + type ExplicitComputeConfig, + type ProviderName, +} from "computesdk"; import { SandboxAgent } from "sandbox-agent"; -import { computesdk } from "sandbox-agent/computesdk"; +const PORT = 3000; +const REQUEST_TIMEOUT_MS = + Number.parseInt(process.env.COMPUTESDK_TIMEOUT_MS || "", 10) || 120_000; + +/** + * Detects and validates the provider to use. + * Priority: COMPUTESDK_PROVIDER env var > auto-detection from API keys + */ +function resolveProvider(): ProviderName { + const providerOverride = process.env.COMPUTESDK_PROVIDER; + + if (providerOverride) { + if (!isValidProvider(providerOverride)) { + throw new Error( + `Unsupported provider "${providerOverride}". Supported: ${PROVIDER_NAMES.join(", ")}` + ); + } + if (!isProviderAuthComplete(providerOverride)) { + const missing = getMissingEnvVars(providerOverride); + throw new Error( + `Missing credentials for "${providerOverride}". 
Set: ${missing.join(", ")}`
+      );
+    }
+    return providerOverride as ProviderName;
+  }
+
+  const detected = detectProvider();
+  if (!detected) {
+    throw new Error(
+      `No provider credentials found. Set one of: ${PROVIDER_NAMES.map((p) => getMissingEnvVars(p).join(", ")).join(" | ")}`
+    );
+  }
+  return detected as ProviderName;
+}
+
+function configureComputeSDK(): void {
+  const provider = resolveProvider();
+
+  const config: ExplicitComputeConfig = {
+    provider,
+    computesdkApiKey: process.env.COMPUTESDK_API_KEY,
+    requestTimeoutMs: REQUEST_TIMEOUT_MS,
+  };
+
+  // Add provider-specific config from environment
+  const providerConfig = getProviderConfigFromEnv(provider);
+  if (Object.keys(providerConfig).length > 0) {
+    (config as any)[provider] = providerConfig;
+  }
+
+  compute.setConfig(config);
+}
+
+configureComputeSDK();
+
+// Build environment variables to pass to sandbox
 const envs: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
 
-const sdk = await SandboxAgent.start({
-  sandbox: computesdk({
-    create: {
-      envs,
-      image: process.env.COMPUTESDK_IMAGE,
-      templateId: process.env.COMPUTESDK_TEMPLATE_ID,
-    },
-  }),
+// Create sandbox
+const sandbox = await compute.sandbox.create({
+  envs: Object.keys(envs).length > 0 ?
envs : undefined, }); -try { - const session = await sdk.createSession({ agent: "claude" }); - const response = await session.prompt([ - { type: "text", text: "Summarize this repository" }, - ]); - console.log(response.stopReason); -} finally { - await sdk.destroySandbox(); +// Helper to run commands with error handling +const run = async (cmd: string, options?: { background?: boolean }) => { + const result = await sandbox.runCommand(cmd, options); + if (typeof result?.exitCode === "number" && result.exitCode !== 0) { + throw new Error(`Command failed: ${cmd} (exit ${result.exitCode})\n${result.stderr || ""}`); + } + return result; +}; + +// Install sandbox-agent +await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh"); + +// Install agents conditionally based on available API keys +if (envs.ANTHROPIC_API_KEY) { + await run("sandbox-agent install-agent claude"); +} +if (envs.OPENAI_API_KEY) { + await run("sandbox-agent install-agent codex"); } -``` -The `computesdk` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. ComputeSDK routes to your configured provider behind the scenes. -The `create` option now forwards the full ComputeSDK sandbox-create payload, including provider-specific fields such as `image` and `templateId` when the selected provider supports them. 
+// Start the server in the background +await run(`sandbox-agent server --no-token --host 0.0.0.0 --port ${PORT}`, { background: true }); -Before calling `SandboxAgent.start()`, configure ComputeSDK with your provider: +// Get the public URL for the sandbox +const baseUrl = await sandbox.getUrl({ port: PORT }); -```typescript -import { compute } from "computesdk"; +// Wait for server to be ready +const deadline = Date.now() + REQUEST_TIMEOUT_MS; +while (Date.now() < deadline) { + try { + const response = await fetch(`${baseUrl}/v1/health`); + if (response.ok) { + const data = await response.json(); + if (data?.status === "ok") break; + } + } catch { + // Server not ready yet + } + await new Promise((r) => setTimeout(r, 500)); +} -compute.setConfig({ - provider: "e2b", // or auto-detect via detectProvider() - computesdkApiKey: process.env.COMPUTESDK_API_KEY, +// Connect to the server +const client = await SandboxAgent.connect({ baseUrl }); + +// Detect which agent to use based on available API keys +const agent = envs.ANTHROPIC_API_KEY ? "claude" : "codex"; + +// Create a session and start coding +await client.createSession("my-session", { agent }); + +await client.postMessage("my-session", { + message: "Summarize this repository", }); + +for await (const event of client.streamEvents("my-session")) { + console.log(event.type, event.data); +} + +// Cleanup +await sandbox.destroy(); ``` -## Supported providers +## Supported Providers ComputeSDK auto-detects your provider from environment variables: @@ -75,7 +169,46 @@ ComputeSDK auto-detects your provider from environment variables: ## Notes -- **Provider resolution**: Set `COMPUTESDK_PROVIDER` to force a specific provider, or let ComputeSDK auto-detect from API keys. +- **Provider resolution order**: `COMPUTESDK_PROVIDER` env var takes priority, otherwise auto-detection from API keys. +- **Conditional agent installation**: Only agents with available API keys are installed, reducing setup time. 
+- **Command error handling**: The example validates exit codes and throws on failures for easier debugging. - `sandbox.runCommand(..., { background: true })` keeps the server running while your app continues. - `sandbox.getUrl({ port })` returns a public URL for the sandbox port. -- Always destroy the sandbox when done to avoid leaking resources. +- Always destroy the sandbox when you are done to avoid leaking resources. +- If sandbox creation times out, set `COMPUTESDK_TIMEOUT_MS` to a higher value (default: 120000ms). + +## Explicit Provider Selection + +To force a specific provider instead of auto-detection, set the `COMPUTESDK_PROVIDER` environment variable: + +```bash +export COMPUTESDK_PROVIDER=e2b +``` + +Or configure programmatically using `getProviderConfigFromEnv()`: + +```typescript +import { compute, getProviderConfigFromEnv, type ExplicitComputeConfig } from "computesdk"; + +const config: ExplicitComputeConfig = { + provider: "e2b", + computesdkApiKey: process.env.COMPUTESDK_API_KEY, + requestTimeoutMs: 120_000, +}; + +// Automatically populate provider-specific config from environment +const providerConfig = getProviderConfigFromEnv("e2b"); +if (Object.keys(providerConfig).length > 0) { + (config as any).e2b = providerConfig; +} + +compute.setConfig(config); +``` + +## Direct Mode (No ComputeSDK API Key) + +To bypass the ComputeSDK gateway and use provider SDKs directly, see the provider-specific examples: + +- [E2B](/deploy/e2b) +- [Daytona](/deploy/daytona) +- [Vercel](/deploy/vercel) diff --git a/docs/deploy/daytona.mdx b/docs/deploy/daytona.mdx index e546bef..5eb8f5d 100644 --- a/docs/deploy/daytona.mdx +++ b/docs/deploy/daytona.mdx @@ -15,37 +15,40 @@ See [Daytona network limits](https://www.daytona.io/docs/en/network-limits/). 
 ## TypeScript example
 
-```bash
-npm install sandbox-agent@0.4.x @daytonaio/sdk
-```
-
 ```typescript
+import { Daytona } from "@daytonaio/sdk";
 import { SandboxAgent } from "sandbox-agent";
-import { daytona } from "sandbox-agent/daytona";
+
+const daytona = new Daytona();
 
 const envVars: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
 
-const sdk = await SandboxAgent.start({
-  sandbox: daytona({
-    create: { envVars },
-  }),
-});
+const sandbox = await daytona.create({ envVars });
 
-try {
-  const session = await sdk.createSession({ agent: "claude" });
-  const response = await session.prompt([
-    { type: "text", text: "Summarize this repository" },
-  ]);
-  console.log(response.stopReason);
-} finally {
-  await sdk.destroySandbox();
-}
+await sandbox.process.executeCommand(
+  "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"
+);
+
+await sandbox.process.executeCommand("sandbox-agent install-agent claude");
+await sandbox.process.executeCommand("sandbox-agent install-agent codex");
+
+await sandbox.process.executeCommand(
+  "nohup sandbox-agent server --no-token --host 0.0.0.0 --port 3000 >/tmp/sandbox-agent.log 2>&1 &"
+);
+
+await new Promise((r) => setTimeout(r, 2000));
+
+const baseUrl = (await sandbox.getSignedPreviewUrl(3000, 4 * 60 * 60)).url;
+const sdk = await SandboxAgent.connect({ baseUrl });
+
+const session = await sdk.createSession({ agent: "claude" });
+await session.prompt([{ type: "text", text: "Summarize this repository" }]);
+
+await sandbox.delete();
 ```
 
-The `daytona` provider uses the `rivetdev/sandbox-agent:0.4.2-full` image by default and starts the server automatically.
- ## Using snapshots for faster startup ```typescript @@ -61,7 +64,7 @@ if (!hasSnapshot) { name: SNAPSHOT, image: Image.base("ubuntu:22.04").runCommands( "apt-get update && apt-get install -y curl ca-certificates", - "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh", + "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh", "sandbox-agent install-agent claude", "sandbox-agent install-agent codex", ), diff --git a/docs/deploy/docker.mdx b/docs/deploy/docker.mdx index c5a3432..030ddc9 100644 --- a/docs/deploy/docker.mdx +++ b/docs/deploy/docker.mdx @@ -15,32 +15,11 @@ Run the published full image with all supported agents pre-installed: docker run --rm -p 3000:3000 \ -e ANTHROPIC_API_KEY="$ANTHROPIC_API_KEY" \ -e OPENAI_API_KEY="$OPENAI_API_KEY" \ - rivetdev/sandbox-agent:0.4.2-full \ + rivetdev/sandbox-agent:0.3.1-full \ server --no-token --host 0.0.0.0 --port 3000 ``` -The `0.4.2-full` tag pins the exact version. The moving `full` tag is also published for contributors who want the latest full image. - -If you also want the desktop API inside the container, install desktop dependencies before starting the server: - -```bash -docker run --rm -p 3000:3000 \ - -e ANTHROPIC_API_KEY="$ANTHROPIC_API_KEY" \ - -e OPENAI_API_KEY="$OPENAI_API_KEY" \ - node:22-bookworm-slim sh -c "\ - apt-get update && \ - DEBIAN_FRONTEND=noninteractive apt-get install -y curl ca-certificates bash libstdc++6 && \ - rm -rf /var/lib/apt/lists/* && \ - curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && \ - sandbox-agent install desktop --yes && \ - sandbox-agent server --no-token --host 0.0.0.0 --port 3000" -``` - -In a Dockerfile: - -```dockerfile -RUN sandbox-agent install desktop --yes -``` +The `0.3.1-full` tag pins the exact version. The moving `full` tag is also published for contributors who want the latest full image. 
 ## TypeScript with dockerode
@@ -52,7 +31,7 @@ const docker = new Docker();
 const PORT = 3000;
 
 const container = await docker.createContainer({
-  Image: "rivetdev/sandbox-agent:0.4.2-full",
+  Image: "rivetdev/sandbox-agent:0.3.1-full",
   Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", `${PORT}`],
   Env: [
     `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}`,
@@ -86,7 +65,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
     bash ca-certificates curl git && \
     rm -rf /var/lib/apt/lists/*
 
-RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && \
+RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh && \
     sandbox-agent install-agent --all
 
 RUN useradd -m -s /bin/bash sandbox
diff --git a/docs/deploy/e2b.mdx b/docs/deploy/e2b.mdx
index 225cfdc..8ea4c74 100644
--- a/docs/deploy/e2b.mdx
+++ b/docs/deploy/e2b.mdx
@@ -10,43 +10,43 @@ description: "Deploy Sandbox Agent inside an E2B sandbox."
 
 ## TypeScript example
 
-```bash
-npm install sandbox-agent@0.4.x @e2b/code-interpreter
-```
-
 ```typescript
+import { Sandbox } from "@e2b/code-interpreter";
 import { SandboxAgent } from "sandbox-agent";
-import { e2b } from "sandbox-agent/e2b";
 
 const envs: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
 
-const template = process.env.E2B_TEMPLATE;
-const sdk = await SandboxAgent.start({
-  sandbox: e2b({
-    template,
-    create: { envs },
-  }),
+const sandbox = await Sandbox.create({ allowInternetAccess: true, envs });
+
+await sandbox.commands.run(
+  "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"
+);
+
+await sandbox.commands.run("sandbox-agent install-agent claude");
+await sandbox.commands.run("sandbox-agent install-agent codex");
+
+await sandbox.commands.run(
+  "sandbox-agent server --no-token --host 0.0.0.0 --port 3000",
+  { background: true,
timeoutMs: 0 } +); + +const baseUrl = `https://${sandbox.getHost(3000)}`; +const sdk = await SandboxAgent.connect({ baseUrl }); + +const session = await sdk.createSession({ agent: "claude" }); +const off = session.onEvent((event) => { + console.log(event.sender, event.payload); }); -try { - const session = await sdk.createSession({ agent: "claude" }); - const response = await session.prompt([ - { type: "text", text: "Summarize this repository" }, - ]); - console.log(response.stopReason); -} finally { - await sdk.destroySandbox(); -} +await session.prompt([{ type: "text", text: "Summarize this repository" }]); +off(); + +await sandbox.kill(); ``` -The `e2b` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. Sandboxes pause by default instead of being deleted, and reconnecting with the same `sandboxId` resumes them automatically. - -Pass `template` when you want to start from a custom E2B template alias or template ID. E2B base-image selection happens when you build the template, then `sandbox-agent/e2b` uses that template at sandbox creation time. - ## Faster cold starts For faster startup, create a custom E2B template with Sandbox Agent and target agents pre-installed. -Build System 2.0 also lets you choose the template's base image in code. -See [E2B Custom Templates](https://e2b.dev/docs/sandbox-template) and [E2B Base Images](https://e2b.dev/docs/template/base-image). +See [E2B Custom Templates](https://e2b.dev/docs/sandbox-template). diff --git a/docs/deploy/foundry-self-hosting.mdx b/docs/deploy/foundry-self-hosting.mdx new file mode 100644 index 0000000..172d680 --- /dev/null +++ b/docs/deploy/foundry-self-hosting.mdx @@ -0,0 +1,153 @@ +--- +title: "Foundry Self-Hosting" +description: "Environment, credentials, and deployment setup for Sandbox Agent Foundry auth, GitHub, and billing." 
+--- + +This guide documents the deployment contract for the Foundry product surface: app auth, GitHub onboarding, repository import, and billing. + +It also covers the local-development bootstrap that uses `.env.development` only when `NODE_ENV=development`. + +## Local Development + +For backend local development, the Foundry backend now supports a development-only dotenv bootstrap: + +- It loads `.env.development.local` and `.env.development` +- It does this **only** when `NODE_ENV=development` +- It does **not** load dotenv files in production + +The example file lives at [`/.env.development.example`](https://github.com/rivet-dev/sandbox-agent/blob/main/.env.development.example). + +To use it locally: + +```bash +cp .env.development.example .env.development +``` + +Run the backend with: + +```bash +just foundry-backend-start +``` + +That recipe sets `NODE_ENV=development`, which enables the dotenv loader. + +### Local Defaults + +These values can be safely defaulted for local development: + +- `APP_URL=http://localhost:4173` +- `BETTER_AUTH_URL=http://localhost:7741` +- `BETTER_AUTH_SECRET=sandbox-agent-foundry-development-only-change-me` +- `GITHUB_REDIRECT_URI=http://localhost:7741/v1/auth/callback/github` + +These should be treated as development-only values. + +## Production Environment + +For production or self-hosting, set these as real environment variables in your deployment platform. Do not rely on dotenv file loading. 
+
+### App/Auth
+
+| Variable | Required | Notes |
+|---|---:|---|
+| `APP_URL` | Yes | Public frontend origin |
+| `BETTER_AUTH_URL` | Yes | Public auth base URL |
+| `BETTER_AUTH_SECRET` | Yes | Strong random secret for auth/session signing |
+
+### GitHub OAuth
+
+| Variable | Required | Notes |
+|---|---:|---|
+| `GITHUB_CLIENT_ID` | Yes | GitHub OAuth app client id |
+| `GITHUB_CLIENT_SECRET` | Yes | GitHub OAuth app client secret |
+| `GITHUB_REDIRECT_URI` | Yes | GitHub OAuth callback URL |
+
+Use GitHub OAuth for:
+
+- user sign-in
+- user identity
+- org selection
+- access to the signed-in user’s GitHub context
+
+## GitHub App
+
+If your Foundry deployment uses GitHub App-backed organization install and repo import, also configure:
+
+| Variable | Required | Notes |
+|---|---:|---|
+| `GITHUB_APP_ID` | Yes | GitHub App id |
+| `GITHUB_APP_CLIENT_ID` | Yes | GitHub App client id |
+| `GITHUB_APP_CLIENT_SECRET` | Yes | GitHub App client secret |
+| `GITHUB_APP_PRIVATE_KEY` | Yes | PEM private key for installation auth |
+
+For `.env.development` and `.env.development.local`, store `GITHUB_APP_PRIVATE_KEY` as a quoted single-line value with `\n` escapes instead of raw multi-line PEM text.
+
+Recommended GitHub App permissions:
+
+- Repository `Metadata: Read`
+- Repository `Contents: Read & Write`
+- Repository `Pull requests: Read & Write`
+- Repository `Checks: Read`
+- Repository `Commit statuses: Read`
+
+Set the webhook URL to `https://<backend-host>/v1/webhooks/github` and generate a webhook secret. Store the secret as `GITHUB_WEBHOOK_SECRET`.
+ +Recommended webhook subscriptions: + +- `installation` +- `installation_repositories` +- `pull_request` +- `pull_request_review` +- `pull_request_review_comment` +- `push` +- `create` +- `delete` +- `check_suite` +- `check_run` +- `status` + +Use the GitHub App for: + +- installation/reconnect state +- org repo import +- repository sync +- PR creation and updates + +Use GitHub OAuth for: + +- who the user is +- which orgs they can choose + +## Stripe + +For live billing, configure: + +| Variable | Required | Notes | +|---|---:|---| +| `STRIPE_SECRET_KEY` | Yes | Server-side Stripe secret key | +| `STRIPE_PUBLISHABLE_KEY` | Yes | Client-side Stripe publishable key | +| `STRIPE_WEBHOOK_SECRET` | Yes | Signing secret for billing webhooks | +| `STRIPE_PRICE_TEAM` | Yes | Stripe price id for the Team plan checkout session | + +Stripe should own: + +- hosted checkout +- billing portal +- subscription status +- invoice history +- webhook-driven state sync + +## Mock Invariant + +Foundry’s mock client path should continue to work end to end even when the real auth/GitHub/Stripe path exists. + +That includes: + +- sign-in +- org selection/import +- settings +- billing UI +- workspace/task/session flow +- seat accrual + +Use mock mode for deterministic UI review and local product development. Use the real env-backed path for integration and self-hosting. diff --git a/docs/deploy/local.mdx b/docs/deploy/local.mdx index 6ecdb09..eab8f3f 100644 --- a/docs/deploy/local.mdx +++ b/docs/deploy/local.mdx @@ -9,7 +9,7 @@ For local development, run Sandbox Agent directly on your machine. 
```bash # Install -curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh +curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh # Run sandbox-agent server --no-token --host 127.0.0.1 --port 2468 @@ -20,27 +20,24 @@ Or with npm/Bun: ```bash - npx @sandbox-agent/cli@0.4.x server --no-token --host 127.0.0.1 --port 2468 + npx @sandbox-agent/cli@0.3.x server --no-token --host 127.0.0.1 --port 2468 ``` ```bash - bunx @sandbox-agent/cli@0.4.x server --no-token --host 127.0.0.1 --port 2468 + bunx @sandbox-agent/cli@0.3.x server --no-token --host 127.0.0.1 --port 2468 ``` ## With the TypeScript SDK -The SDK can spawn and manage the server as a subprocess using the `local` provider: +The SDK can spawn and manage the server as a subprocess: ```typescript import { SandboxAgent } from "sandbox-agent"; -import { local } from "sandbox-agent/local"; -const sdk = await SandboxAgent.start({ - sandbox: local(), -}); +const sdk = await SandboxAgent.start(); const session = await sdk.createSession({ agent: "claude", @@ -50,21 +47,7 @@ await session.prompt([ { type: "text", text: "Summarize this repository." }, ]); -await sdk.destroySandbox(); +await sdk.dispose(); ``` This starts the server on an available local port and connects automatically. - -Pass options to customize the local provider: - -```typescript -const sdk = await SandboxAgent.start({ - sandbox: local({ - port: 3000, - log: "inherit", - env: { - ANTHROPIC_API_KEY: process.env.MY_ANTHROPIC_KEY, - }, - }), -}); -``` diff --git a/docs/deploy/modal.mdx b/docs/deploy/modal.mdx deleted file mode 100644 index 5850fd8..0000000 --- a/docs/deploy/modal.mdx +++ /dev/null @@ -1,55 +0,0 @@ ---- -title: "Modal" -description: "Deploy Sandbox Agent inside a Modal sandbox." 
---- - -## Prerequisites - -- `MODAL_TOKEN_ID` and `MODAL_TOKEN_SECRET` from [modal.com/settings](https://modal.com/settings) -- `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` - -## TypeScript example - -```bash -npm install sandbox-agent@0.4.x modal -``` - -```typescript -import { SandboxAgent } from "sandbox-agent"; -import { modal } from "sandbox-agent/modal"; - -const secrets: Record = {}; -if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; -if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY; -const baseImage = process.env.MODAL_BASE_IMAGE ?? "node:22-slim"; - -const sdk = await SandboxAgent.start({ - sandbox: modal({ - image: baseImage, - create: { secrets }, - }), -}); - -try { - const session = await sdk.createSession({ agent: "claude" }); - const response = await session.prompt([ - { type: "text", text: "Summarize this repository" }, - ]); - console.log(response.stopReason); -} finally { - await sdk.destroySandbox(); -} -``` - -The `modal` provider handles app creation, image building, sandbox provisioning, agent installation, server startup, and tunnel networking automatically. -Set `image` to change the base Docker image before Sandbox Agent and its agent binaries are layered on top. You can also pass a prebuilt Modal `Image` object. - -## Faster cold starts - -Modal caches image layers, so the Dockerfile commands that install `curl` and `sandbox-agent` only run on the first build. Subsequent sandbox creates reuse the cached image. - -## Notes - -- Modal sandboxes use [gVisor](https://gvisor.dev/) for strong isolation. -- Ports are exposed via encrypted tunnels (`encryptedPorts`). The provider uses `sb.tunnels()` to get the public HTTPS URL. -- Environment variables (API keys) are passed as Modal [Secrets](https://modal.com/docs/guide/secrets) for security. 
diff --git a/docs/deploy/vercel.mdx b/docs/deploy/vercel.mdx
index ec931d8..2025d67 100644
--- a/docs/deploy/vercel.mdx
+++ b/docs/deploy/vercel.mdx
@@ -10,39 +10,51 @@ description: "Deploy Sandbox Agent inside a Vercel Sandbox."
 
 ## TypeScript example
 
-```bash
-npm install sandbox-agent@0.4.x @vercel/sandbox
-```
-
 ```typescript
+import { Sandbox } from "@vercel/sandbox";
 import { SandboxAgent } from "sandbox-agent";
-import { vercel } from "sandbox-agent/vercel";
 
-const env: Record<string, string> = {};
-if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+const envs: Record<string, string> = {};
+if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
 
-const sdk = await SandboxAgent.start({
-  sandbox: vercel({
-    create: {
-      runtime: "node24",
-      env,
-    },
-  }),
+const sandbox = await Sandbox.create({
+  runtime: "node24",
+  ports: [3000],
 });
 
-try {
-  const session = await sdk.createSession({ agent: "claude" });
-  const response = await session.prompt([
-    { type: "text", text: "Summarize this repository" },
-  ]);
-  console.log(response.stopReason);
-} finally {
-  await sdk.destroySandbox();
-}
-```
+const run = async (cmd: string, args: string[] = []) => {
+  const result = await sandbox.runCommand({ cmd, args, env: envs });
+  if (result.exitCode !== 0) {
+    throw new Error(`Command failed: ${cmd} ${args.join(" ")}`);
+  }
+};
 
-The `vercel` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically.
+await run("sh", ["-c", "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"]); +await run("sandbox-agent", ["install-agent", "claude"]); +await run("sandbox-agent", ["install-agent", "codex"]); + +await sandbox.runCommand({ + cmd: "sandbox-agent", + args: ["server", "--no-token", "--host", "0.0.0.0", "--port", "3000"], + env: envs, + detached: true, +}); + +const baseUrl = sandbox.domain(3000); +const sdk = await SandboxAgent.connect({ baseUrl }); + +const session = await sdk.createSession({ agent: "claude" }); + +const off = session.onEvent((event) => { + console.log(event.sender, event.payload); +}); + +await session.prompt([{ type: "text", text: "Summarize this repository" }]); +off(); + +await sandbox.stop(); +``` ## Authentication diff --git a/docs/docs.json b/docs/docs.json index dbcc407..a6c2087 100644 --- a/docs/docs.json +++ b/docs/docs.json @@ -1,6 +1,6 @@ { "$schema": "https://mintlify.com/docs.json", - "theme": "mint", + "theme": "willow", "name": "Sandbox Agent SDK", "appearance": { "default": "dark", @@ -8,8 +8,8 @@ }, "colors": { "primary": "#ff4f00", - "light": "#ff6a2a", - "dark": "#cc3f00" + "light": "#ff4f00", + "dark": "#ff4f00" }, "favicon": "/favicon.svg", "logo": { @@ -25,13 +25,17 @@ }, "navbar": { "links": [ + { + "label": "Gigacode", + "icon": "terminal", + "href": "https://github.com/rivet-dev/sandbox-agent/tree/main/gigacode" + }, { "label": "Discord", "icon": "discord", "href": "https://discord.gg/auCecybynK" }, { - "label": "GitHub", "type": "github", "href": "https://github.com/rivet-dev/sandbox-agent" } @@ -47,55 +51,46 @@ "pages": [ "quickstart", "sdk-overview", - "llm-credentials", "react-components", { "group": "Deploy", "icon": "server", "pages": [ "deploy/local", + "deploy/computesdk", "deploy/e2b", "deploy/daytona", "deploy/vercel", "deploy/cloudflare", "deploy/docker", - "deploy/modal", - "deploy/boxlite", - "deploy/computesdk" + "deploy/boxlite" ] } ] }, { "group": "Agent", - "pages": [ - 
"agent-sessions", - { - "group": "Agents", - "icon": "robot", - "pages": ["agents/claude", "agents/codex", "agents/opencode", "agents/cursor", "agents/amp", "agents/pi"] - }, - "attachments", - "skills-config", - "mcp-config", - "custom-tools" - ] + "pages": ["agent-sessions", "attachments", "skills-config", "mcp-config", "custom-tools"] }, { "group": "System", - "pages": ["file-system", "processes", "computer-use", "common-software"] + "pages": ["file-system", "processes"] + }, + { + "group": "Orchestration", + "pages": ["architecture", "session-persistence", "observability", "multiplayer", "security"] }, { "group": "Reference", "pages": [ - "troubleshooting", - "architecture", + "agent-capabilities", "cli", "inspector", "opencode-compatibility", { "group": "More", "pages": [ + "credentials", "daemon", "cors", "session-restoration", @@ -120,11 +115,5 @@ ] } ] - }, - "__removed": [ - { - "group": "Orchestration", - "pages": ["orchestration-architecture", "session-persistence", "observability", "multiplayer", "security"] - } - ] + } } diff --git a/docs/gigacode.mdx b/docs/gigacode.mdx new file mode 100644 index 0000000..ccc9e39 --- /dev/null +++ b/docs/gigacode.mdx @@ -0,0 +1,6 @@ +--- +title: Gigacode +url: "https://github.com/rivet-dev/sandbox-agent/tree/main/gigacode" +--- + + diff --git a/docs/inspector.mdx b/docs/inspector.mdx index 1412c21..cc5f3d0 100644 --- a/docs/inspector.mdx +++ b/docs/inspector.mdx @@ -35,7 +35,6 @@ console.log(url); - Prompt testing - Request/response debugging - Interactive permission prompts (approve, always-allow, or reject tool-use requests) -- Desktop panel for status, remediation, start/stop, and screenshot refresh - Process management (create, stop, kill, delete, view logs) - Interactive PTY terminal for tty processes - One-shot command execution @@ -51,16 +50,3 @@ console.log(url); The Inspector includes an embedded Ghostty-based terminal for interactive tty processes. 
The UI uses the SDK's high-level `connectProcessTerminal(...)` wrapper via the shared `@sandbox-agent/react` `ProcessTerminal` component. - -## Desktop panel - -The `Desktop` panel shows the current desktop runtime state, missing dependencies, -the suggested install command, last error details, process/log paths, and the -latest captured screenshot. - -Use it to: - -- Check whether desktop dependencies are installed -- Start or stop the managed desktop runtime -- Refresh desktop status -- Capture a fresh screenshot on demand diff --git a/docs/llm-credentials.mdx b/docs/llm-credentials.mdx deleted file mode 100644 index e771740..0000000 --- a/docs/llm-credentials.mdx +++ /dev/null @@ -1,250 +0,0 @@ ---- -title: "LLM Credentials" -description: "Strategies for providing LLM provider credentials to agents." -icon: "key" ---- - -Sandbox Agent needs LLM provider credentials (Anthropic, OpenAI, etc.) to run agent sessions. - -## Configuration - -Pass credentials via `spawn.env` when starting a sandbox. Each call to `SandboxAgent.start()` can use different credentials: - -```typescript -import { SandboxAgent } from "sandbox-agent"; - -const sdk = await SandboxAgent.start({ - spawn: { - env: { - ANTHROPIC_API_KEY: "sk-ant-...", - OPENAI_API_KEY: "sk-...", - }, - }, -}); -``` - -Each agent requires credentials from a specific provider. 
Sandbox Agent checks environment variables (including those passed via `spawn.env`) and host config files: - -| Agent | Provider | Environment variables | Config files | -|-------|----------|----------------------|--------------| -| Claude Code | Anthropic | `ANTHROPIC_API_KEY`, `CLAUDE_API_KEY` | `~/.claude.json`, `~/.claude/.credentials.json` | -| Amp | Anthropic | `ANTHROPIC_API_KEY`, `CLAUDE_API_KEY` | `~/.amp/config.json` | -| Codex | OpenAI | `OPENAI_API_KEY`, `CODEX_API_KEY` | `~/.codex/auth.json` | -| OpenCode | Anthropic or OpenAI | `ANTHROPIC_API_KEY`, `OPENAI_API_KEY` | `~/.local/share/opencode/auth.json` | -| Mock | None | - | - | - -## Credential strategies - -LLM credentials are passed into the sandbox as environment variables. The agent and everything inside the sandbox has access to the token, so it's important to choose the right strategy for how you provision and scope these credentials. - -| Strategy | Who pays | Cost attribution | Best for | -|----------|----------|-----------------|----------| -| **Per-tenant gateway** (recommended) | Your organization, billed back per tenant | Per-tenant keys with budgets | Multi-tenant SaaS, usage-based billing | -| **Bring your own key** | Each user (usage-based) | Per-user by default | Dev environments, internal tools | -| **Shared API key** | Your organization | None (single bill) | Single-tenant apps, internal platforms | -| **Personal subscription** | Each user (existing subscription) | Per-user by default | Local dev, internal tools where users have Claude or Codex subscriptions | - -### Per-tenant gateway (recommended) - -Route LLM traffic through a gateway that mints per-tenant API keys, each with its own spend tracking and budget limits. - -```mermaid -graph LR - B[Your Backend] -->|tenant key| S[Sandbox] - S -->|LLM requests| G[Gateway] - G -->|scoped key| P[LLM Provider] -``` - -Your backend issues a scoped key per tenant, then passes it to the sandbox. 
This is the typical pattern when using sandbox providers (E2B, Daytona, Docker).
-
-```typescript expandable
-import { SandboxAgent } from "sandbox-agent";
-
-async function createTenantSandbox(tenantId: string) {
-  // Issue a scoped key for this tenant via OpenRouter
-  const res = await fetch("https://openrouter.ai/api/v1/keys", {
-    method: "POST",
-    headers: {
-      Authorization: `Bearer ${process.env.OPENROUTER_PROVISIONING_KEY}`,
-      "Content-Type": "application/json",
-    },
-    body: JSON.stringify({
-      name: `tenant-${tenantId}`,
-      limit: 50,
-      limitResetType: "monthly",
-    }),
-  });
-  const { key } = await res.json();
-
-  // Start a sandbox with the tenant's scoped key
-  const sdk = await SandboxAgent.start({
-    spawn: {
-      env: {
-        OPENAI_API_KEY: key, // OpenRouter uses OpenAI-compatible endpoints
-      },
-    },
-  });
-
-  const session = await sdk.createSession({
-    agent: "claude",
-    sessionInit: { cwd: "/workspace" },
-  });
-
-  return { sdk, session };
-}
-```
-
-#### Security
-
-Recommended for multi-tenant applications. Each tenant gets a scoped key with its own budget, so exfiltration only exposes that tenant's allowance.
-
-#### Use cases
-
-- **Multi-tenant SaaS**: per-tenant spend tracking and budget limits
-- **Production apps**: exposed to end users who need isolated credentials
-- **Usage-based billing**: each tenant pays for their own consumption
-
-#### Choosing a gateway
-
-
-
-
-
-Managed service, zero infrastructure. [OpenRouter](https://openrouter.ai/docs/features/provisioning-api-keys) provides per-tenant API keys with spend tracking and budget limits via their Provisioning API. Pass the tenant key to Sandbox Agent as `OPENAI_API_KEY` (OpenRouter uses OpenAI-compatible endpoints).
-
-```bash
-# Create a key for a tenant with a $50/month budget
-curl https://openrouter.ai/api/v1/keys \
-  -H "Authorization: Bearer $PROVISIONING_KEY" \
-  -H "Content-Type: application/json" \
-  -d '{
-    "name": "tenant-acme",
-    "limit": 50,
-    "limitResetType": "monthly"
-  }'
-```
-
-Easiest to set up but not open-source. See [OpenRouter pricing](https://openrouter.ai/docs/framework/pricing) for details.
-
-
-
-
-Self-hosted, open-source (MIT). [LiteLLM](https://github.com/BerriAI/litellm) is an OpenAI-compatible proxy with hierarchical budgets (org, team, user, key), virtual keys, and spend tracking. Requires Python + PostgreSQL.
-
-```bash
-# Create a team (tenant) with a $500 budget
-curl http://litellm:4000/team/new \
-  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
-  -H "Content-Type: application/json" \
-  -d '{
-    "team_alias": "tenant-acme",
-    "max_budget": 500
-  }'
-
-# Generate a key for that team
-curl http://litellm:4000/key/generate \
-  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
-  -H "Content-Type: application/json" \
-  -d '{
-    "team_id": "team-abc123",
-    "max_budget": 100
-  }'
-```
-
-Full control with no vendor lock-in. Organization-level features require an enterprise license.
-
-
-
-
-Self-hosted, open-source (Apache 2.0). [Portkey](https://github.com/Portkey-AI/gateway) is a lightweight OpenAI-compatible gateway supporting 200+ providers. Single binary, no database required. Create virtual keys with per-tenant budget limits and pass them to Sandbox Agent.
-
-Lightest operational footprint of the self-hosted options. Observability and analytics require the managed platform or your own tooling.
-
-
-
-
-To bill tenants for LLM usage, use [Stripe token billing](https://docs.stripe.com/billing/token-billing) (integrates natively with OpenRouter) or query your gateway's spend API and feed usage into your billing system.
-
-### Bring your own key
-
-Each user provides their own API key.
Users are billed directly by the LLM provider with no additional infrastructure needed.
-
-Pass the user's key via `spawn.env`:
-
-```typescript
-const sdk = await SandboxAgent.start({
-  spawn: {
-    env: {
-      ANTHROPIC_API_KEY: userProvidedKey,
-    },
-  },
-});
-```
-
-#### Security
-
-API keys are typically long-lived. The key is visible to the agent and anything running inside the sandbox, so exfiltration is possible. This is usually acceptable for developer-facing tools where the user owns the key.
-
-#### Use cases
-
-- **Developer tools**: each user manages their own API key
-- **Internal platforms**: users already have LLM provider accounts
-- **Per-user billing**: no extra infrastructure needed
-
-### Shared credentials
-
-A single organization-wide API key is used for all sessions. All token usage appears on one bill with no per-user or per-tenant cost attribution.
-
-```typescript
-const sdk = await SandboxAgent.start({
-  spawn: {
-    env: {
-      ANTHROPIC_API_KEY: process.env.ORG_ANTHROPIC_KEY!,
-      OPENAI_API_KEY: process.env.ORG_OPENAI_KEY!,
-    },
-  },
-});
-```
-
-If you need to track or limit spend per tenant, use a per-tenant gateway instead.
-
-#### Security
-
-Not recommended for anything other than internal tooling. A single exfiltrated key exposes your organization's entire LLM budget. If you need org-paid credentials for external users, use a per-tenant gateway with scoped keys instead.
-
-#### Use cases
-
-- **Single-tenant apps**: small number of users, one bill
-- **Prototyping**: cost attribution not needed yet
-- **Simplicity over security**: acceptable when exfiltration risk is low
-
-### Personal subscription
-
-If the user is signed into Claude Code or Codex on the host machine, Sandbox Agent automatically picks up their OAuth tokens. No configuration is needed.
-
-#### Remote sandboxes
-
-Extract credentials locally and pass them to a remote sandbox via `spawn.env`:
-
-```bash
-$ sandbox-agent credentials extract-env
-ANTHROPIC_API_KEY=sk-ant-...
-CLAUDE_API_KEY=sk-ant-...
-OPENAI_API_KEY=sk-...
-CODEX_API_KEY=sk-...
-```
-
-Use `-e` to prefix with `export` for shell sourcing.
-
-#### Security
-
-Personal subscriptions use OAuth tokens with a limited lifespan. These are the same credentials used when running an agent normally on the host. If a token is exfiltrated from the sandbox, the exposure window is short.
-
-#### Use cases
-
-- **Local development**: users are already signed into Claude Code or Codex
-- **Internal tools**: every user has their own subscription
-- **Prototyping**: no key management needed
\ No newline at end of file
diff --git a/docs/mcp-config.mdx b/docs/mcp-config.mdx
index cc1c976..71e8105 100644
--- a/docs/mcp-config.mdx
+++ b/docs/mcp-config.mdx
@@ -27,7 +27,9 @@ await sdk.setMcpConfig(
 // Create a session using the configured MCP servers
 const session = await sdk.createSession({
   agent: "claude",
-  cwd: "/workspace",
+  sessionInit: {
+    cwd: "/workspace",
+  },
 });
 
 await session.prompt([
diff --git a/docs/multiplayer.mdx b/docs/multiplayer.mdx
index 215bb1c..4f405ea 100644
--- a/docs/multiplayer.mdx
+++ b/docs/multiplayer.mdx
@@ -20,40 +20,8 @@ Use [actor keys](https://rivet.dev/docs/actors/keys) to map each workspace to on
 ```ts Actor (server)
 import { actor, setup } from "rivetkit";
-import { SandboxAgent, type SessionPersistDriver, type SessionRecord, type SessionEvent, type ListPageRequest, type ListPage, type ListEventsRequest } from "sandbox-agent";
-
-interface RivetPersistData { sessions: Record<string, SessionRecord>; events: Record<string, SessionEvent[]>; }
-type RivetPersistState = { _sandboxAgentPersist: RivetPersistData };
-
-class RivetSessionPersistDriver implements SessionPersistDriver {
-  private readonly stateKey: string;
-  private readonly ctx: { state: Record<string, unknown> };
-  constructor(ctx: { state: Record<string, unknown> }, options: { stateKey?: string } = {}) {
-    this.ctx = ctx;
-    this.stateKey = options.stateKey ??
"_sandboxAgentPersist"; - if (!this.ctx.state[this.stateKey]) { - this.ctx.state[this.stateKey] = { sessions: {}, events: {} }; - } - } - private get data(): RivetPersistData { return this.ctx.state[this.stateKey] as RivetPersistData; } - async getSession(id: string) { const s = this.data.sessions[id]; return s ? { ...s } : undefined; } - async listSessions(request: ListPageRequest = {}): Promise> { - const sorted = Object.values(this.data.sessions).sort((a, b) => a.createdAt - b.createdAt || a.id.localeCompare(b.id)); - const offset = Number(request.cursor ?? 0); - const limit = request.limit ?? 100; - const slice = sorted.slice(offset, offset + limit); - return { items: slice, nextCursor: offset + slice.length < sorted.length ? String(offset + slice.length) : undefined }; - } - async updateSession(session: SessionRecord) { this.data.sessions[session.id] = { ...session }; if (!this.data.events[session.id]) this.data.events[session.id] = []; } - async listEvents(request: ListEventsRequest): Promise> { - const all = [...(this.data.events[request.sessionId] ?? [])].sort((a, b) => a.eventIndex - b.eventIndex || a.id.localeCompare(b.id)); - const offset = Number(request.cursor ?? 0); - const limit = request.limit ?? 100; - const slice = all.slice(offset, offset + limit); - return { items: slice, nextCursor: offset + slice.length < all.length ? String(offset + slice.length) : undefined }; - } - async insertEvent(sessionId: string, event: SessionEvent) { const events = this.data.events[sessionId] ?? []; events.push({ ...event, payload: JSON.parse(JSON.stringify(event.payload)) }); this.data.events[sessionId] = events; } -} +import { SandboxAgent } from "sandbox-agent"; +import { RivetSessionPersistDriver, type RivetPersistState } from "@sandbox-agent/persist-rivet"; type WorkspaceState = RivetPersistState & { sandboxId: string; @@ -143,5 +111,5 @@ await conn.prompt({ ## Notes - Keep sandbox calls actor-only. Browser clients should not call Sandbox Agent directly. 
-- Copy the Rivet persist driver from the example above into your project so session history persists in actor state. +- Use `@sandbox-agent/persist-rivet` so session history persists in actor state. - For client connection patterns, see [Rivet JavaScript client](https://rivet.dev/docs/clients/javascript). diff --git a/docs/openapi.json b/docs/openapi.json index 3624707..f2bd640 100644 --- a/docs/openapi.json +++ b/docs/openapi.json @@ -10,7 +10,7 @@ "license": { "name": "Apache-2.0" }, - "version": "0.4.2" + "version": "0.3.2" }, "servers": [ { @@ -628,1814 +628,6 @@ } } }, - "/v1/desktop/clipboard": { - "get": { - "tags": ["v1"], - "summary": "Read the desktop clipboard.", - "description": "Returns the current text content of the X11 clipboard.", - "operationId": "get_v1_desktop_clipboard", - "parameters": [ - { - "name": "selection", - "in": "query", - "required": false, - "schema": { - "type": "string", - "nullable": true - } - } - ], - "responses": { - "200": { - "description": "Clipboard contents", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopClipboardResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "500": { - "description": "Clipboard read failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - }, - "post": { - "tags": ["v1"], - "summary": "Write to the desktop clipboard.", - "description": "Sets the text content of the X11 clipboard.", - "operationId": "post_v1_desktop_clipboard", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopClipboardWriteRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Clipboard updated", - "content": { - "application/json": { - "schema": { - "$ref": 
"#/components/schemas/DesktopActionResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "500": { - "description": "Clipboard write failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/display/info": { - "get": { - "tags": ["v1"], - "summary": "Get desktop display information.", - "description": "Performs a health-gated display query against the managed desktop and\nreturns the current display identifier and resolution.", - "operationId": "get_v1_desktop_display_info", - "responses": { - "200": { - "description": "Desktop display information", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopDisplayInfoResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "503": { - "description": "Desktop runtime health or display query failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/keyboard/down": { - "post": { - "tags": ["v1"], - "summary": "Press and hold a desktop keyboard key.", - "description": "Performs a health-gated `xdotool keydown` operation against the managed\ndesktop.", - "operationId": "post_v1_desktop_keyboard_down", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopKeyboardDownRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop keyboard action result", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopActionResponse" - } - } - } - }, - "400": { - "description": "Invalid 
keyboard down request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/keyboard/press": { - "post": { - "tags": ["v1"], - "summary": "Press a desktop keyboard shortcut.", - "description": "Performs a health-gated `xdotool key` operation against the managed\ndesktop.", - "operationId": "post_v1_desktop_keyboard_press", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopKeyboardPressRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop keyboard action result", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopActionResponse" - } - } - } - }, - "400": { - "description": "Invalid keyboard press request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/keyboard/type": { - "post": { - "tags": ["v1"], - "summary": "Type desktop keyboard text.", - "description": "Performs a health-gated `xdotool type` operation against the managed\ndesktop.", - "operationId": "post_v1_desktop_keyboard_type", - "requestBody": { - 
"content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopKeyboardTypeRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop keyboard action result", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopActionResponse" - } - } - } - }, - "400": { - "description": "Invalid keyboard type request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/keyboard/up": { - "post": { - "tags": ["v1"], - "summary": "Release a desktop keyboard key.", - "description": "Performs a health-gated `xdotool keyup` operation against the managed\ndesktop.", - "operationId": "post_v1_desktop_keyboard_up", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopKeyboardUpRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop keyboard action result", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopActionResponse" - } - } - } - }, - "400": { - "description": "Invalid keyboard up request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": 
{ - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/launch": { - "post": { - "tags": ["v1"], - "summary": "Launch a desktop application.", - "description": "Launches an application by name on the managed desktop, optionally waiting\nfor its window to appear.", - "operationId": "post_v1_desktop_launch", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopLaunchRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Application launched", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopLaunchResponse" - } - } - } - }, - "404": { - "description": "Application not found", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/click": { - "post": { - "tags": ["v1"], - "summary": "Click on the desktop.", - "description": "Performs a health-gated pointer move and click against the managed desktop\nand returns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_click", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseClickRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop mouse position after click", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse click request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - 
"application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/down": { - "post": { - "tags": ["v1"], - "summary": "Press and hold a desktop mouse button.", - "description": "Performs a health-gated optional pointer move followed by `xdotool mousedown`\nand returns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_down", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseDownRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop mouse position after button press", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse down request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/drag": { - "post": { - "tags": ["v1"], - "summary": "Drag the desktop mouse.", - "description": "Performs a health-gated drag gesture against the managed desktop and\nreturns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_drag", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseDragRequest" - } - } - }, - "required": true - }, - 
"responses": { - "200": { - "description": "Desktop mouse position after drag", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse drag request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/move": { - "post": { - "tags": ["v1"], - "summary": "Move the desktop mouse.", - "description": "Performs a health-gated absolute pointer move on the managed desktop and\nreturns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_move", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseMoveRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop mouse position after move", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse move request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - 
"/v1/desktop/mouse/position": { - "get": { - "tags": ["v1"], - "summary": "Get the current desktop mouse position.", - "description": "Performs a health-gated mouse position query against the managed desktop.", - "operationId": "get_v1_desktop_mouse_position", - "responses": { - "200": { - "description": "Desktop mouse position", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input check failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/scroll": { - "post": { - "tags": ["v1"], - "summary": "Scroll the desktop mouse wheel.", - "description": "Performs a health-gated scroll gesture at the requested coordinates and\nreturns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_scroll", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseScrollRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop mouse position after scroll", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse scroll request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": 
{ - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/mouse/up": { - "post": { - "tags": ["v1"], - "summary": "Release a desktop mouse button.", - "description": "Performs a health-gated optional pointer move followed by `xdotool mouseup`\nand returns the resulting mouse position.", - "operationId": "post_v1_desktop_mouse_up", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMouseUpRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop mouse position after button release", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopMousePositionResponse" - } - } - } - }, - "400": { - "description": "Invalid mouse up request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or input failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/open": { - "post": { - "tags": ["v1"], - "summary": "Open a file or URL with the default handler.", - "description": "Opens a file path or URL using xdg-open on the managed desktop.", - "operationId": "post_v1_desktop_open", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopOpenRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Target opened", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopOpenResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - 
"schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/recording/start": { - "post": { - "tags": ["v1"], - "summary": "Start desktop recording.", - "description": "Starts an ffmpeg x11grab recording against the managed desktop and returns\nthe created recording metadata.", - "operationId": "post_v1_desktop_recording_start", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopRecordingStartRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop recording started", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopRecordingInfo" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready or a recording is already active", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop recording failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/recording/stop": { - "post": { - "tags": ["v1"], - "summary": "Stop desktop recording.", - "description": "Stops the active desktop recording and returns the finalized recording\nmetadata.", - "operationId": "post_v1_desktop_recording_stop", - "responses": { - "200": { - "description": "Desktop recording stopped", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopRecordingInfo" - } - } - } - }, - "409": { - "description": "No active desktop recording", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop recording stop failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/recordings": { - 
"get": { - "tags": ["v1"], - "summary": "List desktop recordings.", - "description": "Returns the current desktop recording catalog.", - "operationId": "get_v1_desktop_recordings", - "responses": { - "200": { - "description": "Desktop recordings", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopRecordingListResponse" - } - } - } - }, - "502": { - "description": "Desktop recordings query failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/recordings/{id}": { - "get": { - "tags": ["v1"], - "summary": "Get desktop recording metadata.", - "description": "Returns metadata for a single desktop recording.", - "operationId": "get_v1_desktop_recording", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "Desktop recording ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "responses": { - "200": { - "description": "Desktop recording metadata", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopRecordingInfo" - } - } - } - }, - "404": { - "description": "Unknown desktop recording", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - }, - "delete": { - "tags": ["v1"], - "summary": "Delete a desktop recording.", - "description": "Removes a completed desktop recording and its file from disk.", - "operationId": "delete_v1_desktop_recording", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "Desktop recording ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "responses": { - "204": { - "description": "Desktop recording deleted" - }, - "404": { - "description": "Unknown desktop recording", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop 
recording is still active", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/recordings/{id}/download": { - "get": { - "tags": ["v1"], - "summary": "Download a desktop recording.", - "description": "Serves the recorded MP4 bytes for a completed desktop recording.", - "operationId": "get_v1_desktop_recording_download", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "Desktop recording ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "responses": { - "200": { - "description": "Desktop recording as MP4 bytes" - }, - "404": { - "description": "Unknown desktop recording", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/screenshot": { - "get": { - "tags": ["v1"], - "summary": "Capture a full desktop screenshot.", - "description": "Performs a health-gated full-frame screenshot of the managed desktop and\nreturns the requested image bytes.", - "operationId": "get_v1_desktop_screenshot", - "parameters": [ - { - "name": "format", - "in": "query", - "required": false, - "schema": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopScreenshotFormat" - } - ], - "nullable": true - } - }, - { - "name": "quality", - "in": "query", - "required": false, - "schema": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - } - }, - { - "name": "scale", - "in": "query", - "required": false, - "schema": { - "type": "number", - "format": "float", - "nullable": true - } - }, - { - "name": "showCursor", - "in": "query", - "required": false, - "schema": { - "type": "boolean", - "nullable": true - } - } - ], - "responses": { - "200": { - "description": "Desktop screenshot as image bytes" - }, - "400": { - "description": "Invalid screenshot query", - "content": { - "application/json": { - "schema": { - "$ref": 
"#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or screenshot capture failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/screenshot/region": { - "get": { - "tags": ["v1"], - "summary": "Capture a desktop screenshot region.", - "description": "Performs a health-gated screenshot crop against the managed desktop and\nreturns the requested region image bytes.", - "operationId": "get_v1_desktop_screenshot_region", - "parameters": [ - { - "name": "x", - "in": "query", - "required": true, - "schema": { - "type": "integer", - "format": "int32" - } - }, - { - "name": "y", - "in": "query", - "required": true, - "schema": { - "type": "integer", - "format": "int32" - } - }, - { - "name": "width", - "in": "query", - "required": true, - "schema": { - "type": "integer", - "format": "int32", - "minimum": 0 - } - }, - { - "name": "height", - "in": "query", - "required": true, - "schema": { - "type": "integer", - "format": "int32", - "minimum": 0 - } - }, - { - "name": "format", - "in": "query", - "required": false, - "schema": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopScreenshotFormat" - } - ], - "nullable": true - } - }, - { - "name": "quality", - "in": "query", - "required": false, - "schema": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - } - }, - { - "name": "scale", - "in": "query", - "required": false, - "schema": { - "type": "number", - "format": "float", - "nullable": true - } - }, - { - "name": "showCursor", - "in": "query", - "required": false, - "schema": { - "type": "boolean", - "nullable": true - } - } - ], - "responses": { - "200": { - "description": "Desktop screenshot region as image 
bytes" - }, - "400": { - "description": "Invalid screenshot region", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop runtime health or screenshot capture failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/start": { - "post": { - "tags": ["v1"], - "summary": "Start the private desktop runtime.", - "description": "Lazily launches the managed Xvfb/openbox stack, validates display health,\nand returns the resulting desktop status snapshot.", - "operationId": "post_v1_desktop_start", - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStartRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Desktop runtime status after start", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStatusResponse" - } - } - } - }, - "400": { - "description": "Invalid desktop start request", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is already transitioning", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "501": { - "description": "Desktop API unsupported on this platform", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "503": { - "description": "Desktop runtime could not be started", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, 
- "/v1/desktop/status": { - "get": { - "tags": ["v1"], - "summary": "Get desktop runtime status.", - "description": "Returns the current desktop runtime state, dependency status, active\ndisplay metadata, and supervised process information.", - "operationId": "get_v1_desktop_status", - "responses": { - "200": { - "description": "Desktop runtime status", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStatusResponse" - } - } - } - }, - "401": { - "description": "Authentication required", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/stop": { - "post": { - "tags": ["v1"], - "summary": "Stop the private desktop runtime.", - "description": "Terminates the managed openbox/Xvfb/dbus processes owned by the desktop\nruntime and returns the resulting status snapshot.", - "operationId": "post_v1_desktop_stop", - "responses": { - "200": { - "description": "Desktop runtime status after stop", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStatusResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is already transitioning", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/stream/signaling": { - "get": { - "tags": ["v1"], - "summary": "Open a desktop WebRTC signaling session.", - "description": "Upgrades the connection to a WebSocket used for WebRTC signaling between\nthe browser client and the desktop streaming process. 
Also accepts mouse\nand keyboard input frames as a fallback transport.", - "operationId": "get_v1_desktop_stream_ws", - "parameters": [ - { - "name": "access_token", - "in": "query", - "description": "Bearer token alternative for WS auth", - "required": false, - "schema": { - "type": "string", - "nullable": true - } - } - ], - "responses": { - "101": { - "description": "WebSocket upgraded" - }, - "409": { - "description": "Desktop runtime or streaming session is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "502": { - "description": "Desktop stream failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/stream/start": { - "post": { - "tags": ["v1"], - "summary": "Start desktop streaming.", - "description": "Enables desktop websocket streaming for the managed desktop.", - "operationId": "post_v1_desktop_stream_start", - "responses": { - "200": { - "description": "Desktop streaming started", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStreamStatusResponse" - } - } - } - } - } - } - }, - "/v1/desktop/stream/status": { - "get": { - "tags": ["v1"], - "summary": "Get desktop stream status.", - "description": "Returns the current state of the desktop WebRTC streaming session.", - "operationId": "get_v1_desktop_stream_status", - "responses": { - "200": { - "description": "Desktop stream status", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStreamStatusResponse" - } - } - } - } - } - } - }, - "/v1/desktop/stream/stop": { - "post": { - "tags": ["v1"], - "summary": "Stop desktop streaming.", - "description": "Disables desktop websocket streaming for the managed desktop.", - "operationId": "post_v1_desktop_stream_stop", - "responses": { - "200": { - "description": "Desktop streaming stopped", - 
"content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopStreamStatusResponse" - } - } - } - } - } - } - }, - "/v1/desktop/windows": { - "get": { - "tags": ["v1"], - "summary": "List visible desktop windows.", - "description": "Performs a health-gated visible-window enumeration against the managed\ndesktop and returns the current window metadata.", - "operationId": "get_v1_desktop_windows", - "responses": { - "200": { - "description": "Visible desktop windows", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowListResponse" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "503": { - "description": "Desktop runtime health or window query failed", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/windows/focused": { - "get": { - "tags": ["v1"], - "summary": "Get the currently focused desktop window.", - "description": "Returns information about the window that currently has input focus.", - "operationId": "get_v1_desktop_windows_focused", - "responses": { - "200": { - "description": "Focused window info", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowInfo" - } - } - } - }, - "404": { - "description": "No window is focused", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/windows/{id}/focus": { - "post": { - "tags": ["v1"], - "summary": "Focus a desktop window.", - "description": "Brings the specified window to the foreground and 
gives it input focus.", - "operationId": "post_v1_desktop_window_focus", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "X11 window ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "responses": { - "200": { - "description": "Window info after focus", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowInfo" - } - } - } - }, - "404": { - "description": "Window not found", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/windows/{id}/move": { - "post": { - "tags": ["v1"], - "summary": "Move a desktop window.", - "description": "Moves the specified window to the given position.", - "operationId": "post_v1_desktop_window_move", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "X11 window ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowMoveRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Window info after move", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowInfo" - } - } - } - }, - "404": { - "description": "Window not found", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, - "/v1/desktop/windows/{id}/resize": { - "post": { - "tags": ["v1"], - "summary": "Resize a desktop window.", - "description": 
"Resizes the specified window to the given dimensions.", - "operationId": "post_v1_desktop_window_resize", - "parameters": [ - { - "name": "id", - "in": "path", - "description": "X11 window ID", - "required": true, - "schema": { - "type": "string" - } - } - ], - "requestBody": { - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowResizeRequest" - } - } - }, - "required": true - }, - "responses": { - "200": { - "description": "Window info after resize", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/DesktopWindowInfo" - } - } - } - }, - "404": { - "description": "Window not found", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - }, - "409": { - "description": "Desktop runtime is not ready", - "content": { - "application/json": { - "schema": { - "$ref": "#/components/schemas/ProblemDetails" - } - } - } - } - } - } - }, "/v1/fs/entries": { "get": { "tags": ["v1"], @@ -2719,21 +911,6 @@ "summary": "List all managed processes.", "description": "Returns a list of all processes (running and exited) currently tracked\nby the runtime, sorted by process ID.", "operationId": "get_v1_processes", - "parameters": [ - { - "name": "owner", - "in": "query", - "required": false, - "schema": { - "allOf": [ - { - "$ref": "#/components/schemas/ProcessOwner" - } - ], - "nullable": true - } - } - ], "responses": { "200": { "description": "List processes", @@ -3757,769 +1934,6 @@ } } }, - "DesktopActionResponse": { - "type": "object", - "required": ["ok"], - "properties": { - "ok": { - "type": "boolean" - } - } - }, - "DesktopClipboardQuery": { - "type": "object", - "properties": { - "selection": { - "type": "string", - "nullable": true - } - } - }, - "DesktopClipboardResponse": { - "type": "object", - "required": ["text", "selection"], - "properties": { - "selection": { - "type": "string" - }, - "text": { - "type": "string" - } - } - }, - 
"DesktopClipboardWriteRequest": { - "type": "object", - "required": ["text"], - "properties": { - "selection": { - "type": "string", - "nullable": true - }, - "text": { - "type": "string" - } - } - }, - "DesktopDisplayInfoResponse": { - "type": "object", - "required": ["display", "resolution"], - "properties": { - "display": { - "type": "string" - }, - "resolution": { - "$ref": "#/components/schemas/DesktopResolution" - } - } - }, - "DesktopErrorInfo": { - "type": "object", - "required": ["code", "message"], - "properties": { - "code": { - "type": "string" - }, - "message": { - "type": "string" - } - } - }, - "DesktopKeyModifiers": { - "type": "object", - "properties": { - "alt": { - "type": "boolean", - "nullable": true - }, - "cmd": { - "type": "boolean", - "nullable": true - }, - "ctrl": { - "type": "boolean", - "nullable": true - }, - "shift": { - "type": "boolean", - "nullable": true - } - } - }, - "DesktopKeyboardDownRequest": { - "type": "object", - "required": ["key"], - "properties": { - "key": { - "type": "string" - } - } - }, - "DesktopKeyboardPressRequest": { - "type": "object", - "required": ["key"], - "properties": { - "key": { - "type": "string" - }, - "modifiers": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopKeyModifiers" - } - ], - "nullable": true - } - } - }, - "DesktopKeyboardTypeRequest": { - "type": "object", - "required": ["text"], - "properties": { - "delayMs": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "text": { - "type": "string" - } - } - }, - "DesktopKeyboardUpRequest": { - "type": "object", - "required": ["key"], - "properties": { - "key": { - "type": "string" - } - } - }, - "DesktopLaunchRequest": { - "type": "object", - "required": ["app"], - "properties": { - "app": { - "type": "string" - }, - "args": { - "type": "array", - "items": { - "type": "string" - }, - "nullable": true - }, - "wait": { - "type": "boolean", - "nullable": true - } - } - }, - "DesktopLaunchResponse": { - 
"type": "object", - "required": ["processId"], - "properties": { - "pid": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "processId": { - "type": "string" - }, - "windowId": { - "type": "string", - "nullable": true - } - } - }, - "DesktopMouseButton": { - "type": "string", - "enum": ["left", "middle", "right"] - }, - "DesktopMouseClickRequest": { - "type": "object", - "required": ["x", "y"], - "properties": { - "button": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopMouseButton" - } - ], - "nullable": true - }, - "clickCount": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopMouseDownRequest": { - "type": "object", - "properties": { - "button": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopMouseButton" - } - ], - "nullable": true - }, - "x": { - "type": "integer", - "format": "int32", - "nullable": true - }, - "y": { - "type": "integer", - "format": "int32", - "nullable": true - } - } - }, - "DesktopMouseDragRequest": { - "type": "object", - "required": ["startX", "startY", "endX", "endY"], - "properties": { - "button": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopMouseButton" - } - ], - "nullable": true - }, - "endX": { - "type": "integer", - "format": "int32" - }, - "endY": { - "type": "integer", - "format": "int32" - }, - "startX": { - "type": "integer", - "format": "int32" - }, - "startY": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopMouseMoveRequest": { - "type": "object", - "required": ["x", "y"], - "properties": { - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopMousePositionResponse": { - "type": "object", - "required": ["x", "y"], - "properties": { - "screen": { - "type": "integer", - "format": "int32", - "nullable": true - 
}, - "window": { - "type": "string", - "nullable": true - }, - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopMouseScrollRequest": { - "type": "object", - "required": ["x", "y"], - "properties": { - "deltaX": { - "type": "integer", - "format": "int32", - "nullable": true - }, - "deltaY": { - "type": "integer", - "format": "int32", - "nullable": true - }, - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopMouseUpRequest": { - "type": "object", - "properties": { - "button": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopMouseButton" - } - ], - "nullable": true - }, - "x": { - "type": "integer", - "format": "int32", - "nullable": true - }, - "y": { - "type": "integer", - "format": "int32", - "nullable": true - } - } - }, - "DesktopOpenRequest": { - "type": "object", - "required": ["target"], - "properties": { - "target": { - "type": "string" - } - } - }, - "DesktopOpenResponse": { - "type": "object", - "required": ["processId"], - "properties": { - "pid": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "processId": { - "type": "string" - } - } - }, - "DesktopProcessInfo": { - "type": "object", - "required": ["name", "running"], - "properties": { - "logPath": { - "type": "string", - "nullable": true - }, - "name": { - "type": "string" - }, - "pid": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "running": { - "type": "boolean" - } - } - }, - "DesktopRecordingInfo": { - "type": "object", - "required": ["id", "status", "fileName", "bytes", "startedAt"], - "properties": { - "bytes": { - "type": "integer", - "format": "int64", - "minimum": 0 - }, - "endedAt": { - "type": "string", - "nullable": true - }, - "fileName": { - "type": "string" - }, - "id": { - "type": "string" - }, - "processId": { - "type": "string", - "nullable": true - 
}, - "startedAt": { - "type": "string" - }, - "status": { - "$ref": "#/components/schemas/DesktopRecordingStatus" - } - } - }, - "DesktopRecordingListResponse": { - "type": "object", - "required": ["recordings"], - "properties": { - "recordings": { - "type": "array", - "items": { - "$ref": "#/components/schemas/DesktopRecordingInfo" - } - } - } - }, - "DesktopRecordingStartRequest": { - "type": "object", - "properties": { - "fps": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - } - } - }, - "DesktopRecordingStatus": { - "type": "string", - "enum": ["recording", "completed", "failed"] - }, - "DesktopRegionScreenshotQuery": { - "type": "object", - "required": ["x", "y", "width", "height"], - "properties": { - "format": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopScreenshotFormat" - } - ], - "nullable": true - }, - "height": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "quality": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "scale": { - "type": "number", - "format": "float", - "nullable": true - }, - "showCursor": { - "type": "boolean", - "nullable": true - }, - "width": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopResolution": { - "type": "object", - "required": ["width", "height"], - "properties": { - "dpi": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "height": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "width": { - "type": "integer", - "format": "int32", - "minimum": 0 - } - } - }, - "DesktopScreenshotFormat": { - "type": "string", - "enum": ["png", "jpeg", "webp"] - }, - "DesktopScreenshotQuery": { - "type": "object", - "properties": { - "format": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopScreenshotFormat" - } - ], - "nullable": true 
- }, - "quality": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "scale": { - "type": "number", - "format": "float", - "nullable": true - }, - "showCursor": { - "type": "boolean", - "nullable": true - } - } - }, - "DesktopStartRequest": { - "type": "object", - "properties": { - "displayNum": { - "type": "integer", - "format": "int32", - "nullable": true - }, - "dpi": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "height": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "recordingFps": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "stateDir": { - "type": "string", - "nullable": true - }, - "streamAudioCodec": { - "type": "string", - "nullable": true - }, - "streamFrameRate": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - }, - "streamVideoCodec": { - "type": "string", - "nullable": true - }, - "webrtcPortRange": { - "type": "string", - "nullable": true - }, - "width": { - "type": "integer", - "format": "int32", - "nullable": true, - "minimum": 0 - } - } - }, - "DesktopState": { - "type": "string", - "enum": ["inactive", "install_required", "starting", "active", "stopping", "failed"] - }, - "DesktopStatusResponse": { - "type": "object", - "required": ["state"], - "properties": { - "display": { - "type": "string", - "nullable": true - }, - "installCommand": { - "type": "string", - "nullable": true - }, - "lastError": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopErrorInfo" - } - ], - "nullable": true - }, - "missingDependencies": { - "type": "array", - "items": { - "type": "string" - } - }, - "processes": { - "type": "array", - "items": { - "$ref": "#/components/schemas/DesktopProcessInfo" - } - }, - "resolution": { - "allOf": [ - { - "$ref": "#/components/schemas/DesktopResolution" - } - ], - "nullable": true - }, - "runtimeLogPath": { - "type": "string", - "nullable": 
true - }, - "startedAt": { - "type": "string", - "nullable": true - }, - "state": { - "$ref": "#/components/schemas/DesktopState" - }, - "windows": { - "type": "array", - "items": { - "$ref": "#/components/schemas/DesktopWindowInfo" - }, - "description": "Current visible windows (included when the desktop is active)." - } - } - }, - "DesktopStreamStatusResponse": { - "type": "object", - "required": ["active"], - "properties": { - "active": { - "type": "boolean" - }, - "processId": { - "type": "string", - "nullable": true - }, - "windowId": { - "type": "string", - "nullable": true - } - } - }, - "DesktopWindowInfo": { - "type": "object", - "required": ["id", "title", "x", "y", "width", "height", "isActive"], - "properties": { - "height": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "id": { - "type": "string" - }, - "isActive": { - "type": "boolean" - }, - "title": { - "type": "string" - }, - "width": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopWindowListResponse": { - "type": "object", - "required": ["windows"], - "properties": { - "windows": { - "type": "array", - "items": { - "$ref": "#/components/schemas/DesktopWindowInfo" - } - } - } - }, - "DesktopWindowMoveRequest": { - "type": "object", - "required": ["x", "y"], - "properties": { - "x": { - "type": "integer", - "format": "int32" - }, - "y": { - "type": "integer", - "format": "int32" - } - } - }, - "DesktopWindowResizeRequest": { - "type": "object", - "required": ["width", "height"], - "properties": { - "height": { - "type": "integer", - "format": "int32", - "minimum": 0 - }, - "width": { - "type": "integer", - "format": "int32", - "minimum": 0 - } - } - }, "ErrorType": { "type": "string", "enum": [ @@ -4912,7 +2326,7 @@ }, "ProcessInfo": { "type": "object", - "required": ["id", "command", "args", "tty", "interactive", "owner", "status", 
"createdAtMs"], + "required": ["id", "command", "args", "tty", "interactive", "status", "createdAtMs"], "properties": { "args": { "type": "array", @@ -4947,9 +2361,6 @@ "interactive": { "type": "boolean" }, - "owner": { - "$ref": "#/components/schemas/ProcessOwner" - }, "pid": { "type": "integer", "format": "int32", @@ -4987,19 +2398,6 @@ } } }, - "ProcessListQuery": { - "type": "object", - "properties": { - "owner": { - "allOf": [ - { - "$ref": "#/components/schemas/ProcessOwner" - } - ], - "nullable": true - } - } - }, "ProcessListResponse": { "type": "object", "required": ["processes"], @@ -5086,10 +2484,6 @@ "type": "string", "enum": ["stdout", "stderr", "combined", "pty"] }, - "ProcessOwner": { - "type": "string", - "enum": ["user", "desktop", "system"] - }, "ProcessRunRequest": { "type": "object", "required": ["command"], diff --git a/docs/orchestration-architecture.mdx b/docs/orchestration-architecture.mdx deleted file mode 100644 index 08c776c..0000000 --- a/docs/orchestration-architecture.mdx +++ /dev/null @@ -1,43 +0,0 @@ ---- -title: "Orchestration Architecture" -description: "Production topology, backend requirements, and session persistence." -icon: "sitemap" ---- - -This page covers production topology and backend requirements. Read [Architecture](/architecture) first for an overview of how the server, SDK, and agent processes fit together. - -## Suggested Topology - -Run the SDK on your backend, then call it from your frontend. - -This extra hop is recommended because it keeps auth/token logic on the backend and makes persistence simpler. 
- -```mermaid placement="top-right" - flowchart LR - BROWSER["Browser"] - subgraph BACKEND["Your backend"] - direction TB - SDK["Sandbox Agent SDK"] - end - subgraph SANDBOX_SIMPLE["Sandbox"] - SERVER_SIMPLE["Sandbox Agent server"] - end - - BROWSER --> BACKEND - BACKEND --> SDK --> SERVER_SIMPLE -``` - -### Backend requirements - -Your backend layer needs to handle: - -- **Long-running connections**: prompts can take minutes. -- **Session affinity**: follow-up messages must reach the same session. -- **State between requests**: session metadata and event history must persist across requests. -- **Graceful recovery**: sessions should resume after backend restarts. - -We recommend [Rivet](https://rivet.dev) over serverless because actors natively support the long-lived connections, session routing, and state persistence that agent workloads require. - -## Session persistence - -For storage driver options and replay behavior, see [Persisting Sessions](/session-persistence). diff --git a/docs/pi-support-plan.md b/docs/pi-support-plan.md new file mode 100644 index 0000000..5e207a5 --- /dev/null +++ b/docs/pi-support-plan.md @@ -0,0 +1,210 @@ +# Pi Agent Support Plan (pi-mono) + +## Implementation Status Update + +- Runtime selection now supports two internal modes: + - `PerSession` (default for unknown/non-allowlisted Pi capabilities) + - `Shared` (allowlist-only compatibility path) +- Pi sessions now use per-session process isolation by default, enabling true concurrent Pi sessions in Inspector and API clients. +- Shared Pi server code remains available and is used only when capability checks allow multiplexing. +- Session termination for per-session Pi mode hard-kills the underlying Pi process and clears queued prompts/pending waiters. +- In-session concurrent sends are serialized with an unbounded daemon-side FIFO queue per session. 
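The "unbounded daemon-side FIFO queue per session" above can be sketched as a small state machine: prompts are enqueued without limit, dispatched strictly one at a time, and a hard-kill drops everything still queued. This is an illustrative model only — the class and method names are invented for the example and the real daemon implements this in Rust.

```typescript
// Per-session prompt queue: unbounded enqueue, strictly serialized dispatch.
// Distinct sessions get distinct PromptQueue instances, so cross-session
// concurrency is unaffected.
class PromptQueue {
  private queue: string[] = [];
  private inFlight = false;

  // Unbounded: enqueue always succeeds.
  enqueue(prompt: string): void {
    this.queue.push(prompt);
  }

  // Returns the next prompt to dispatch, or null if a prompt is already
  // in flight (serialization) or nothing is queued.
  next(): string | null {
    if (this.inFlight || this.queue.length === 0) return null;
    this.inFlight = true;
    return this.queue.shift()!;
  }

  // Called when the in-flight prompt finishes.
  complete(): void {
    this.inFlight = false;
  }

  // Hard-kill path: clear queued prompts and reset in-flight state.
  // Returns how many queued prompts were dropped.
  clear(): number {
    const dropped = this.queue.length;
    this.queue = [];
    this.inFlight = false;
    return dropped;
  }
}
```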
+ +## Investigation Summary + +### Pi CLI modes and RPC protocol +- Pi supports multiple modes including interactive, print/JSON output, RPC, and SDK usage. JSON mode outputs a stream of JSON events suitable for parsing, and RPC mode is intended for programmatic control over stdin/stdout. +- RPC mode is started with `pi --mode rpc` and supports options like `--provider`, `--model`, `--no-session`, and `--session-dir`. +- The RPC protocol is newline-delimited JSON over stdin/stdout: + - Commands are JSON objects written to stdin. + - Responses are JSON objects with `type: "response"` and optional `id`. + - Events are JSON objects without `id`. +- `prompt` can include images using `ImageContent` (base64 or URL) alongside text. +- JSON/print mode (`pi -p` or `pi --print --mode json`) produces JSONL for non-interactive parsing and can resume sessions with a token. + +### RPC commands +RPC commands listed in `rpc.md` include: +- `new_session`, `get_state`, `list_sessions`, `delete_session`, `rename_session`, `clear_session` +- `prompt`, `queue_message`, `abort`, `get_queued_messages` + +### RPC event types +RPC events listed in `rpc.md` include: +- `agent_start`, `agent_end` +- `turn_start`, `turn_end` +- `message_start`, `message_update`, `message_end` +- `tool_execution_start`, `tool_execution_update`, `tool_execution_end` +- `auto_compaction`, `auto_retry`, `hook_error` + +`message_update` uses `assistantMessageEvent` deltas such as: +- `start`, `text_start`, `text_delta`, `text_end` +- `thinking_start`, `thinking_delta`, `thinking_end` +- `toolcall_start`, `toolcall_delta`, `toolcall_end` +- `toolcall_args_start`, `toolcall_args_delta`, `toolcall_args_end` +- `done`, `error` + +`tool_execution_update` includes `partialResult`, which is described as accumulated output so far. 
+ +### Schema source locations (pi-mono) +RPC types are documented as living in: +- `packages/ai/src/types.ts` (Model types) +- `packages/agent/src/types.ts` (AgentResponse types) +- `packages/coding-agent/src/core/messages.ts` (message types) +- `packages/coding-agent/src/modes/rpc/rpc-types.ts` (RPC protocol types) + +### Distribution assets +Pi releases provide platform-specific binaries such as: +- `pi-darwin-arm64`, `pi-darwin-x64` +- `pi-linux-arm64`, `pi-linux-x64` +- `pi-win-x64.zip` + +## Integration Decisions +- Follow the OpenCode pattern: a shared long-running process (stdio RPC) with session multiplexing. +- Primary integration path is RPC streaming (`pi --mode rpc`). +- JSON/print mode is a fallback only (diagnostics or non-interactive runs). +- Create sessions via `new_session`; store the returned `sessionId` as `native_session_id`. +- Use `get_state` as a re-sync path after server restarts. +- Use `prompt` for send-message, with optional image content. +- Convert Pi events into universal events; emit daemon synthetic `session.started` on session creation and `session.ended` only on errors/termination. + +## Implementation Plan + +### 1) Agent Identity + Capabilities +Files: +- `server/packages/agent-management/src/agents.rs` +- `server/packages/sandbox-agent/src/router.rs` +- `docs/cli.mdx`, `docs/conversion.mdx`, `docs/session-transcript-schema.mdx` +- `README.md`, `frontend/packages/website/src/components/FAQ.tsx` + +Tasks: +- Add `AgentId::Pi` with string/binary name `"pi"` and parsing rules. +- Add Pi to `all_agents()` and agent lists. 
+- Define `AgentCapabilities` for Pi: + - `tool_calls=true`, `tool_results=true` + - `text_messages=true`, `streaming_deltas=true`, `item_started=true` + - `reasoning=true` (from `thinking_*` deltas) + - `images=true` (ImageContent in `prompt`) + - `permissions=false`, `questions=false`, `mcp_tools=false` + - `shared_process=true`, `session_lifecycle=false` (no native session events) + - `error_events=true` (hook_error) + - `command_execution=false`, `file_changes=false`, `file_attachments=false` + +### 2) Installer and Binary Resolution +Files: +- `server/packages/agent-management/src/agents.rs` + +Tasks: +- Add `install_pi()` that: + - Downloads the correct release asset per platform (`pi-`). + - Handles `.zip` on Windows and raw binaries elsewhere. + - Marks binary executable. +- Add Pi to `AgentManager::install`, `is_installed`, `version`. +- Version detection: try `--version`, `version`, `-V`. + +### 3) Schema Extraction for Pi +Files: +- `resources/agent-schemas/src/pi.ts` (new) +- `resources/agent-schemas/src/index.ts` +- `resources/agent-schemas/artifacts/json-schema/pi.json` +- `server/packages/extracted-agent-schemas/build.rs` +- `server/packages/extracted-agent-schemas/src/lib.rs` + +Tasks: +- Implement `extractPiSchema()`: + - Download pi-mono sources (zip/tarball) into a temp dir. + - Use `ts-json-schema-generator` against `packages/coding-agent/src/modes/rpc/rpc-types.ts`. + - Include dependent files per `rpc.md` (ai/types, agent/types, core/messages). + - Extract `RpcEvent`, `RpcResponse`, `RpcCommand` unions (exact type names from source). +- Add fallback schema if remote fetch fails (minimal union with event/response fields). +- Wire pi into extractor index and artifact generation. 
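One detail worth sketching before the conversion work below: `tool_execution_update.partialResult` carries accumulated output, so the converter must buffer per `toolCallId` and emit only the unseen suffix. A hedged TypeScript illustration follows; the actual implementation is planned in Rust (`pi.rs`), and the class/method names here are invented for the sketch.

```typescript
// Turn Pi's cumulative `partialResult` into incremental deltas,
// buffered per toolCallId (illustrative; real implementation is Rust).
class ToolOutputDeltaTracker {
  private seen = new Map<string, string>();

  // Returns only the portion of `partialResult` not yet emitted for this call.
  delta(toolCallId: string, partialResult: string): string {
    const prev = this.seen.get(toolCallId) ?? "";
    this.seen.set(toolCallId, partialResult);
    // Normal case: the new payload extends the previously seen output.
    if (partialResult.startsWith(prev)) return partialResult.slice(prev.length);
    // Defensive fallback: if the agent rewrites output, re-emit everything.
    return partialResult;
  }

  // Drop the buffer once tool_execution_end arrives.
  finish(toolCallId: string): void {
    this.seen.delete(toolCallId);
  }
}
```

The `finish` call matters for long sessions: without it, buffers for completed tool calls accumulate for the life of the RPC server process.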
+ +### 4) Universal Schema Conversion (Pi -> Universal) +Files: +- `server/packages/universal-agent-schema/src/agents/pi.rs` (new) +- `server/packages/universal-agent-schema/src/agents/mod.rs` +- `server/packages/universal-agent-schema/src/lib.rs` +- `server/packages/sandbox-agent/src/router.rs` + +Mapping rules: +- `message_start` -> `item.started` (kind=message, role=assistant, native_item_id=messageId) +- `message_update`: + - `text_*` -> `item.delta` (assistant text delta) + - `thinking_*` -> `item.delta` with `ContentPart::Reasoning` (visibility=Private) + - `toolcall_*` and `toolcall_args_*` -> ignore for now (tool_execution_* is authoritative) + - `error` -> `item.completed` with `ItemStatus::Failed` (if no later message_end) +- `message_end` -> `item.completed` (finalize assistant message) +- `tool_execution_start` -> `item.started` (kind=tool_call, ContentPart::ToolCall) +- `tool_execution_update` -> `item.delta` for a synthetic tool_result item: + - Maintain a per-toolCallId buffer to compute delta from accumulated `partialResult`. +- `tool_execution_end` -> `item.completed` (kind=tool_result, output from `result.content`) + - If `isError=true`, set item status to failed. +- `agent_start`, `turn_start`, `turn_end`, `agent_end`, `auto_compaction`, `auto_retry`, `hook_error`: + - Map to `ItemKind::Status` with a label like `pi.agent_start`, `pi.auto_retry`, etc. + - Do not emit `session.ended` for these events. +- If event parsing fails, emit `agent.unparsed` (source=daemon, synthetic=true) and fail tests. + +### 5) Shared RPC Server Integration +Files: +- `server/packages/sandbox-agent/src/router.rs` + +Tasks: +- Add a new managed stdio server type for Pi, similar to Codex: + - Create `PiServer` struct with: + - stdin sender + - pending request map keyed by request id + - per-session native session id mapping + - Extend `ManagedServerKind` to include Pi. + - Add `ensure_pi_server()` and `spawn_pi_server()` using `pi --mode rpc`. 
+ - Add a `handle_pi_server_output()` loop to parse stdout lines into events/responses. +- Session creation: + - On `create_session`, ensure Pi server is running, send `new_session`, store sessionId. + - Register session with `server_manager.register_session` for native mapping. +- Sending messages: + - Use `prompt` command; include sessionId and optional images. + - Emit synthetic `item.started` only if Pi does not emit `message_start`. + +### 6) Router + Streaming Path Changes +Files: +- `server/packages/sandbox-agent/src/router.rs` + +Tasks: +- Add Pi handling to: + - `create_session` (new_session) + - `send_message` (prompt) + - `parse_agent_line` (Pi event conversion) + - `agent_modes` (default to `default` unless Pi exposes a mode list) + - `agent_supports_resume` (true if Pi supports session resume) + +### 7) Tests +Files: +- `server/packages/sandbox-agent/tests/...` +- `server/packages/universal-agent-schema/tests/...` (if present) + +Tasks: +- Unit tests for conversion: + - `message_start/update/end` -> item.started/delta/completed + - `tool_execution_*` -> tool call/result mapping with partialResult delta + - failure -> agent.unparsed +- Integration tests: + - Start Pi RPC server, create session, send prompt, stream events. + - Validate `native_session_id` mapping and event ordering. +- Update HTTP/SSE test coverage to include Pi agent if relevant. + +## Risk Areas / Edge Cases +- `tool_execution_update.partialResult` is cumulative; must compute deltas. +- `message_update` may emit `done`/`error` without `message_end`; handle both paths. +- No native session lifecycle events; rely on daemon synthetic events. +- Session recovery after RPC server restart requires `get_state` + re-register sessions. + +## Acceptance Criteria +- Pi appears in `/v1/agents`, CLI list, and docs. +- `create_session` returns `native_session_id` from Pi `new_session`. 
+- Streaming prompt yields universal events with proper ordering: + - message -> item.started/delta/completed + - tool execution -> tool call + tool result +- Tests pass and no synthetic data is used in test fixtures. + +## Sources +- https://upd.dev/badlogic/pi-mono/src/commit/d36e0ea07303d8a76d51b4a7bd5f0d6d3c490860/packages/coding-agent/docs/rpc.md +- https://buildwithpi.ai/pi-cli +- https://takopi.dev/docs/pi-cli/ +- https://upd.dev/badlogic/pi-mono/releases diff --git a/docs/quickstart.mdx b/docs/quickstart.mdx index 223a54d..a6293fe 100644 --- a/docs/quickstart.mdx +++ b/docs/quickstart.mdx @@ -64,7 +64,7 @@ icon: "rocket" docker run -p 2468:2468 \ -e ANTHROPIC_API_KEY="sk-ant-..." \ -e OPENAI_API_KEY="sk-..." \ - rivetdev/sandbox-agent:0.4.2-full \ + rivetdev/sandbox-agent:0.3.1-full \ server --no-token --host 0.0.0.0 --port 2468 ``` @@ -77,9 +77,6 @@ icon: "rocket" Use the `mock` agent for SDK and integration testing without provider credentials. - - For per-tenant token tracking, budget enforcement, or usage-based billing, see [LLM Credentials](/llm-credentials) for gateway options like OpenRouter, LiteLLM, and Portkey. - @@ -89,7 +86,7 @@ icon: "rocket" Install and run the binary directly. ```bash - curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh + curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh sandbox-agent server --no-token --host 0.0.0.0 --port 2468 ``` @@ -98,7 +95,7 @@ icon: "rocket" Run without installing globally. ```bash - npx @sandbox-agent/cli@0.4.x server --no-token --host 0.0.0.0 --port 2468 + npx @sandbox-agent/cli@0.3.x server --no-token --host 0.0.0.0 --port 2468 ``` @@ -106,7 +103,7 @@ icon: "rocket" Run without installing globally. ```bash - bunx @sandbox-agent/cli@0.4.x server --no-token --host 0.0.0.0 --port 2468 + bunx @sandbox-agent/cli@0.3.x server --no-token --host 0.0.0.0 --port 2468 ``` @@ -114,7 +111,7 @@ icon: "rocket" Install globally, then run. 
```bash - npm install -g @sandbox-agent/cli@0.4.x + npm install -g @sandbox-agent/cli@0.3.x sandbox-agent server --no-token --host 0.0.0.0 --port 2468 ``` @@ -123,7 +120,7 @@ icon: "rocket" Install globally, then run. ```bash - bun add -g @sandbox-agent/cli@0.4.x + bun add -g @sandbox-agent/cli@0.3.x # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). bun pm -g trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 sandbox-agent server --no-token --host 0.0.0.0 --port 2468 @@ -134,7 +131,7 @@ icon: "rocket" For local development, use `SandboxAgent.start()` to spawn and manage the server as a subprocess. ```bash - npm install sandbox-agent@0.4.x + npm install sandbox-agent@0.3.x ``` ```typescript @@ -148,7 +145,7 @@ icon: "rocket" For local development, use `SandboxAgent.start()` to spawn and manage the server as a subprocess. ```bash - bun add sandbox-agent@0.4.x + bun add sandbox-agent@0.3.x # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -226,16 +223,6 @@ icon: "rocket" If agents are not installed up front, they are lazily installed when creating a session. - - If you want to use `/v1/desktop/*`, install the desktop runtime packages first: - - ```bash - sandbox-agent install desktop --yes - ``` - - Then use `GET /v1/desktop/status` or `sdk.getDesktopStatus()` to verify the runtime is ready before calling desktop screenshot or input APIs. 
- - ```typescript import { SandboxAgent } from "sandbox-agent"; diff --git a/docs/react-components.mdx b/docs/react-components.mdx index 71a76d2..0fa41b0 100644 --- a/docs/react-components.mdx +++ b/docs/react-components.mdx @@ -12,12 +12,11 @@ Current exports: - `ProcessTerminal` for attaching to a running tty process - `AgentTranscript` for rendering session/message timelines without bundling any styles - `ChatComposer` for a reusable prompt input/send surface -- `useTranscriptVirtualizer` for wiring large transcript lists to a scroll container ## Install ```bash -npm install @sandbox-agent/react@0.4.x +npm install @sandbox-agent/react@0.3.x ``` ## Full example @@ -185,20 +184,11 @@ Useful props: - `className`: root class hook - `classNames`: slot-level class hooks for styling from outside the package -- `scrollRef` + `virtualize`: opt into TanStack Virtual against an external scroll container - `renderMessageText`: custom text or markdown renderer - `renderToolItemIcon`, `renderToolGroupIcon`, `renderChevron`, `renderEventLinkContent`: presentation overrides - `renderInlinePendingIndicator`, `renderThinkingState`: loading/thinking UI overrides - `isDividerEntry`, `canOpenEvent`, `getToolGroupSummary`: behavior overrides for grouping and labels -## Transcript virtualization hook - -`useTranscriptVirtualizer` exposes the same TanStack Virtual behavior used by `AgentTranscript` when `virtualize` is enabled. - -- Pass the grouped transcript rows you want to virtualize -- Pass a `scrollRef` that points at the actual scrollable element -- Use it when you need transcript-aware virtualization outside the stock `AgentTranscript` renderer - ## Composer and conversation `ChatComposer` is the headless message input. `AgentConversation` composes `AgentTranscript` and `ChatComposer` so apps can reuse the transcript/composer pairing without pulling in Inspector session chrome. 
diff --git a/docs/sdk-overview.mdx b/docs/sdk-overview.mdx index 73e0d35..228060b 100644 --- a/docs/sdk-overview.mdx +++ b/docs/sdk-overview.mdx @@ -11,22 +11,28 @@ The TypeScript SDK is centered on `sandbox-agent` and its `SandboxAgent` class. ```bash - npm install sandbox-agent@0.4.x + npm install sandbox-agent@0.3.x ``` ```bash - bun add sandbox-agent@0.4.x + bun add sandbox-agent@0.3.x # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` +## Optional persistence drivers + +```bash +npm install @sandbox-agent/persist-indexeddb@0.3.x @sandbox-agent/persist-sqlite@0.3.x @sandbox-agent/persist-postgres@0.3.x +``` + ## Optional React components ```bash -npm install @sandbox-agent/react@0.4.x +npm install @sandbox-agent/react@0.3.x ``` ## Create a client @@ -62,12 +68,15 @@ const sdk = await SandboxAgent.connect({ controller.abort(); ``` -With persistence (see [Persisting Sessions](/session-persistence) for driver options): +With persistence: ```ts -import { SandboxAgent, InMemorySessionPersistDriver } from "sandbox-agent"; +import { SandboxAgent } from "sandbox-agent"; +import { SQLiteSessionPersistDriver } from "@sandbox-agent/persist-sqlite"; -const persist = new InMemorySessionPersistDriver(); +const persist = new SQLiteSessionPersistDriver({ + filename: "./sessions.db", +}); const sdk = await SandboxAgent.connect({ baseUrl: "http://127.0.0.1:2468", @@ -75,40 +84,25 @@ const sdk = await SandboxAgent.connect({ }); ``` -Local spawn with a sandbox provider: +Local autospawn (Node.js only): ```ts import { SandboxAgent } from "sandbox-agent"; -import { local } from "sandbox-agent/local"; -const sdk = await SandboxAgent.start({ - sandbox: local(), -}); +const localSdk = await SandboxAgent.start(); -// sdk.sandboxId — prefixed provider ID (e.g. 
"local/127.0.0.1:2468") - -await sdk.destroySandbox(); // provider-defined cleanup + disposes client +await localSdk.dispose(); ``` -`SandboxAgent.start(...)` requires a `sandbox` provider. Built-in providers: - -| Import | Provider | -|--------|----------| -| `sandbox-agent/local` | Local subprocess | -| `sandbox-agent/docker` | Docker container | -| `sandbox-agent/e2b` | E2B sandbox | -| `sandbox-agent/daytona` | Daytona workspace | -| `sandbox-agent/vercel` | Vercel Sandbox | -| `sandbox-agent/cloudflare` | Cloudflare Sandbox | - -Use `sdk.dispose()` to disconnect without changing sandbox state, `sdk.pauseSandbox()` for graceful suspension when supported, or `sdk.killSandbox()` for permanent deletion. - ## Session flow ```ts const session = await sdk.createSession({ agent: "mock", - cwd: "/", + sessionInit: { + cwd: "/", + mcpServers: [], + }, }); const prompt = await session.prompt([ @@ -196,44 +190,6 @@ const writeResult = await sdk.writeFsFile({ path: "./hello.txt" }, "hello"); console.log(health.status, agents.agents.length, entries.length, writeResult.path); ``` -## Desktop API - -The SDK also wraps the desktop host/runtime HTTP API. 
- -Install desktop dependencies first on Linux hosts: - -```bash -sandbox-agent install desktop --yes -``` - -Then query status, surface remediation if needed, and start the runtime: - -```ts -const status = await sdk.getDesktopStatus(); - -if (status.state === "install_required") { - console.log(status.installCommand); -} - -const started = await sdk.startDesktop({ - width: 1440, - height: 900, - dpi: 96, -}); - -const screenshot = await sdk.takeDesktopScreenshot(); -const displayInfo = await sdk.getDesktopDisplayInfo(); - -await sdk.moveDesktopMouse({ x: 400, y: 300 }); -await sdk.clickDesktop({ x: 400, y: 300, button: "left", clickCount: 1 }); -await sdk.typeDesktopText({ text: "hello world", delayMs: 10 }); -await sdk.pressDesktopKey({ key: "ctrl+l" }); - -await sdk.stopDesktop(); -``` - -Screenshot helpers return `Uint8Array` PNG bytes. The SDK does not attempt to install OS packages remotely; callers should surface `missingDependencies` and `installCommand` from `getDesktopStatus()`. - ## Error handling ```ts @@ -267,10 +223,5 @@ Parameters: - `token` (optional): Bearer token for authenticated servers - `headers` (optional): Additional request headers - `fetch` (optional): Custom fetch implementation used by SDK HTTP and session calls -- `skipHealthCheck` (optional): set `true` to skip the startup `/v1/health` wait - `waitForHealth` (optional, defaults to enabled): waits for `/v1/health` before HTTP helpers and session setup proceed; pass `false` to disable or `{ timeoutMs }` to bound the wait - `signal` (optional): aborts the startup `/v1/health` wait used by `connect()` - -## LLM credentials - -Sandbox Agent supports personal API keys, shared organization keys, and per-tenant gateway keys with budget enforcement. See [LLM Credentials](/llm-credentials) for setup details. 
diff --git a/docs/security.mdx b/docs/security.mdx index c8b02ad..ec00f49 100644 --- a/docs/security.mdx +++ b/docs/security.mdx @@ -4,7 +4,7 @@ description: "Backend-first auth and access control patterns." icon: "shield" --- -As covered in [Orchestration Architecture](/orchestration-architecture), run the Sandbox Agent client on your backend, not in the browser. +As covered in [Architecture](/architecture), run the Sandbox Agent client on your backend, not in the browser. This keeps sandbox credentials private and gives you one place for authz, rate limiting, and audit logging. @@ -92,7 +92,7 @@ export const workspace = actor({ const session = await sdk.createSession({ agent: "claude", - cwd: "/workspace", + sessionInit: { cwd: "/workspace" }, }); session.onEvent((event) => { diff --git a/docs/session-persistence.mdx b/docs/session-persistence.mdx index 5505864..eaa4de0 100644 --- a/docs/session-persistence.mdx +++ b/docs/session-persistence.mdx @@ -10,22 +10,14 @@ With persistence enabled, sessions can be restored after runtime/session loss. S Each driver stores: -- `SessionRecord` (`id`, `agent`, `agentSessionId`, `lastConnectionId`, `createdAt`, optional `destroyedAt`, optional `sandboxId`, optional `sessionInit`, optional `configOptions`, optional `modes`) +- `SessionRecord` (`id`, `agent`, `agentSessionId`, `lastConnectionId`, `createdAt`, optional `destroyedAt`, optional `sessionInit`) - `SessionEvent` (`id`, `eventIndex`, `sessionId`, `connectionId`, `sender`, `payload`, `createdAt`) ## Persistence drivers -### Rivet +### In-memory -Recommended for sandbox orchestration with actor state. See [Multiplayer](/multiplayer) for a full Rivet actor example with persistence in actor state. - -### IndexedDB (browser) - -Best for browser apps that should survive reloads. See the [Inspector source](https://github.com/rivet-dev/sandbox-agent/tree/main/frontend/packages/inspector/src/persist-indexeddb.ts) for a complete IndexedDB driver you can copy into your project. 
- -### In-memory (built-in) - -Best for local dev and ephemeral workloads. No extra dependencies required. +Best for local dev and ephemeral workloads. ```ts import { InMemorySessionPersistDriver, SandboxAgent } from "sandbox-agent"; @@ -41,17 +33,91 @@ const sdk = await SandboxAgent.connect({ }); ``` +### Rivet + +Recommended for sandbox orchestration with actor state. + +```bash +npm install @sandbox-agent/persist-rivet@0.3.x +``` + +```ts +import { actor } from "rivetkit"; +import { SandboxAgent } from "sandbox-agent"; +import { RivetSessionPersistDriver, type RivetPersistState } from "@sandbox-agent/persist-rivet"; + +type PersistedState = RivetPersistState & { + sandboxId: string; + baseUrl: string; +}; + +export default actor({ + createState: async () => { + return { + sandboxId: "sbx_123", + baseUrl: "http://127.0.0.1:2468", + } satisfies Partial; + }, + createVars: async (c) => { + const persist = new RivetSessionPersistDriver(c); + const sdk = await SandboxAgent.connect({ + baseUrl: c.state.baseUrl, + persist, + }); + + const session = await sdk.resumeOrCreateSession({ id: "default", agent: "codex" }); + + const unsubscribe = session.onEvent((event) => { + c.broadcast("session.event", event); + }); + + return { sdk, session, unsubscribe }; + }, + actions: { + sendMessage: async (c, message: string) => { + await c.vars.session.prompt([{ type: "text", text: message }]); + }, + }, + onSleep: async (c) => { + c.vars.unsubscribe?.(); + await c.vars.sdk.dispose(); + }, +}); +``` + +### IndexedDB + +Best for browser apps that should survive reloads. 
+ +```bash +npm install @sandbox-agent/persist-indexeddb@0.3.x +``` + +```ts +import { SandboxAgent } from "sandbox-agent"; +import { IndexedDbSessionPersistDriver } from "@sandbox-agent/persist-indexeddb"; + +const persist = new IndexedDbSessionPersistDriver({ + databaseName: "sandbox-agent-session-store", +}); + +const sdk = await SandboxAgent.connect({ + baseUrl: "http://127.0.0.1:2468", + persist, +}); +``` + ### SQLite Best for local/server Node apps that need durable storage without a DB server. ```bash -npm install better-sqlite3 +npm install @sandbox-agent/persist-sqlite@0.3.x ``` ```ts import { SandboxAgent } from "sandbox-agent"; -import { SQLiteSessionPersistDriver } from "./persist.ts"; +import { SQLiteSessionPersistDriver } from "@sandbox-agent/persist-sqlite"; const persist = new SQLiteSessionPersistDriver({ filename: "./sandbox-agent.db", @@ -63,19 +129,17 @@ const sdk = await SandboxAgent.connect({ }); ``` -See the [full SQLite example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-sqlite) for the complete driver implementation you can copy into your project. - ### Postgres Use when you already run Postgres and want shared relational storage. ```bash -npm install pg +npm install @sandbox-agent/persist-postgres@0.3.x ``` ```ts import { SandboxAgent } from "sandbox-agent"; -import { PostgresSessionPersistDriver } from "./persist.ts"; +import { PostgresSessionPersistDriver } from "@sandbox-agent/persist-postgres"; const persist = new PostgresSessionPersistDriver({ connectionString: process.env.DATABASE_URL, @@ -88,8 +152,6 @@ const sdk = await SandboxAgent.connect({ }); ``` -See the [full Postgres example](https://github.com/rivet-dev/sandbox-agent/tree/main/examples/persist-postgres) for the complete driver implementation you can copy into your project. - ### Custom driver Implement `SessionPersistDriver` for custom backends. @@ -98,11 +160,11 @@ Implement `SessionPersistDriver` for custom backends. 
 import type { SessionPersistDriver } from "sandbox-agent";
 
 class MyDriver implements SessionPersistDriver {
-  async getSession(id) { return undefined; }
+  async getSession(id) { return null; }
   async listSessions(request) { return { items: [] }; }
   async updateSession(session) {}
   async listEvents(request) { return { items: [] }; }
-  async insertEvent(sessionId, event) {}
+  async insertEvent(event) {}
 }
 ```

diff --git a/docs/session-transcript-schema.mdx b/docs/session-transcript-schema.mdx
new file mode 100644
index 0000000..c9c004a
--- /dev/null
+++ b/docs/session-transcript-schema.mdx
@@ -0,0 +1,388 @@
+---
+title: "Session Transcript Schema"
+description: "Universal event schema for session transcripts across all agents."
+---
+
+Each coding agent outputs events in its own native format. The sandbox-agent converts these into a universal event schema, giving you a consistent session transcript regardless of which agent you use.
+
+The schema is defined in [OpenAPI format](https://github.com/rivet-dev/sandbox-agent/blob/main/docs/openapi.json). See the [HTTP API Reference](/api-reference) for endpoint documentation.
+
+## Coverage Matrix
+
+This table shows which agent features appear in the universal event stream. All agents retain their full native capabilities; the matrix only reflects what is normalized into the schema.
+
+| Feature            | Claude | Codex  | OpenCode     | Amp          | Pi (RPC)     |
+|--------------------|:------:|:------:|:------------:|:------------:|:------------:|
+| Stability          | Stable | Stable | Experimental | Experimental | Experimental |
+| Text Messages      | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Tool Calls         | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Tool Results       | ✓ | ✓ | ✓ | ✓ | ✓ |
+| Questions (HITL)   | ✓ |   | ✓ |   |   |
+| Permissions (HITL) | ✓ | ✓ | ✓ | - |   |
+| Images             | - | ✓ | ✓ | - | ✓ |
+| File Attachments   | - | ✓ | ✓ | - |   |
+| Session Lifecycle  | - | ✓ | ✓ | - |   |
+| Error Events       | - | ✓ | ✓ | ✓ | ✓ |
+| Reasoning/Thinking | - | ✓ | - | - | ✓ |
+| Command Execution  | - | ✓ | - | - |   |
+| File Changes       | - | ✓ | - | - |   |
+| MCP Tools          | ✓ | ✓ | ✓ | ✓ |   |
+| Streaming Deltas   | ✓ | ✓ | ✓ | - | ✓ |
+| Variants           |   | ✓ | ✓ | ✓ | ✓ |
+
+Agents: [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview) · [Codex](https://github.com/openai/codex) · [OpenCode](https://github.com/opencode-ai/opencode) · [Amp](https://ampcode.com) · [Pi](https://buildwithpi.ai/pi-cli)
+
+- ✓ = Appears in session events
+- \- = Agent supports natively, schema conversion coming soon
+- (blank) = Not supported by agent
+- Pi runtime model is router-managed per-session RPC (`pi --mode rpc`); it does not use generic subprocess streaming.
+
+Feature definitions:
+
+- **Text Messages**: Basic message exchange between user and assistant.
+- **Tool Calls & Tool Results**: Visibility into tool invocations (file reads, command execution, etc.) and their results. When not natively supported, tool activity is embedded in message content.
+- **Questions (HITL)**: Interactive questions the agent asks the user. Emits `question.requested` and `question.resolved` events.
+- **Permissions (HITL)**: Permission requests for sensitive operations. Emits `permission.requested` and `permission.resolved` events.
+- **Images**: Support for image attachments in messages.
+- **File Attachments**: Support for file attachments in messages.
+- **Session Lifecycle**: Native `session.started` and `session.ended` events. When not supported, the daemon emits synthetic lifecycle events.
+- **Error Events**: Structured error events for runtime failures.
+- **Reasoning/Thinking**: Extended thinking or reasoning content with visibility controls.
+- **Command Execution**: Detailed command execution events with stdout/stderr.
+- **File Changes**: Structured file modification events with diffs.
+- **MCP Tools**: Model Context Protocol tool support.
+- **Streaming Deltas**: Native streaming of content deltas. When not supported, the daemon emits a single synthetic delta before `item.completed`.
+- **Variants**: Model variants such as reasoning effort or depth. Agents may expose different variant sets per model.
+
+Want support for another agent? [Open an issue](https://github.com/rivet-dev/sandbox-agent/issues/new) to request it.
+
+## UniversalEvent
+
+Every event from the API is wrapped in a `UniversalEvent` envelope.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `event_id` | string | Unique identifier for this event |
+| `sequence` | integer | Monotonic sequence number within the session (starts at 1) |
+| `time` | string | RFC3339 timestamp |
+| `session_id` | string | Daemon-generated session identifier |
+| `native_session_id` | string? | Provider-native session/thread identifier (e.g., Codex `threadId`, OpenCode `sessionID`) |
+| `source` | string | Event origin: `agent` (native) or `daemon` (synthetic) |
+| `synthetic` | boolean | Whether this event was generated by the daemon to fill gaps |
+| `type` | string | Event type (see [Event Types](#event-types)) |
+| `data` | object | Event-specific payload |
+| `raw` | any? | Original provider payload (only when `include_raw=true`) |
+
+```json
+{
+  "event_id": "evt_abc123",
+  "sequence": 1,
+  "time": "2025-01-28T12:00:00Z",
+  "session_id": "my-session",
+  "native_session_id": "thread_xyz",
+  "source": "agent",
+  "synthetic": false,
+  "type": "item.completed",
+  "data": { ...
} +} +``` + +## Event Types + +### Session Lifecycle + +| Type | Description | Data | +|------|-------------|------| +| `session.started` | Session has started | `{ metadata?: any }` | +| `session.ended` | Session has ended | `{ reason, terminated_by, message?, exit_code? }` | + +### Turn Lifecycle + +| Type | Description | Data | +|------|-------------|------| +| `turn.started` | Turn has started | `{ phase: "started", turn_id?, metadata? }` | +| `turn.ended` | Turn has ended | `{ phase: "ended", turn_id?, metadata? }` | + +**SessionEndedData** + +| Field | Type | Values | +|-------|------|--------| +| `reason` | string | `completed`, `error`, `terminated` | +| `terminated_by` | string | `agent`, `daemon` | +| `message` | string? | Error message (only present when reason is `error`) | +| `exit_code` | int? | Process exit code (only present when reason is `error`) | +| `stderr` | StderrOutput? | Structured stderr output (only present when reason is `error`) | + +**StderrOutput** + +| Field | Type | Description | +|-------|------|-------------| +| `head` | string? | First 20 lines of stderr (if truncated) or full stderr (if not truncated) | +| `tail` | string? | Last 50 lines of stderr (only present if truncated) | +| `truncated` | boolean | Whether the output was truncated | +| `total_lines` | int? | Total number of lines in stderr | + +### Item Lifecycle + +| Type | Description | Data | +|------|-------------|------| +| `item.started` | Item creation | `{ item }` | +| `item.delta` | Streaming content delta | `{ item_id, native_item_id?, delta }` | +| `item.completed` | Item finalized | `{ item }` | + +Items follow a consistent lifecycle: `item.started` → `item.delta` (0 or more) → `item.completed`. + +### HITL (Human-in-the-Loop) + +| Type | Description | Data | +|------|-------------|------| +| `permission.requested` | Permission request pending | `{ permission_id, action, status, metadata? 
}` |
+| `permission.resolved` | Permission decision recorded | `{ permission_id, action, status, metadata? }` |
+| `question.requested` | Question pending user input | `{ question_id, prompt, options, status }` |
+| `question.resolved` | Question answered or rejected | `{ question_id, prompt, options, status, response? }` |
+
+**PermissionEventData**
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `permission_id` | string | Identifier for the permission request |
+| `action` | string | What the agent wants to do |
+| `status` | string | `requested`, `accept`, `accept_for_session`, `reject` |
+| `metadata` | any? | Additional context |
+
+**QuestionEventData**
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `question_id` | string | Identifier for the question |
+| `prompt` | string | Question text |
+| `options` | string[] | Available answer options |
+| `status` | string | `requested`, `answered`, `rejected` |
+| `response` | string? | Selected answer (when resolved) |
+
+### Errors
+
+| Type | Description | Data |
+|------|-------------|------|
+| `error` | Runtime error | `{ message, code?, details? }` |
+| `agent.unparsed` | Parse failure | `{ error, location, raw_hash? }` |
+
+The `agent.unparsed` event indicates the daemon failed to parse an agent payload. This should be treated as a bug.
+
+## UniversalItem
+
+Items represent discrete units of content within a session.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `item_id` | string | Daemon-generated identifier |
+| `native_item_id` | string? | Provider-native item/message identifier |
+| `parent_id` | string? | Parent item ID (e.g., tool call/result parented to a message) |
+| `kind` | string | Item category (see below) |
+| `role` | string? | Actor role for message items |
+| `status` | string | Lifecycle status |
+| `content` | ContentPart[] | Ordered list of content parts |
+
+### ItemKind
+
+| Value | Description |
+|-------|-------------|
+| `message` | User or assistant message |
+| `tool_call` | Tool invocation |
+| `tool_result` | Tool execution result |
+| `system` | System message |
+| `status` | Status update |
+| `unknown` | Unrecognized item type |
+
+### ItemRole
+
+| Value | Description |
+|-------|-------------|
+| `user` | User message |
+| `assistant` | Assistant response |
+| `system` | System prompt |
+| `tool` | Tool-related message |
+
+### ItemStatus
+
+| Value | Description |
+|-------|-------------|
+| `in_progress` | Item is streaming or pending |
+| `completed` | Item is finalized |
+| `failed` | Item execution failed |
+
+## Content Parts
+
+The `content` array contains typed parts that make up an item's payload.
+
+### text
+
+Plain text content.
+
+```json
+{ "type": "text", "text": "Hello, world!" }
+```
+
+### json
+
+Structured JSON content.
+
+```json
+{ "type": "json", "json": { "key": "value" } }
+```
+
+### tool_call
+
+Tool invocation.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `name` | string | Tool name |
+| `arguments` | string | JSON-encoded arguments |
+| `call_id` | string | Unique call identifier |
+
+```json
+{
+  "type": "tool_call",
+  "name": "read_file",
+  "arguments": "{\"path\": \"/src/main.ts\"}",
+  "call_id": "call_abc123"
+}
+```
+
+### tool_result
+
+Tool execution result.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `call_id` | string | Matching call identifier |
+| `output` | string | Tool output |
+
+```json
+{
+  "type": "tool_result",
+  "call_id": "call_abc123",
+  "output": "File contents here..."
+}
+```
+
+### file_ref
+
+File reference with optional diff.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `path` | string | File path |
+| `action` | string | `read`, `write`, `patch` |
+| `diff` | string? | Unified diff (for patches) |
+
+```json
+{
+  "type": "file_ref",
+  "path": "/src/main.ts",
+  "action": "write",
+  "diff": "@@ -1,3 +1,4 @@\n+import { foo } from 'bar';"
+}
+```
+
+### image
+
+Image reference.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `path` | string | Image file path |
+| `mime` | string? | MIME type |
+
+```json
+{ "type": "image", "path": "/tmp/screenshot.png", "mime": "image/png" }
+```
+
+### reasoning
+
+Model reasoning/thinking content.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `text` | string | Reasoning text |
+| `visibility` | string | `public` or `private` |
+
+```json
+{ "type": "reasoning", "text": "Let me think about this...", "visibility": "public" }
+```
+
+### status
+
+Status indicator.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `label` | string | Status label |
+| `detail` | string? | Additional detail |
+
+```json
+{ "type": "status", "label": "Running tests", "detail": "3 of 10 passed" }
+```
+
+## Source & Synthetics
+
+### EventSource
+
+The `source` field indicates who emitted the event:
+
+| Value | Description |
+|-------|-------------|
+| `agent` | Native event from the agent |
+| `daemon` | Synthetic event generated by the daemon |
+
+### Synthetic Events
+
+The daemon emits synthetic events (`synthetic: true`, `source: "daemon"`) to provide a consistent event stream across all agents. Common synthetics:
+
+| Synthetic | When |
+|-----------|------|
+| `session.started` | Agent doesn't emit explicit session start |
+| `session.ended` | Agent doesn't emit explicit session end |
+| `turn.started` | Agent doesn't emit explicit turn start |
+| `turn.ended` | Agent doesn't emit explicit turn end |
+| `item.started` | Agent doesn't emit item start events |
+| `item.delta` | Agent doesn't stream deltas natively |
+| `question.*` | Claude Code plan mode (from ExitPlanMode tool) |
+
+### Raw Payloads
+
+Pass `include_raw=true` to event endpoints to receive the original agent payload in the `raw` field. Useful for debugging or accessing agent-specific data not in the universal schema.
+
+```typescript
+const events = await client.getEvents("my-session", { includeRaw: true });
+// events[0].raw contains the original agent payload
+```
diff --git a/docs/skills-config.mdx b/docs/skills-config.mdx
index c3145c2..c85bc2c 100644
--- a/docs/skills-config.mdx
+++ b/docs/skills-config.mdx
@@ -35,7 +35,9 @@ await sdk.setSkillsConfig(
 // Create a session using the configured skills
 const session = await sdk.createSession({
   agent: "claude",
-  cwd: "/workspace",
+  sessionInit: {
+    cwd: "/workspace",
+  },
 });
 
 await session.prompt([
diff --git a/docs/theme.css b/docs/theme.css
index 4286d2c..daeb719 100644
--- a/docs/theme.css
+++ b/docs/theme.css
@@ -20,6 +20,7 @@ body {
   color: var(--sa-text);
 }
 
+/*
 a {
   color: var(--sa-primary);
 }
@@ -40,13 +41,6 @@ select {
   color: var(--sa-text);
 }
 
-code,
-pre {
-  background-color: var(--sa-card);
-  border: 1px solid var(--sa-border);
-  color: var(--sa-text);
-}
-
 .card,
 .mintlify-card,
 .docs-card {
@@ -70,3 +64,4 @@ pre {
 .alert-danger {
   border-color: var(--sa-danger);
 }
+*/
diff --git a/docs/troubleshooting.mdx b/docs/troubleshooting.mdx
index 18186d6..838cc28 100644
--- a/docs/troubleshooting.mdx
+++ b/docs/troubleshooting.mdx
@@ -29,6 +29,25 @@ Verify the agent is installed:
 ls -la ~/.local/share/sandbox-agent/bin/
 ```
 
+### 4. Binary libc mismatch (musl vs glibc)
+
+Claude Code binaries are available in both musl and glibc variants. If you see errors like:
+
+```
+cannot execute: required file not found
+Error loading shared library libstdc++.so.6: No such file or directory
+```
+
+This means the wrong binary variant was downloaded.
+
+**For sandbox-agent 0.2.0+**: Platform detection is automatic. The correct binary (musl or glibc) is downloaded based on the runtime environment.
+
+**For sandbox-agent 0.1.x**: Use Alpine Linux, which has native musl support:
+
+```dockerfile
+FROM alpine:latest
+RUN apk add --no-cache curl ca-certificates libstdc++ libgcc bash
+```
 
 ## Daytona Network Restrictions
 
diff --git a/examples/boxlite-python/Dockerfile b/examples/boxlite-python/Dockerfile
index 8aba774..3630511 100644
--- a/examples/boxlite-python/Dockerfile
+++ b/examples/boxlite-python/Dockerfile
@@ -1,5 +1,5 @@
 FROM node:22-bookworm-slim
 RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/*
-RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh
+RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh
 RUN sandbox-agent install-agent claude
 RUN sandbox-agent install-agent codex
diff --git a/examples/boxlite/Dockerfile b/examples/boxlite/Dockerfile
index 8aba774..3630511 100644
--- a/examples/boxlite/Dockerfile
+++ b/examples/boxlite/Dockerfile
@@ -1,5 +1,5 @@
 FROM node:22-bookworm-slim
 RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/*
-RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh
+RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh
 RUN sandbox-agent install-agent claude
 RUN sandbox-agent install-agent codex
diff --git a/examples/boxlite/src/index.ts b/examples/boxlite/src/index.ts
index 171166b..bdcd53a 100644
--- a/examples/boxlite/src/index.ts
+++ b/examples/boxlite/src/index.ts
@@ -25,7 +25,7 @@ const baseUrl
= "http://localhost:3000"; console.log("Connecting to server..."); const client = await SandboxAgent.connect({ baseUrl }); -const session = await client.createSession({ agent: detectAgent(), cwd: "/root" }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } }); const sessionId = session.id; console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); diff --git a/examples/boxlite/tsconfig.json b/examples/boxlite/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/boxlite/tsconfig.json +++ b/examples/boxlite/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/cloudflare/Dockerfile b/examples/cloudflare/Dockerfile index 738f8a2..d0796cb 100644 --- a/examples/cloudflare/Dockerfile +++ b/examples/cloudflare/Dockerfile @@ -1,7 +1,7 @@ FROM cloudflare/sandbox:0.7.0 # Install sandbox-agent -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh # Pre-install agents RUN sandbox-agent install-agent claude && \ diff --git a/examples/cloudflare/tests/cloudflare.test.ts b/examples/cloudflare/tests/cloudflare.test.ts deleted file mode 100644 index d00c2ce..0000000 --- a/examples/cloudflare/tests/cloudflare.test.ts +++ /dev/null @@ -1,154 +0,0 @@ -import { describe, it, expect } from "vitest"; -import { spawn, type ChildProcess } from "node:child_process"; -import { resolve, dirname } from "node:path"; -import { fileURLToPath } from "node:url"; -import { execSync } from "node:child_process"; - -const __dirname = dirname(fileURLToPath(import.meta.url)); -const PROJECT_DIR = resolve(__dirname, ".."); - -/** - * Cloudflare Workers integration test. - * - * Set RUN_CLOUDFLARE_EXAMPLES=1 to enable. 
Requires wrangler and Docker.
- *
- * This starts `wrangler dev` which:
- * 1. Builds the Dockerfile (cloudflare/sandbox base + sandbox-agent)
- * 2. Starts a local Workers runtime with Durable Objects and containers
- * 3. Exposes the app on a local port
- *
- * We then test through the proxy endpoint which forwards to sandbox-agent
- * running inside the container.
- */
-const shouldRun = process.env.RUN_CLOUDFLARE_EXAMPLES === "1";
-const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 600_000;
-
-const testFn = shouldRun ? it : it.skip;
-
-interface WranglerDev {
-  baseUrl: string;
-  cleanup: () => void;
-}
-
-async function startWranglerDev(): Promise<WranglerDev> {
-  // Build frontend assets first (wrangler expects dist/ to exist)
-  execSync("npx vite build", { cwd: PROJECT_DIR, stdio: "pipe" });
-
-  return new Promise((resolve, reject) => {
-    const child: ChildProcess = spawn("npx", ["wrangler", "dev", "--port", "0"], {
-      cwd: PROJECT_DIR,
-      stdio: ["ignore", "pipe", "pipe"],
-      detached: true,
-      env: {
-        ...process.env,
-        // Ensure wrangler picks up API keys to pass to the container
-        NODE_ENV: "development",
-      },
-    });
-
-    let stdout = "";
-    let stderr = "";
-    let resolved = false;
-
-    const cleanup = () => {
-      if (child.pid) {
-        // Kill process group to ensure wrangler and its children are cleaned up
-        try {
-          process.kill(-child.pid, "SIGTERM");
-        } catch {
-          try {
-            child.kill("SIGTERM");
-          } catch {}
-        }
-      }
-    };
-
-    const timer = setTimeout(() => {
-      if (!resolved) {
-        resolved = true;
-        cleanup();
-        reject(new Error(`wrangler dev did not start within 120s.\nstdout: ${stdout}\nstderr: ${stderr}`));
-      }
-    }, 120_000);
-
-    const onData = (chunk: Buffer) => {
-      const text = chunk.toString();
-      stdout += text;
-
-      // wrangler dev prints "Ready on http://localhost:XXXX" when ready
-      const match = stdout.match(/Ready on (https?:\/\/[^\s]+)/i) ??
stdout.match(/(https?:\/\/(?:localhost|127\.0\.0\.1):\d+)/); - if (match && !resolved) { - resolved = true; - clearTimeout(timer); - resolve({ baseUrl: match[1], cleanup }); - } - }; - - child.stdout?.on("data", onData); - child.stderr?.on("data", (chunk: Buffer) => { - const text = chunk.toString(); - stderr += text; - // Some wrangler versions print ready message to stderr - const match = text.match(/Ready on (https?:\/\/[^\s]+)/i) ?? text.match(/(https?:\/\/(?:localhost|127\.0\.0\.1):\d+)/); - if (match && !resolved) { - resolved = true; - clearTimeout(timer); - resolve({ baseUrl: match[1], cleanup }); - } - }); - - child.on("error", (err) => { - if (!resolved) { - resolved = true; - clearTimeout(timer); - reject(new Error(`wrangler dev failed to start: ${err.message}`)); - } - }); - - child.on("exit", (code) => { - if (!resolved) { - resolved = true; - clearTimeout(timer); - reject(new Error(`wrangler dev exited with code ${code}.\nstdout: ${stdout}\nstderr: ${stderr}`)); - } - }); - }); -} - -describe("cloudflare example", () => { - testFn( - "starts wrangler dev and sandbox-agent responds via proxy", - async () => { - const { baseUrl, cleanup } = await startWranglerDev(); - try { - // The Cloudflare example proxies requests through /sandbox/:name/proxy/* - // Wait for the container inside the Durable Object to start sandbox-agent - const healthUrl = `${baseUrl}/sandbox/test/proxy/v1/health`; - - let healthy = false; - for (let i = 0; i < 120; i++) { - try { - const res = await fetch(healthUrl); - if (res.ok) { - const data = await res.json(); - // The proxied health endpoint returns {name: "Sandbox Agent", ...} - if (data.status === "ok" || data.name === "Sandbox Agent") { - healthy = true; - break; - } - } - } catch {} - await new Promise((r) => setTimeout(r, 2000)); - } - expect(healthy).toBe(true); - - // Confirm a second request also works - const response = await fetch(healthUrl); - expect(response.ok).toBe(true); - } finally { - cleanup(); - } - }, - 
timeoutMs, - ); -}); diff --git a/examples/computesdk/package.json b/examples/computesdk/package.json index 243b3b1..e22b51b 100644 --- a/examples/computesdk/package.json +++ b/examples/computesdk/package.json @@ -3,7 +3,7 @@ "private": true, "type": "module", "scripts": { - "start": "tsx src/index.ts", + "start": "tsx src/computesdk.ts", "typecheck": "tsc --noEmit" }, "dependencies": { diff --git a/examples/computesdk/src/computesdk.ts b/examples/computesdk/src/computesdk.ts new file mode 100644 index 0000000..46f43d6 --- /dev/null +++ b/examples/computesdk/src/computesdk.ts @@ -0,0 +1,151 @@ +import { + compute, + detectProvider, + getMissingEnvVars, + getProviderConfigFromEnv, + isProviderAuthComplete, + isValidProvider, + PROVIDER_NAMES, + type ExplicitComputeConfig, + type ProviderName, +} from "computesdk"; +import { SandboxAgent } from "sandbox-agent"; +import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared"; +import { fileURLToPath } from "node:url"; +import { resolve } from "node:path"; + +const PORT = 3000; +const REQUEST_TIMEOUT_MS = Number.parseInt(process.env.COMPUTESDK_TIMEOUT_MS || "", 10) || 120_000; + +/** + * Detects and validates the provider to use. + * Priority: COMPUTESDK_PROVIDER env var > auto-detection from API keys + */ +function resolveProvider(): ProviderName { + const providerOverride = process.env.COMPUTESDK_PROVIDER; + + if (providerOverride) { + if (!isValidProvider(providerOverride)) { + throw new Error(`Unsupported ComputeSDK provider "${providerOverride}". Supported providers: ${PROVIDER_NAMES.join(", ")}`); + } + if (!isProviderAuthComplete(providerOverride)) { + const missing = getMissingEnvVars(providerOverride); + throw new Error(`Missing credentials for provider "${providerOverride}". 
Set: ${missing.join(", ")}`);
+    }
+    console.log(`Using ComputeSDK provider: ${providerOverride} (explicit)`);
+    return providerOverride as ProviderName;
+  }
+
+  const detected = detectProvider();
+  if (!detected) {
+    throw new Error(`No provider credentials found. Set one of: ${PROVIDER_NAMES.map((p) => getMissingEnvVars(p).join(", ")).join(" | ")}`);
+  }
+  console.log(`Using ComputeSDK provider: ${detected} (auto-detected)`);
+  return detected as ProviderName;
+}
+
+function configureComputeSDK(): void {
+  const provider = resolveProvider();
+
+  const config: ExplicitComputeConfig = {
+    provider,
+    computesdkApiKey: process.env.COMPUTESDK_API_KEY,
+    requestTimeoutMs: REQUEST_TIMEOUT_MS,
+  };
+
+  const providerConfig = getProviderConfigFromEnv(provider);
+  if (Object.keys(providerConfig).length > 0) {
+    const configWithProvider = config as ExplicitComputeConfig & Record<string, Record<string, unknown>>;
+    configWithProvider[provider] = providerConfig;
+  }
+
+  compute.setConfig(config);
+}
+
+configureComputeSDK();
+
+const buildEnv = (): Record<string, string> => {
+  const env: Record<string, string> = {};
+  if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+  if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+  return env;
+};
+
+export async function setupComputeSdkSandboxAgent(): Promise<{
+  baseUrl: string;
+  cleanup: () => Promise<void>;
+}> {
+  const env = buildEnv();
+
+  console.log("Creating ComputeSDK sandbox...");
+  const sandbox = await compute.sandbox.create({
+    envs: Object.keys(env).length > 0 ?
env : undefined, + }); + + const run = async (cmd: string, options?: { background?: boolean }) => { + const result = await sandbox.runCommand(cmd, options); + if (typeof result?.exitCode === "number" && result.exitCode !== 0) { + throw new Error(`Command failed: ${cmd} (exit ${result.exitCode})\n${result.stderr || ""}`); + } + return result; + }; + + console.log("Installing sandbox-agent..."); + await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh"); + + if (env.ANTHROPIC_API_KEY) { + console.log("Installing Claude agent..."); + await run("sandbox-agent install-agent claude"); + } + + if (env.OPENAI_API_KEY) { + console.log("Installing Codex agent..."); + await run("sandbox-agent install-agent codex"); + } + + console.log("Starting server..."); + await run(`sandbox-agent server --no-token --host 0.0.0.0 --port ${PORT}`, { background: true }); + + const baseUrl = await sandbox.getUrl({ port: PORT }); + + const cleanup = async () => { + try { + await sandbox.destroy(); + } catch (error) { + console.warn("Cleanup failed:", error instanceof Error ? 
error.message : error);
+    }
+  };
+
+  return { baseUrl, cleanup };
+}
+
+export async function runComputeSdkExample(): Promise<void> {
+  const { baseUrl, cleanup } = await setupComputeSdkSandboxAgent();
+
+  const handleExit = async () => {
+    await cleanup();
+    process.exit(0);
+  };
+
+  process.once("SIGINT", handleExit);
+  process.once("SIGTERM", handleExit);
+
+  const client = await SandboxAgent.connect({ baseUrl });
+  const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home", mcpServers: [] } });
+  const sessionId = session.id;
+
+  console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`);
+  console.log(" Press Ctrl+C to stop.");
+
+  // Keep alive until SIGINT/SIGTERM triggers cleanup above
+  await new Promise(() => {});
+}
+
+const isDirectRun = Boolean(process.argv[1] && resolve(process.argv[1]) === fileURLToPath(import.meta.url));
+
+if (isDirectRun) {
+  runComputeSdkExample().catch((error) => {
+    console.error(error instanceof Error ? error.message : error);
+    process.exit(1);
+  });
+}
diff --git a/examples/computesdk/src/index.ts b/examples/computesdk/src/index.ts
deleted file mode 100644
index 63d4aee..0000000
--- a/examples/computesdk/src/index.ts
+++ /dev/null
@@ -1,30 +0,0 @@
-import { SandboxAgent } from "sandbox-agent";
-import { computesdk } from "sandbox-agent/computesdk";
-import { detectAgent } from "@sandbox-agent/example-shared";
-
-const envs: Record<string, string> = {};
-if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-
-const client = await SandboxAgent.start({
-  sandbox: computesdk({
-    create: { envs },
-  }),
-});
-
-console.log(`UI: ${client.inspectorUrl}`);
-
-const session = await client.createSession({
-  agent: detectAgent(),
-});
-
-session.onEvent((event) => {
-  console.log(`[${event.sender}]`, JSON.stringify(event.payload));
-});
-
-session.prompt([{ type: "text", text: "Say hello from ComputeSDK in one sentence." }]);
-
-process.once("SIGINT", async () => {
-  await client.destroySandbox();
-  process.exit(0);
-});
diff --git a/examples/computesdk/tests/computesdk.test.ts b/examples/computesdk/tests/computesdk.test.ts
index 61ebb2c..0bbd24c 100644
--- a/examples/computesdk/tests/computesdk.test.ts
+++ b/examples/computesdk/tests/computesdk.test.ts
@@ -1,6 +1,6 @@
 import { describe, it, expect } from "vitest";
-import { SandboxAgent } from "sandbox-agent";
-import { computesdk } from "sandbox-agent/computesdk";
+import { buildHeaders } from "@sandbox-agent/example-shared";
+import { setupComputeSdkSandboxAgent } from "../src/computesdk.ts";
 
 const hasModal = Boolean(process.env.MODAL_TOKEN_ID && process.env.MODAL_TOKEN_SECRET);
 const hasVercel = Boolean(process.env.VERCEL_TOKEN || process.env.VERCEL_OIDC_TOKEN);
@@ -13,23 +13,20 @@ const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10)
 
 const testFn = shouldRun ? it : it.skip;
 
-describe("computesdk provider", () => {
+describe("computesdk example", () => {
   testFn(
     "starts sandbox-agent and responds to /v1/health",
     async () => {
-      const envs: Record<string, string> = {};
-      if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-      if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-
-      const sdk = await SandboxAgent.start({
-        sandbox: computesdk({ create: { envs } }),
-      });
-
+      const { baseUrl, cleanup } = await setupComputeSdkSandboxAgent();
       try {
-        const health = await sdk.getHealth();
-        expect(health.status).toBe("ok");
+        const response = await fetch(`${baseUrl}/v1/health`, {
+          headers: buildHeaders({}),
+        });
+        expect(response.ok).toBe(true);
+        const data = await response.json();
+        expect(data.status).toBe("ok");
       } finally {
-        await sdk.destroySandbox();
+        await cleanup();
       }
     },
     timeoutMs,
diff --git a/examples/computesdk/tsconfig.json b/examples/computesdk/tsconfig.json
index ad591c3..96ba2fd 100644
--- a/examples/computesdk/tsconfig.json
+++ b/examples/computesdk/tsconfig.json
@@ -9,8 +9,7 @@ "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true,
-    "types": ["node"]
+    "resolveJsonModule": true
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
 }
diff --git a/examples/daytona/src/daytona.ts b/examples/daytona/src/daytona.ts
deleted file mode 100644
index ccffc94..0000000
--- a/examples/daytona/src/daytona.ts
+++ /dev/null
@@ -1,33 +0,0 @@
-import { SandboxAgent } from "sandbox-agent";
-import { daytona } from "sandbox-agent/daytona";
-
-function collectEnvVars(): Record<string, string> {
-  const envVars: Record<string, string> = {};
-  if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-  if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-  return envVars;
-}
-
-function inspectorUrlToBaseUrl(inspectorUrl: string): string {
-  return inspectorUrl.replace(/\/ui\/$/, "");
-}
-
-export async function setupDaytonaSandboxAgent(): Promise<{
-  baseUrl: string;
-  token?: string;
-  extraHeaders?: Record<string, string>;
-  cleanup: () => Promise<void>;
-}> {
-  const client = await SandboxAgent.start({
-    sandbox: daytona({
-      create: { envVars: collectEnvVars() },
-    }),
-  });
-
-  return {
-    baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl),
-    cleanup: async () => {
-      await client.killSandbox();
-    },
-  };
-}
diff --git a/examples/daytona/src/index.ts b/examples/daytona/src/index.ts
index 9c4cf85..09f4cff 100644
--- a/examples/daytona/src/index.ts
+++ b/examples/daytona/src/index.ts
@@ -1,30 +1,42 @@
+import { Daytona } from "@daytonaio/sdk";
 import { SandboxAgent } from "sandbox-agent";
-import { daytona } from "sandbox-agent/daytona";
-import { detectAgent } from "@sandbox-agent/example-shared";
+import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
+
+const daytona = new Daytona();
 
 const envVars: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY =
process.env.ANTHROPIC_API_KEY; if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY; -const client = await SandboxAgent.start({ - sandbox: daytona({ - create: { envVars }, - }), -}); +// Use default image and install sandbox-agent at runtime (faster startup, no snapshot build) +console.log("Creating Daytona sandbox..."); +const sandbox = await daytona.create({ envVars, autoStopInterval: 0 }); -console.log(`UI: ${client.inspectorUrl}`); +// Install sandbox-agent and start server +console.log("Installing sandbox-agent..."); +await sandbox.process.executeCommand("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"); -const session = await client.createSession({ - agent: detectAgent(), -}); +console.log("Installing agents..."); +await sandbox.process.executeCommand("sandbox-agent install-agent claude"); +await sandbox.process.executeCommand("sandbox-agent install-agent codex"); -session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); -}); +await sandbox.process.executeCommand("nohup sandbox-agent server --no-token --host 0.0.0.0 --port 3000 >/tmp/sandbox-agent.log 2>&1 &"); -session.prompt([{ type: "text", text: "Say hello from Daytona in one sentence." 
}]); +const baseUrl = (await sandbox.getSignedPreviewUrl(3000, 4 * 60 * 60)).url; -process.once("SIGINT", async () => { - await client.destroySandbox(); +console.log("Connecting to server..."); +const client = await SandboxAgent.connect({ baseUrl }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/daytona", mcpServers: [] } }); +const sessionId = session.id; + +console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); +console.log(" Press Ctrl+C to stop."); + +const keepAlive = setInterval(() => {}, 60_000); +const cleanup = async () => { + clearInterval(keepAlive); + await sandbox.delete(60); process.exit(0); -}); +}; +process.once("SIGINT", cleanup); +process.once("SIGTERM", cleanup); diff --git a/examples/daytona/tsconfig.json b/examples/daytona/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/daytona/tsconfig.json +++ b/examples/daytona/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/docker/package.json b/examples/docker/package.json index 7b796c9..2c29cfe 100644 --- a/examples/docker/package.json +++ b/examples/docker/package.json @@ -9,10 +9,10 @@ "dependencies": { "@sandbox-agent/example-shared": "workspace:*", "dockerode": "latest", - "get-port": "latest", "sandbox-agent": "workspace:*" }, "devDependencies": { + "@types/dockerode": "latest", "@types/node": "latest", "tsx": "latest", "typescript": "latest", diff --git a/examples/docker/src/index.ts b/examples/docker/src/index.ts index 9f50859..74469f3 100644 --- a/examples/docker/src/index.ts +++ b/examples/docker/src/index.ts @@ -1,40 +1,68 @@ +import Docker from "dockerode"; import fs from "node:fs"; import path from "node:path"; import { SandboxAgent } from "sandbox-agent"; -import { docker } from "sandbox-agent/docker"; -import 
{ detectAgent } from "@sandbox-agent/example-shared";
+import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
 import { FULL_IMAGE } from "@sandbox-agent/example-shared/docker";
 
+const IMAGE = FULL_IMAGE;
+const PORT = 3000;
+const agent = detectAgent();
 const codexAuthPath = process.env.HOME ? path.join(process.env.HOME, ".codex", "auth.json") : null;
 const bindMounts = codexAuthPath && fs.existsSync(codexAuthPath) ? [`${codexAuthPath}:/home/sandbox/.codex/auth.json:ro`] : [];
-const env = [
-  process.env.ANTHROPIC_API_KEY ? `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}` : "",
-  process.env.OPENAI_API_KEY ? `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}` : "",
-  process.env.CODEX_API_KEY ? `CODEX_API_KEY=${process.env.CODEX_API_KEY}` : "",
-].filter(Boolean);
-const client = await SandboxAgent.start({
-  sandbox: docker({
-    image: FULL_IMAGE,
-    env,
-    binds: bindMounts,
-  }),
+const docker = new Docker({ socketPath: "/var/run/docker.sock" });
+
+// Pull image if needed
+try {
+  await docker.getImage(IMAGE).inspect();
+} catch {
+  console.log(`Pulling ${IMAGE}...`);
+  await new Promise<void>((resolve, reject) => {
+    docker.pull(IMAGE, (err: Error | null, stream: NodeJS.ReadableStream) => {
+      if (err) return reject(err);
+      docker.modem.followProgress(stream, (err: Error | null) => (err ? reject(err) : resolve()));
+    });
+  });
+}
+
+console.log("Starting container...");
+const container = await docker.createContainer({
+  Image: IMAGE,
+  Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", `${PORT}`],
+  Env: [
+    process.env.ANTHROPIC_API_KEY ? `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}` : "",
+    process.env.OPENAI_API_KEY ? `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}` : "",
+    process.env.CODEX_API_KEY ?
`CODEX_API_KEY=${process.env.CODEX_API_KEY}` : "", + ].filter(Boolean), + ExposedPorts: { [`${PORT}/tcp`]: {} }, + HostConfig: { + AutoRemove: true, + PortBindings: { [`${PORT}/tcp`]: [{ HostPort: `${PORT}` }] }, + Binds: bindMounts, + }, }); +await container.start(); -console.log(`UI: ${client.inspectorUrl}`); +const baseUrl = `http://127.0.0.1:${PORT}`; -const session = await client.createSession({ - agent: detectAgent(), - cwd: "/home/sandbox", -}); +const client = await SandboxAgent.connect({ baseUrl }); +const session = await client.createSession({ agent, sessionInit: { cwd: "/home/sandbox", mcpServers: [] } }); +const sessionId = session.id; -session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); -}); +console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); +console.log(" Press Ctrl+C to stop."); -session.prompt([{ type: "text", text: "Say hello from Docker in one sentence." }]); - -process.once("SIGINT", async () => { - await client.destroySandbox(); +const keepAlive = setInterval(() => {}, 60_000); +const cleanup = async () => { + clearInterval(keepAlive); + try { + await container.stop({ t: 5 }); + } catch {} + try { + await container.remove({ force: true }); + } catch {} process.exit(0); -}); +}; +process.once("SIGINT", cleanup); +process.once("SIGTERM", cleanup); diff --git a/examples/docker/tests/docker.test.ts b/examples/docker/tests/docker.test.ts index 683f033..66730f0 100644 --- a/examples/docker/tests/docker.test.ts +++ b/examples/docker/tests/docker.test.ts @@ -1,15 +1,8 @@ import { describe, it, expect } from "vitest"; -import { startDockerSandbox } from "@sandbox-agent/example-shared/docker"; +import { buildHeaders } from "@sandbox-agent/example-shared"; +import { setupDockerSandboxAgent } from "../src/docker.ts"; -/** - * Docker integration test. - * - * Set SANDBOX_AGENT_DOCKER_IMAGE to the image tag to test (e.g. a locally-built - * full image). 
The test starts a container from that image, waits for - * sandbox-agent to become healthy, and validates the /v1/health endpoint. - */ -const image = process.env.SANDBOX_AGENT_DOCKER_IMAGE; -const shouldRun = Boolean(image); +const shouldRun = process.env.RUN_DOCKER_EXAMPLES === "1"; const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000; const testFn = shouldRun ? it : it.skip; @@ -18,29 +11,11 @@ describe("docker example", () => { testFn( "starts sandbox-agent and responds to /v1/health", async () => { - const { baseUrl, cleanup } = await startDockerSandbox({ - port: 2468, - image: image!, - }); + const { baseUrl, token, cleanup } = await setupDockerSandboxAgent(); try { - // Wait for health check - let healthy = false; - for (let i = 0; i < 60; i++) { - try { - const res = await fetch(`${baseUrl}/v1/health`); - if (res.ok) { - const data = await res.json(); - if (data.status === "ok") { - healthy = true; - break; - } - } - } catch {} - await new Promise((r) => setTimeout(r, 1000)); - } - expect(healthy).toBe(true); - - const response = await fetch(`${baseUrl}/v1/health`); + const response = await fetch(`${baseUrl}/v1/health`, { + headers: buildHeaders({ token }), + }); expect(response.ok).toBe(true); const data = await response.json(); expect(data.status).toBe("ok"); diff --git a/examples/docker/tsconfig.json b/examples/docker/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/docker/tsconfig.json +++ b/examples/docker/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/e2b/src/e2b.ts b/examples/e2b/src/e2b.ts deleted file mode 100644 index 17762a2..0000000 --- a/examples/e2b/src/e2b.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { SandboxAgent } from "sandbox-agent"; -import { e2b } from "sandbox-agent/e2b"; - 
-function collectEnvVars(): Record<string, string> {
-  const envs: Record<string, string> = {};
-  if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
-  if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
-  return envs;
-}
-
-function inspectorUrlToBaseUrl(inspectorUrl: string): string {
-  return inspectorUrl.replace(/\/ui\/$/, "");
-}
-
-export async function setupE2BSandboxAgent(): Promise<{
-  baseUrl: string;
-  token?: string;
-  cleanup: () => Promise<void>;
-}> {
-  const template = process.env.E2B_TEMPLATE;
-  const client = await SandboxAgent.start({
-    sandbox: e2b({
-      template,
-      create: { envs: collectEnvVars() },
-    }),
-  });
-
-  return {
-    baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl),
-    cleanup: async () => {
-      await client.killSandbox();
-    },
-  };
-}
diff --git a/examples/e2b/src/index.ts b/examples/e2b/src/index.ts
index 67b74dc..7dd2882 100644
--- a/examples/e2b/src/index.ts
+++ b/examples/e2b/src/index.ts
@@ -1,28 +1,45 @@
+import { Sandbox } from "@e2b/code-interpreter";
 import { SandboxAgent } from "sandbox-agent";
-import { e2b } from "sandbox-agent/e2b";
-import { detectAgent } from "@sandbox-agent/example-shared";
+import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared";
 
 const envs: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
 
-const template = process.env.E2B_TEMPLATE;
-const client = await SandboxAgent.start({
-  // ✨ NEW ✨
-  sandbox: e2b({ template, create: { envs } }),
-});
+console.log("Creating E2B sandbox...");
+const sandbox = await Sandbox.create({ allowInternetAccess: true, envs });
 
-const session = await client.createSession({
-  agent: detectAgent(),
-});
+const run = async (cmd: string) => {
+  const result = await sandbox.commands.run(cmd);
+  if (result.exitCode !== 0) throw new Error(`Command failed: ${cmd}\n${result.stderr}`);
+  return
result; +}; -session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); -}); +console.log("Installing sandbox-agent..."); +await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"); -session.prompt([{ type: "text", text: "Say hello from E2B in one sentence." }]); +console.log("Installing agents..."); +await run("sandbox-agent install-agent claude"); +await run("sandbox-agent install-agent codex"); -process.once("SIGINT", async () => { - await client.destroySandbox(); +console.log("Starting server..."); +await sandbox.commands.run("sandbox-agent server --no-token --host 0.0.0.0 --port 3000", { background: true, timeoutMs: 0 }); + +const baseUrl = `https://${sandbox.getHost(3000)}`; + +console.log("Connecting to server..."); +const client = await SandboxAgent.connect({ baseUrl }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/user", mcpServers: [] } }); +const sessionId = session.id; + +console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); +console.log(" Press Ctrl+C to stop."); + +const keepAlive = setInterval(() => {}, 60_000); +const cleanup = async () => { + clearInterval(keepAlive); + await sandbox.kill(); process.exit(0); -}); +}; +process.once("SIGINT", cleanup); +process.once("SIGTERM", cleanup); diff --git a/examples/e2b/tsconfig.json b/examples/e2b/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/e2b/tsconfig.json +++ b/examples/e2b/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/file-system/src/index.ts b/examples/file-system/src/index.ts index 71d65c0..abe4e08 100644 --- a/examples/file-system/src/index.ts +++ b/examples/file-system/src/index.ts @@ -44,7 +44,7 @@ const readmeText = new 
TextDecoder().decode(readmeBytes); console.log(` README.md content: ${readmeText.trim()}`); console.log("Creating session..."); -const session = await client.createSession({ agent: detectAgent(), cwd: "/opt/my-project" }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/opt/my-project", mcpServers: [] } }); const sessionId = session.id; console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); console.log(' Try: "read the README in /opt/my-project"'); diff --git a/examples/file-system/tsconfig.json b/examples/file-system/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/file-system/tsconfig.json +++ b/examples/file-system/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/mcp-custom-tool/tsconfig.json b/examples/mcp-custom-tool/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/mcp-custom-tool/tsconfig.json +++ b/examples/mcp-custom-tool/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/mcp/tsconfig.json b/examples/mcp/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/mcp/tsconfig.json +++ b/examples/mcp/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/modal/package.json b/examples/modal/package.json deleted file mode 100644 index d3e51ec..0000000 --- a/examples/modal/package.json +++ /dev/null @@ -1,20 +0,0 @@ -{ - "name": "@sandbox-agent/example-modal", - 
"private": true, - "type": "module", - "scripts": { - "start": "tsx src/index.ts", - "typecheck": "tsc --noEmit" - }, - "dependencies": { - "modal": "latest", - "@sandbox-agent/example-shared": "workspace:*", - "sandbox-agent": "workspace:*" - }, - "devDependencies": { - "@types/node": "latest", - "tsx": "latest", - "typescript": "latest", - "vitest": "^3.0.0" - } -} diff --git a/examples/modal/src/index.ts b/examples/modal/src/index.ts deleted file mode 100644 index 35eef8d..0000000 --- a/examples/modal/src/index.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { SandboxAgent } from "sandbox-agent"; -import { modal } from "sandbox-agent/modal"; -import { detectAgent } from "@sandbox-agent/example-shared"; - -const secrets: Record<string, string> = {}; -if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; -if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY; - -const client = await SandboxAgent.start({ - sandbox: modal({ - create: { secrets }, - }), -}); - -console.log(`UI: ${client.inspectorUrl}`); - -const session = await client.createSession({ - agent: detectAgent(), -}); - -session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); -}); - -session.prompt([{ type: "text", text: "Say hello from Modal in one sentence." }]); - -process.once("SIGINT", async () => { - await client.destroySandbox(); - process.exit(0); -}); diff --git a/examples/modal/tests/modal.test.ts b/examples/modal/tests/modal.test.ts deleted file mode 100644 index 010256a..0000000 --- a/examples/modal/tests/modal.test.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { describe, it, expect } from "vitest"; -import { SandboxAgent } from "sandbox-agent"; -import { modal } from "sandbox-agent/modal"; - -const shouldRun = Boolean(process.env.MODAL_TOKEN_ID && process.env.MODAL_TOKEN_SECRET); -const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000; - -const testFn = shouldRun ? 
it : it.skip; - -describe("modal provider", () => { - testFn( - "starts sandbox-agent and responds to /v1/health", - async () => { - const secrets: Record<string, string> = {}; - if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; - if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY; - - const sdk = await SandboxAgent.start({ - sandbox: modal({ create: { secrets } }), - }); - - try { - const health = await sdk.getHealth(); - expect(health.status).toBe("ok"); - } finally { - await sdk.destroySandbox(); - } - }, - timeoutMs, - ); -}); diff --git a/examples/modal/tsconfig.json b/examples/modal/tsconfig.json deleted file mode 100644 index ad591c3..0000000 --- a/examples/modal/tsconfig.json +++ /dev/null @@ -1,17 +0,0 @@ -{ - "compilerOptions": { - "target": "ES2022", - "lib": ["ES2022", "DOM"], - "module": "ESNext", - "moduleResolution": "Bundler", - "allowImportingTsExtensions": true, - "noEmit": true, - "esModuleInterop": true, - "strict": true, - "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] - }, - "include": ["src/**/*"], - "exclude": ["node_modules", "**/*.test.ts"] -} diff --git a/examples/permissions/src/index.ts b/examples/permissions/src/index.ts index e684e34..811f65c 100644 --- a/examples/permissions/src/index.ts +++ b/examples/permissions/src/index.ts @@ -2,7 +2,6 @@ import { createInterface } from "node:readline/promises"; import { stdin as input, stdout as output } from "node:process"; import { Command } from "commander"; import { SandboxAgent, type PermissionReply, type SessionPermissionRequest } from "sandbox-agent"; -import { local } from "sandbox-agent/local"; const options = parseOptions(); const agent = options.agent.trim().toLowerCase(); @@ -10,7 +9,10 @@ const autoReply = parsePermissionReply(options.reply); const promptText = options.prompt?.trim() || `Create ./permission-example.txt with the text 'hello from the ${agent} permissions example'.`; const sdk = await 
SandboxAgent.start({ - sandbox: local({ log: "inherit" }), + spawn: { + enabled: true, + log: "inherit", + }, }); try { @@ -41,7 +43,10 @@ try { const session = await sdk.createSession({ agent, ...(mode ? { mode } : {}), - cwd: process.cwd(), + sessionInit: { + cwd: process.cwd(), + mcpServers: [], + }, }); const rl = autoReply diff --git a/examples/permissions/tsconfig.json b/examples/permissions/tsconfig.json index 4eec283..9c9fe06 100644 --- a/examples/permissions/tsconfig.json +++ b/examples/permissions/tsconfig.json @@ -1,8 +1,7 @@ { "compilerOptions": { "target": "ES2022", - "lib": ["ES2022", "DOM"], - "types": ["node"], + "lib": ["ES2022"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, diff --git a/examples/persist-memory/tsconfig.json b/examples/persist-memory/tsconfig.json index ec2723c..d1c0065 100644 --- a/examples/persist-memory/tsconfig.json +++ b/examples/persist-memory/tsconfig.json @@ -1,15 +1,13 @@ { "compilerOptions": { "target": "ES2022", - "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true, - "types": ["node"] + "skipLibCheck": true }, "include": ["src"] } diff --git a/examples/persist-postgres/package.json b/examples/persist-postgres/package.json index 8445516..8114ffb 100644 --- a/examples/persist-postgres/package.json +++ b/examples/persist-postgres/package.json @@ -8,6 +8,7 @@ }, "dependencies": { "@sandbox-agent/example-shared": "workspace:*", + "@sandbox-agent/persist-postgres": "workspace:*", "pg": "latest", "sandbox-agent": "workspace:*" }, diff --git a/examples/persist-postgres/src/index.ts b/examples/persist-postgres/src/index.ts index 43eecbd..73f9f04 100644 --- a/examples/persist-postgres/src/index.ts +++ b/examples/persist-postgres/src/index.ts @@ -3,7 +3,7 @@ import { randomUUID } from "node:crypto"; import { Client } from "pg"; import { setTimeout as 
delay } from "node:timers/promises"; import { SandboxAgent } from "sandbox-agent"; -import { PostgresSessionPersistDriver } from "./persist.ts"; +import { PostgresSessionPersistDriver } from "@sandbox-agent/persist-postgres"; import { startDockerSandbox } from "@sandbox-agent/example-shared/docker"; import { detectAgent } from "@sandbox-agent/example-shared"; diff --git a/examples/persist-postgres/src/persist.ts b/examples/persist-postgres/src/persist.ts deleted file mode 100644 index 2a6ccff..0000000 --- a/examples/persist-postgres/src/persist.ts +++ /dev/null @@ -1,336 +0,0 @@ -import { Pool, type PoolConfig } from "pg"; -import type { ListEventsRequest, ListPage, ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent"; - -const DEFAULT_LIST_LIMIT = 100; - -export interface PostgresSessionPersistDriverOptions { - connectionString?: string; - pool?: Pool; - poolConfig?: PoolConfig; - schema?: string; -} - -export class PostgresSessionPersistDriver implements SessionPersistDriver { - private readonly pool: Pool; - private readonly ownsPool: boolean; - private readonly schema: string; - private readonly initialized: Promise<void>; - - constructor(options: PostgresSessionPersistDriverOptions = {}) { - this.schema = normalizeSchema(options.schema ?? 
"public"); - - if (options.pool) { - this.pool = options.pool; - this.ownsPool = false; - } else { - this.pool = new Pool({ - connectionString: options.connectionString, - ...options.poolConfig, - }); - this.ownsPool = true; - } - - this.initialized = this.initialize(); - } - - async getSession(id: string): Promise<SessionRecord | undefined> { - await this.ready(); - - const result = await this.pool.query( - `SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - FROM ${this.table("sessions")} - WHERE id = $1`, - [id], - ); - - if (result.rows.length === 0) { - return undefined; - } - - return decodeSessionRow(result.rows[0]); - } - - async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> { - await this.ready(); - - const offset = parseCursor(request.cursor); - const limit = normalizeLimit(request.limit); - - const rowsResult = await this.pool.query( - `SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - FROM ${this.table("sessions")} - ORDER BY created_at ASC, id ASC - LIMIT $1 OFFSET $2`, - [limit, offset], - ); - - const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("sessions")}`); - const total = parseInteger(countResult.rows[0]?.count ?? "0"); - const nextOffset = offset + rowsResult.rows.length; - - return { - items: rowsResult.rows.map(decodeSessionRow), - nextCursor: nextOffset < total ? 
String(nextOffset) : undefined, - }; - } - - async updateSession(session: SessionRecord): Promise<void> { - await this.ready(); - - await this.pool.query( - `INSERT INTO ${this.table("sessions")} ( - id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - ) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10) - ON CONFLICT(id) DO UPDATE SET - agent = EXCLUDED.agent, - agent_session_id = EXCLUDED.agent_session_id, - last_connection_id = EXCLUDED.last_connection_id, - created_at = EXCLUDED.created_at, - destroyed_at = EXCLUDED.destroyed_at, - sandbox_id = EXCLUDED.sandbox_id, - session_init_json = EXCLUDED.session_init_json, - config_options_json = EXCLUDED.config_options_json, - modes_json = EXCLUDED.modes_json`, - [ - session.id, - session.agent, - session.agentSessionId, - session.lastConnectionId, - session.createdAt, - session.destroyedAt ?? null, - session.sandboxId ?? null, - session.sessionInit ? JSON.stringify(session.sessionInit) : null, - session.configOptions ? JSON.stringify(session.configOptions) : null, - session.modes !== undefined ? JSON.stringify(session.modes) : null, - ], - ); - } - - async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> { - await this.ready(); - - const offset = parseCursor(request.cursor); - const limit = normalizeLimit(request.limit); - - const rowsResult = await this.pool.query( - `SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json - FROM ${this.table("events")} - WHERE session_id = $1 - ORDER BY event_index ASC, id ASC - LIMIT $2 OFFSET $3`, - [request.sessionId, limit, offset], - ); - - const countResult = await this.pool.query<{ count: string }>(`SELECT COUNT(*) AS count FROM ${this.table("events")} WHERE session_id = $1`, [ - request.sessionId, - ]); - const total = parseInteger(countResult.rows[0]?.count ?? 
"0"); - const nextOffset = offset + rowsResult.rows.length; - - return { - items: rowsResult.rows.map(decodeEventRow), - nextCursor: nextOffset < total ? String(nextOffset) : undefined, - }; - } - - async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> { - await this.ready(); - - await this.pool.query( - `INSERT INTO ${this.table("events")} ( - id, event_index, session_id, created_at, connection_id, sender, payload_json - ) VALUES ($1, $2, $3, $4, $5, $6, $7) - ON CONFLICT(id) DO UPDATE SET - event_index = EXCLUDED.event_index, - session_id = EXCLUDED.session_id, - created_at = EXCLUDED.created_at, - connection_id = EXCLUDED.connection_id, - sender = EXCLUDED.sender, - payload_json = EXCLUDED.payload_json`, - [event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, event.payload], - ); - } - - async close(): Promise<void> { - if (!this.ownsPool) { - return; - } - await this.pool.end(); - } - - private async ready(): Promise<void> { - await this.initialized; - } - - private table(name: "sessions" | "events"): string { - return `"${this.schema}"."${name}"`; - } - - private async initialize(): Promise<void> { - await this.pool.query(`CREATE SCHEMA IF NOT EXISTS "${this.schema}"`); - - await this.pool.query(` - CREATE TABLE IF NOT EXISTS ${this.table("sessions")} ( - id TEXT PRIMARY KEY, - agent TEXT NOT NULL, - agent_session_id TEXT NOT NULL, - last_connection_id TEXT NOT NULL, - created_at BIGINT NOT NULL, - destroyed_at BIGINT, - sandbox_id TEXT, - session_init_json JSONB, - config_options_json JSONB, - modes_json JSONB - ) - `); - - await this.pool.query(` - ALTER TABLE ${this.table("sessions")} - ADD COLUMN IF NOT EXISTS sandbox_id TEXT - `); - - await this.pool.query(` - ALTER TABLE ${this.table("sessions")} - ADD COLUMN IF NOT EXISTS config_options_json JSONB - `); - - await this.pool.query(` - ALTER TABLE ${this.table("sessions")} - ADD COLUMN IF NOT EXISTS modes_json JSONB - `); - - await this.pool.query(` - CREATE TABLE IF 
NOT EXISTS ${this.table("events")} ( - id TEXT PRIMARY KEY, - event_index BIGINT NOT NULL, - session_id TEXT NOT NULL, - created_at BIGINT NOT NULL, - connection_id TEXT NOT NULL, - sender TEXT NOT NULL, - payload_json JSONB NOT NULL - ) - `); - - await this.pool.query(` - ALTER TABLE ${this.table("events")} - ALTER COLUMN id TYPE TEXT USING id::TEXT - `); - - await this.pool.query(` - ALTER TABLE ${this.table("events")} - ADD COLUMN IF NOT EXISTS event_index BIGINT - `); - - await this.pool.query(` - WITH ranked AS ( - SELECT id, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC) AS ranked_index - FROM ${this.table("events")} - ) - UPDATE ${this.table("events")} AS current_events - SET event_index = ranked.ranked_index - FROM ranked - WHERE current_events.id = ranked.id - AND current_events.event_index IS NULL - `); - - await this.pool.query(` - ALTER TABLE ${this.table("events")} - ALTER COLUMN event_index SET NOT NULL - `); - - await this.pool.query(` - CREATE INDEX IF NOT EXISTS idx_events_session_order - ON ${this.table("events")}(session_id, event_index, id) - `); - } -} - -type SessionRow = { - id: string; - agent: string; - agent_session_id: string; - last_connection_id: string; - created_at: string | number; - destroyed_at: string | number | null; - sandbox_id: string | null; - session_init_json: unknown | null; - config_options_json: unknown | null; - modes_json: unknown | null; -}; - -type EventRow = { - id: string | number; - event_index: string | number; - session_id: string; - created_at: string | number; - connection_id: string; - sender: string; - payload_json: unknown; -}; - -function decodeSessionRow(row: SessionRow): SessionRecord { - return { - id: row.id, - agent: row.agent, - agentSessionId: row.agent_session_id, - lastConnectionId: row.last_connection_id, - createdAt: parseInteger(row.created_at), - destroyedAt: row.destroyed_at === null ? undefined : parseInteger(row.destroyed_at), - sandboxId: row.sandbox_id ?? 
undefined, - sessionInit: row.session_init_json ? (row.session_init_json as SessionRecord["sessionInit"]) : undefined, - configOptions: row.config_options_json ? (row.config_options_json as SessionRecord["configOptions"]) : undefined, - modes: row.modes_json ? (row.modes_json as SessionRecord["modes"]) : undefined, - }; -} - -function decodeEventRow(row: EventRow): SessionEvent { - return { - id: String(row.id), - eventIndex: parseInteger(row.event_index), - sessionId: row.session_id, - createdAt: parseInteger(row.created_at), - connectionId: row.connection_id, - sender: parseSender(row.sender), - payload: row.payload_json as SessionEvent["payload"], - }; -} - -function normalizeLimit(limit: number | undefined): number { - if (!Number.isFinite(limit) || (limit ?? 0) < 1) { - return DEFAULT_LIST_LIMIT; - } - return Math.floor(limit as number); -} - -function parseCursor(cursor: string | undefined): number { - if (!cursor) { - return 0; - } - const parsed = Number.parseInt(cursor, 10); - if (!Number.isFinite(parsed) || parsed < 0) { - return 0; - } - return parsed; -} - -function parseInteger(value: string | number): number { - const parsed = typeof value === "number" ? value : Number.parseInt(value, 10); - if (!Number.isFinite(parsed)) { - throw new Error(`Invalid integer value returned by postgres: ${String(value)}`); - } - return parsed; -} - -function parseSender(value: string): SessionEvent["sender"] { - if (value === "agent" || value === "client") { - return value; - } - throw new Error(`Invalid sender value returned by postgres: ${value}`); -} - -function normalizeSchema(schema: string): string { - if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(schema)) { - throw new Error(`Invalid schema name '${schema}'. 
Use letters, numbers, and underscores only.`); - } - return schema; -} diff --git a/examples/persist-postgres/tsconfig.json b/examples/persist-postgres/tsconfig.json index ec2723c..d1c0065 100644 --- a/examples/persist-postgres/tsconfig.json +++ b/examples/persist-postgres/tsconfig.json @@ -1,15 +1,13 @@ { "compilerOptions": { "target": "ES2022", - "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true, - "types": ["node"] + "skipLibCheck": true }, "include": ["src"] } diff --git a/examples/persist-sqlite/package.json b/examples/persist-sqlite/package.json index be6bf0d..8b7b822 100644 --- a/examples/persist-sqlite/package.json +++ b/examples/persist-sqlite/package.json @@ -8,11 +8,10 @@ }, "dependencies": { "@sandbox-agent/example-shared": "workspace:*", - "better-sqlite3": "^11.0.0", + "@sandbox-agent/persist-sqlite": "workspace:*", "sandbox-agent": "workspace:*" }, "devDependencies": { - "@types/better-sqlite3": "^7.0.0", "@types/node": "latest", "tsx": "latest", "typescript": "latest" diff --git a/examples/persist-sqlite/src/index.ts b/examples/persist-sqlite/src/index.ts index 943e902..d2c4ef2 100644 --- a/examples/persist-sqlite/src/index.ts +++ b/examples/persist-sqlite/src/index.ts @@ -1,5 +1,5 @@ import { SandboxAgent } from "sandbox-agent"; -import { SQLiteSessionPersistDriver } from "./persist.ts"; +import { SQLiteSessionPersistDriver } from "@sandbox-agent/persist-sqlite"; import { startDockerSandbox } from "@sandbox-agent/example-shared/docker"; import { detectAgent } from "@sandbox-agent/example-shared"; diff --git a/examples/persist-sqlite/src/persist.ts b/examples/persist-sqlite/src/persist.ts deleted file mode 100644 index 2292903..0000000 --- a/examples/persist-sqlite/src/persist.ts +++ /dev/null @@ -1,310 +0,0 @@ -import Database from "better-sqlite3"; -import type { ListEventsRequest, ListPage, 
ListPageRequest, SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent"; - -const DEFAULT_LIST_LIMIT = 100; - -export interface SQLiteSessionPersistDriverOptions { - filename?: string; -} - -export class SQLiteSessionPersistDriver implements SessionPersistDriver { - private readonly db: Database.Database; - - constructor(options: SQLiteSessionPersistDriverOptions = {}) { - this.db = new Database(options.filename ?? ":memory:"); - this.initialize(); - } - - async getSession(id: string): Promise<SessionRecord | undefined> { - const row = this.db - .prepare( - `SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - FROM sessions WHERE id = ?`, - ) - .get(id) as SessionRow | undefined; - - if (!row) { - return undefined; - } - - return decodeSessionRow(row); - } - - async listSessions(request: ListPageRequest = {}): Promise<ListPage<SessionRecord>> { - const offset = parseCursor(request.cursor); - const limit = normalizeLimit(request.limit); - - const rows = this.db - .prepare( - `SELECT id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - FROM sessions - ORDER BY created_at ASC, id ASC - LIMIT ? OFFSET ?`, - ) - .all(limit, offset) as SessionRow[]; - - const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM sessions`).get() as { count: number }; - const nextOffset = offset + rows.length; - - return { - items: rows.map(decodeSessionRow), - nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined, - }; - } - - async updateSession(session: SessionRecord): Promise<void> { - this.db - .prepare( - `INSERT INTO sessions ( - id, agent, agent_session_id, last_connection_id, created_at, destroyed_at, sandbox_id, session_init_json, config_options_json, modes_json - ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?) 
- ON CONFLICT(id) DO UPDATE SET - agent = excluded.agent, - agent_session_id = excluded.agent_session_id, - last_connection_id = excluded.last_connection_id, - created_at = excluded.created_at, - destroyed_at = excluded.destroyed_at, - sandbox_id = excluded.sandbox_id, - session_init_json = excluded.session_init_json, - config_options_json = excluded.config_options_json, - modes_json = excluded.modes_json`, - ) - .run( - session.id, - session.agent, - session.agentSessionId, - session.lastConnectionId, - session.createdAt, - session.destroyedAt ?? null, - session.sandboxId ?? null, - session.sessionInit ? JSON.stringify(session.sessionInit) : null, - session.configOptions ? JSON.stringify(session.configOptions) : null, - session.modes !== undefined ? JSON.stringify(session.modes) : null, - ); - } - - async listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>> { - const offset = parseCursor(request.cursor); - const limit = normalizeLimit(request.limit); - - const rows = this.db - .prepare( - `SELECT id, event_index, session_id, created_at, connection_id, sender, payload_json - FROM events - WHERE session_id = ? - ORDER BY event_index ASC, id ASC - LIMIT ? OFFSET ?`, - ) - .all(request.sessionId, limit, offset) as EventRow[]; - - const countRow = this.db.prepare(`SELECT COUNT(*) as count FROM events WHERE session_id = ?`).get(request.sessionId) as { count: number }; - - const nextOffset = offset + rows.length; - - return { - items: rows.map(decodeEventRow), - nextCursor: nextOffset < countRow.count ? String(nextOffset) : undefined, - }; - } - - async insertEvent(_sessionId: string, event: SessionEvent): Promise<void> { - this.db - .prepare( - `INSERT INTO events ( - id, event_index, session_id, created_at, connection_id, sender, payload_json - ) VALUES (?, ?, ?, ?, ?, ?, ?) 
- ON CONFLICT(id) DO UPDATE SET - event_index = excluded.event_index, - session_id = excluded.session_id, - created_at = excluded.created_at, - connection_id = excluded.connection_id, - sender = excluded.sender, - payload_json = excluded.payload_json`, - ) - .run(event.id, event.eventIndex, event.sessionId, event.createdAt, event.connectionId, event.sender, JSON.stringify(event.payload)); - } - - close(): void { - this.db.close(); - } - - private initialize(): void { - this.db.exec(` - CREATE TABLE IF NOT EXISTS sessions ( - id TEXT PRIMARY KEY, - agent TEXT NOT NULL, - agent_session_id TEXT NOT NULL, - last_connection_id TEXT NOT NULL, - created_at INTEGER NOT NULL, - destroyed_at INTEGER, - sandbox_id TEXT, - session_init_json TEXT, - config_options_json TEXT, - modes_json TEXT - ) - `); - - const sessionColumns = this.db.prepare(`PRAGMA table_info(sessions)`).all() as TableInfoRow[]; - if (!sessionColumns.some((column) => column.name === "sandbox_id")) { - this.db.exec(`ALTER TABLE sessions ADD COLUMN sandbox_id TEXT`); - } - if (!sessionColumns.some((column) => column.name === "config_options_json")) { - this.db.exec(`ALTER TABLE sessions ADD COLUMN config_options_json TEXT`); - } - if (!sessionColumns.some((column) => column.name === "modes_json")) { - this.db.exec(`ALTER TABLE sessions ADD COLUMN modes_json TEXT`); - } - - this.ensureEventsTable(); - } - - private ensureEventsTable(): void { - const tableInfo = this.db.prepare(`PRAGMA table_info(events)`).all() as TableInfoRow[]; - if (tableInfo.length === 0) { - this.createEventsTable(); - return; - } - - const idColumn = tableInfo.find((column) => column.name === "id"); - const hasEventIndex = tableInfo.some((column) => column.name === "event_index"); - const idType = (idColumn?.type ?? 
"").trim().toUpperCase(); - const idIsText = idType === "TEXT"; - - if (!idIsText || !hasEventIndex) { - this.rebuildEventsTable(hasEventIndex); - } - - this.db.exec(` - CREATE INDEX IF NOT EXISTS idx_events_session_order - ON events(session_id, event_index, id) - `); - } - - private createEventsTable(): void { - this.db.exec(` - CREATE TABLE IF NOT EXISTS events ( - id TEXT PRIMARY KEY, - event_index INTEGER NOT NULL, - session_id TEXT NOT NULL, - created_at INTEGER NOT NULL, - connection_id TEXT NOT NULL, - sender TEXT NOT NULL, - payload_json TEXT NOT NULL - ); - - CREATE INDEX IF NOT EXISTS idx_events_session_order - ON events(session_id, event_index, id) - `); - } - - private rebuildEventsTable(hasEventIndex: boolean): void { - this.db.exec(` - ALTER TABLE events RENAME TO events_legacy; - `); - - this.createEventsTable(); - - if (hasEventIndex) { - this.db.exec(` - INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json) - SELECT - CAST(id AS TEXT), - COALESCE(event_index, ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC)), - session_id, - created_at, - connection_id, - sender, - payload_json - FROM events_legacy - `); - } else { - this.db.exec(` - INSERT INTO events (id, event_index, session_id, created_at, connection_id, sender, payload_json) - SELECT - CAST(id AS TEXT), - ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY created_at ASC, id ASC), - session_id, - created_at, - connection_id, - sender, - payload_json - FROM events_legacy - `); - } - - this.db.exec(`DROP TABLE events_legacy`); - } -} - -type SessionRow = { - id: string; - agent: string; - agent_session_id: string; - last_connection_id: string; - created_at: number; - destroyed_at: number | null; - sandbox_id: string | null; - session_init_json: string | null; - config_options_json: string | null; - modes_json: string | null; -}; - -type EventRow = { - id: string; - event_index: number; - session_id: string; - created_at: 
number; - connection_id: string; - sender: "client" | "agent"; - payload_json: string; -}; - -type TableInfoRow = { - name: string; - type: string; -}; - -function decodeSessionRow(row: SessionRow): SessionRecord { - return { - id: row.id, - agent: row.agent, - agentSessionId: row.agent_session_id, - lastConnectionId: row.last_connection_id, - createdAt: row.created_at, - destroyedAt: row.destroyed_at ?? undefined, - sandboxId: row.sandbox_id ?? undefined, - sessionInit: row.session_init_json ? (JSON.parse(row.session_init_json) as SessionRecord["sessionInit"]) : undefined, - configOptions: row.config_options_json ? (JSON.parse(row.config_options_json) as SessionRecord["configOptions"]) : undefined, - modes: row.modes_json ? (JSON.parse(row.modes_json) as SessionRecord["modes"]) : undefined, - }; -} - -function decodeEventRow(row: EventRow): SessionEvent { - return { - id: row.id, - eventIndex: row.event_index, - sessionId: row.session_id, - createdAt: row.created_at, - connectionId: row.connection_id, - sender: row.sender, - payload: JSON.parse(row.payload_json), - }; -} - -function normalizeLimit(limit: number | undefined): number { - if (!Number.isFinite(limit) || (limit ?? 
0) < 1) { - return DEFAULT_LIST_LIMIT; - } - return Math.floor(limit as number); -} - -function parseCursor(cursor: string | undefined): number { - if (!cursor) { - return 0; - } - const parsed = Number.parseInt(cursor, 10); - if (!Number.isFinite(parsed) || parsed < 0) { - return 0; - } - return parsed; -} diff --git a/examples/persist-sqlite/tsconfig.json b/examples/persist-sqlite/tsconfig.json index ec2723c..d1c0065 100644 --- a/examples/persist-sqlite/tsconfig.json +++ b/examples/persist-sqlite/tsconfig.json @@ -1,15 +1,13 @@ { "compilerOptions": { "target": "ES2022", - "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true, - "types": ["node"] + "skipLibCheck": true }, "include": ["src"] } diff --git a/examples/shared/src/docker.ts b/examples/shared/src/docker.ts index f4161fb..2feca37 100644 --- a/examples/shared/src/docker.ts +++ b/examples/shared/src/docker.ts @@ -9,7 +9,7 @@ const __dirname = path.dirname(fileURLToPath(import.meta.url)); const REPO_ROOT = path.resolve(__dirname, "..", "..", ".."); /** Pre-built Docker image with all agents installed. */ -export const FULL_IMAGE = "rivetdev/sandbox-agent:0.4.2-full"; +export const FULL_IMAGE = "rivetdev/sandbox-agent:0.3.1-full"; export interface DockerSandboxOptions { /** Container port used by sandbox-agent inside Docker. 
*/ @@ -78,11 +78,11 @@ function readClaudeCredentialFiles(): ClaudeCredentialFile[] { const candidates: Array<{ hostPath: string; containerPath: string }> = [ { hostPath: path.join(homeDir, ".claude", ".credentials.json"), - containerPath: ".claude/.credentials.json", + containerPath: "/root/.claude/.credentials.json", }, { hostPath: path.join(homeDir, ".claude-oauth-credentials.json"), - containerPath: ".claude-oauth-credentials.json", + containerPath: "/root/.claude-oauth-credentials.json", }, ]; @@ -180,9 +180,10 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise { const envKey = `SANDBOX_AGENT_CLAUDE_CREDENTIAL_${index}_B64`; bootstrapEnv[envKey] = file.base64Content; - // Use $HOME-relative paths so credentials work regardless of container user - const containerDir = path.posix.dirname(file.containerPath); - return [`mkdir -p "$HOME/${containerDir}"`, `printf %s "$${envKey}" | base64 -d > "$HOME/${file.containerPath}"`]; + return [ + `mkdir -p ${shellSingleQuotedLiteral(path.posix.dirname(file.containerPath))}`, + `printf %s "$${envKey}" | base64 -d > ${shellSingleQuotedLiteral(file.containerPath)}`, + ]; }); setupCommands.unshift(...credentialBootstrapCommands); } @@ -199,9 +200,8 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise `${key}=${value}`), ...Object.entries(bootstrapEnv).map(([key, value]) => `${key}=${value}`)], ExposedPorts: { [`${port}/tcp`]: {} }, HostConfig: { @@ -253,13 +253,10 @@ export async function startDockerSandbox(opts: DockerSandboxOptions): Promise { - await cleanup(); process.exit(0); }; - process.once("SIGINT", signalCleanup); - process.once("SIGTERM", signalCleanup); + process.once("SIGINT", cleanup); + process.once("SIGTERM", cleanup); return { baseUrl, cleanup }; } diff --git a/examples/skills-custom-tool/src/index.ts b/examples/skills-custom-tool/src/index.ts index 490be64..44b2161 100644 --- a/examples/skills-custom-tool/src/index.ts +++ 
b/examples/skills-custom-tool/src/index.ts @@ -36,7 +36,7 @@ await client.setSkillsConfig({ directory: "/", skillName: "random-number" }, { s // Create a session. console.log("Creating session with custom skill..."); -const session = await client.createSession({ agent: detectAgent(), cwd: "/root" }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } }); const sessionId = session.id; console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); console.log(' Try: "generate a random number between 1 and 100"'); diff --git a/examples/skills-custom-tool/tsconfig.json b/examples/skills-custom-tool/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/skills-custom-tool/tsconfig.json +++ b/examples/skills-custom-tool/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/skills/src/index.ts b/examples/skills/src/index.ts index 3087ecc..c04815c 100644 --- a/examples/skills/src/index.ts +++ b/examples/skills/src/index.ts @@ -15,7 +15,7 @@ await client.setSkillsConfig( ); console.log("Creating session..."); -const session = await client.createSession({ agent: detectAgent(), cwd: "/root" }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/root", mcpServers: [] } }); const sessionId = session.id; console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); console.log(' Try: "How do I start sandbox-agent?"'); diff --git a/examples/skills/tsconfig.json b/examples/skills/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/skills/tsconfig.json +++ b/examples/skills/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": 
["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/sprites/package.json b/examples/sprites/package.json deleted file mode 100644 index df808e8..0000000 --- a/examples/sprites/package.json +++ /dev/null @@ -1,20 +0,0 @@ -{ - "name": "@sandbox-agent/example-sprites", - "private": true, - "type": "module", - "scripts": { - "start": "tsx src/index.ts", - "typecheck": "tsc --noEmit" - }, - "dependencies": { - "@fly/sprites": "latest", - "@sandbox-agent/example-shared": "workspace:*", - "sandbox-agent": "workspace:*" - }, - "devDependencies": { - "@types/node": "latest", - "tsx": "latest", - "typescript": "latest", - "vitest": "^3.0.0" - } -} diff --git a/examples/sprites/src/index.ts b/examples/sprites/src/index.ts deleted file mode 100644 index bf95e5d..0000000 --- a/examples/sprites/src/index.ts +++ /dev/null @@ -1,21 +0,0 @@ -import { SandboxAgent } from "sandbox-agent"; -import { sprites } from "sandbox-agent/sprites"; - -const env: Record = {}; -if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; -if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY; - -const client = await SandboxAgent.start({ - sandbox: sprites({ - token: process.env.SPRITES_API_KEY ?? process.env.SPRITE_TOKEN ?? 
process.env.SPRITES_TOKEN, - env, - }), -}); - -console.log(`UI: ${client.inspectorUrl}`); -console.log(await client.getHealth()); - -process.once("SIGINT", async () => { - await client.destroySandbox(); - process.exit(0); -}); diff --git a/examples/sprites/tests/sprites.test.ts b/examples/sprites/tests/sprites.test.ts deleted file mode 100644 index dfd1594..0000000 --- a/examples/sprites/tests/sprites.test.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { describe, it, expect } from "vitest"; -import { SandboxAgent } from "sandbox-agent"; -import { sprites } from "sandbox-agent/sprites"; - -const shouldRun = Boolean(process.env.SPRITES_API_KEY || process.env.SPRITE_TOKEN || process.env.SPRITES_TOKEN); -const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000; - -const testFn = shouldRun ? it : it.skip; - -describe("sprites provider", () => { - testFn( - "starts sandbox-agent and responds to /v1/health", - async () => { - const env: Record<string, string> = {}; - if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; - if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY; - - const sdk = await SandboxAgent.start({ - sandbox: sprites({ - token: process.env.SPRITES_API_KEY ?? process.env.SPRITE_TOKEN ?? 
process.env.SPRITES_TOKEN, - env, - }), - }); - - try { - const health = await sdk.getHealth(); - expect(health.status).toBe("ok"); - } finally { - await sdk.destroySandbox(); - } - }, - timeoutMs, - ); -}); diff --git a/examples/sprites/tsconfig.json b/examples/sprites/tsconfig.json deleted file mode 100644 index ad591c3..0000000 --- a/examples/sprites/tsconfig.json +++ /dev/null @@ -1,17 +0,0 @@ -{ - "compilerOptions": { - "target": "ES2022", - "lib": ["ES2022", "DOM"], - "module": "ESNext", - "moduleResolution": "Bundler", - "allowImportingTsExtensions": true, - "noEmit": true, - "esModuleInterop": true, - "strict": true, - "skipLibCheck": true, - "resolveJsonModule": true, - "types": ["node"] - }, - "include": ["src/**/*"], - "exclude": ["node_modules", "**/*.test.ts"] -} diff --git a/examples/vercel/src/index.ts b/examples/vercel/src/index.ts index 5a83e0c..4a63bfc 100644 --- a/examples/vercel/src/index.ts +++ b/examples/vercel/src/index.ts @@ -1,33 +1,56 @@ +import { Sandbox } from "@vercel/sandbox"; import { SandboxAgent } from "sandbox-agent"; -import { vercel } from "sandbox-agent/vercel"; -import { detectAgent } from "@sandbox-agent/example-shared"; +import { detectAgent, buildInspectorUrl } from "@sandbox-agent/example-shared"; -const env: Record<string, string> = {}; -if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; -if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY; +const envs: Record<string, string> = {}; +if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; +if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY; -const client = await SandboxAgent.start({ - sandbox: vercel({ - create: { - runtime: "node24", - env, - }, - }), +console.log("Creating Vercel sandbox..."); +const sandbox = await Sandbox.create({ + runtime: "node24", + ports: [3000], }); -console.log(`UI: ${client.inspectorUrl}`); +const run = async (cmd: string, args: string[] = []) => 
{ + const result = await sandbox.runCommand({ cmd, args, env: envs }); + if (result.exitCode !== 0) { + const stderr = await result.stderr(); + throw new Error(`Command failed: ${cmd} ${args.join(" ")}\n${stderr}`); + } + return result; +}; -const session = await client.createSession({ - agent: detectAgent(), +console.log("Installing sandbox-agent..."); +await run("sh", ["-c", "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh"]); + +console.log("Installing agents..."); +await run("sandbox-agent", ["install-agent", "claude"]); +await run("sandbox-agent", ["install-agent", "codex"]); + +console.log("Starting server..."); +await sandbox.runCommand({ + cmd: "sandbox-agent", + args: ["server", "--no-token", "--host", "0.0.0.0", "--port", "3000"], + env: envs, + detached: true, }); -session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); -}); +const baseUrl = sandbox.domain(3000); -session.prompt([{ type: "text", text: "Say hello from Vercel in one sentence." 
}]); +console.log("Connecting to server..."); +const client = await SandboxAgent.connect({ baseUrl }); +const session = await client.createSession({ agent: detectAgent(), sessionInit: { cwd: "/home/vercel-sandbox", mcpServers: [] } }); +const sessionId = session.id; -process.once("SIGINT", async () => { - await client.destroySandbox(); +console.log(` UI: ${buildInspectorUrl({ baseUrl, sessionId })}`); +console.log(" Press Ctrl+C to stop."); + +const keepAlive = setInterval(() => {}, 60_000); +const cleanup = async () => { + clearInterval(keepAlive); + await sandbox.stop(); process.exit(0); -}); +}; +process.once("SIGINT", cleanup); +process.once("SIGTERM", cleanup); diff --git a/examples/vercel/src/vercel.ts b/examples/vercel/src/vercel.ts deleted file mode 100644 index 742cd5a..0000000 --- a/examples/vercel/src/vercel.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { SandboxAgent } from "sandbox-agent"; -import { vercel } from "sandbox-agent/vercel"; - -function collectEnvVars(): Record<string, string> { - const env: Record<string, string> = {}; - if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; - if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY; - return env; -} - -function inspectorUrlToBaseUrl(inspectorUrl: string): string { - return inspectorUrl.replace(/\/ui\/$/, ""); -} - -export async function setupVercelSandboxAgent(): Promise<{ - baseUrl: string; - token?: string; - cleanup: () => Promise<void>; -}> { - const client = await SandboxAgent.start({ - sandbox: vercel({ - create: { - runtime: "node24", - env: collectEnvVars(), - }, - }), - }); - - return { - baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl), - cleanup: async () => { - await client.killSandbox(); - }, - }; -} diff --git a/examples/vercel/tsconfig.json b/examples/vercel/tsconfig.json index ad591c3..96ba2fd 100644 --- a/examples/vercel/tsconfig.json +++ b/examples/vercel/tsconfig.json @@ -9,8 +9,7 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - 
"resolveJsonModule": true, - "types": ["node"] + "resolveJsonModule": true }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/foundry/AGENT-HANDOFF.md b/foundry/AGENT-HANDOFF.md deleted file mode 100644 index 20bade7..0000000 --- a/foundry/AGENT-HANDOFF.md +++ /dev/null @@ -1,179 +0,0 @@ -# Foundry Agent Handoff - -## Baseline - -- Repo: `rivet-dev/sandbox-agent` -- Branch: `columbus-v2` -- Last pushed commit: `3174fe73` (`feat(foundry): checkpoint actor and workspace refactor`) -- Progress/spec tracker: [FOUNDRY-CHANGES.md](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/FOUNDRY-CHANGES.md) - -## What is already landed - -These spec slices are already implemented and pushed: - -- Item `1`: backend actor rename `auth-user` -> `user` -- Item `2`: Better Auth mapping comments -- Item `5`: task raw SQL cleanup into migrations -- Item `6`: `history` -> `audit-log` -- Item `7`: default model moved to user-scoped app state -- Item `20`: admin action prefixing -- Item `23`: dead `getTaskEnriched` / `enrichTaskRecord` removal -- Item `25`: `Workbench` -> `Workspace` rename across backend/shared/client/frontend -- Item `26`: branch rename deleted -- Organization realtime was already collapsed to full-snapshot `organizationUpdated` -- Task realtime was already aligned to `taskUpdated` - -## Known blocker - -Spec item `3` is only partially done. The singleton constraint for the Better Auth `user` table is still blocked. - -- File: [foundry/packages/backend/src/actors/user/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/db/schema.ts) -- Reason: Better Auth still depends on external string `user.id`, so a literal singleton `CHECK (id = 1)` on that table is not a safe mechanical change. - -## Important current state - -There are uncommitted edits on top of the pushed checkpoint. 
Another agent should start from the current worktree, not just `origin/columbus-v2`. - -Current dirty files: - -- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) -- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) -- [foundry/packages/backend/src/actors/repository/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/actions.ts) -- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) -- [foundry/packages/client/src/mock/backend-client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/mock/backend-client.ts) - -These files are the current hot path for the unfinished structural work. 
- -## What is partially in place but not finished - -### User-owned task UI state - -The user actor already has the schema and CRUD surface for per-user task/session UI state: - -- [foundry/packages/backend/src/actors/user/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/db/schema.ts) - `user_task_state` -- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) - `getTaskState`, `upsertTaskState`, `deleteTaskState` - -But the task actor and UI are still reading/writing the old task-global fields: - -- [foundry/packages/backend/src/actors/task/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/db/schema.ts) - still contains `task_runtime.active_session_id` and session `unread` / `draft_*` -- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) - still derives unread/draft/active-session from task-local rows -- [foundry/packages/frontend/src/components/mock-layout.tsx](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout.tsx) - still treats `activeSessionId` as frontend-local and uses task-level unread/draft state - -So items `21`, `22`, `24`, and part of `19` are only half-done. 
- -### Coordinator ownership - -The current architecture still violates the intended coordinator pattern: - -- Organization still owns `taskLookup` and `taskSummaries` - - [foundry/packages/backend/src/actors/organization/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/db/schema.ts) -- Organization still resolves `taskId -> repoId` - - [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) -- Task still pushes summary updates to organization instead of repository - - [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) -- Repository still does not own a `tasks` projection table yet - - [foundry/packages/backend/src/actors/repository/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/db/schema.ts) - -So items `9`, `13`, and `15` are still open. - -### Queue-only mutations - -Task actor workspace commands already go through queue sends. 
Other actors still do not fully follow the queue-only mutation rule: - -- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) -- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) -- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) -- [foundry/packages/backend/src/actors/organization/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/app-shell.ts) - -So items `4`, `10`, and `11` are still open. - -### Dynamic model/agent data - -The frontend/client still hardcode model groups: - -- [foundry/packages/frontend/src/components/mock-layout/view-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout/view-model.ts) -- [foundry/packages/client/src/workspace-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/workspace-model.ts) -- [foundry/packages/shared/src/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/workspace.ts) - `WorkspaceModelId` is still a hardcoded union - -The repo already has the API source of truth available through the TypeScript SDK: - -- [sdks/typescript/src/client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/sdks/typescript/src/client.ts) - `SandboxAgent.listAgents({ config: true })` -- [server/packages/sandbox-agent/src/router.rs](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/server/packages/sandbox-agent/src/router.rs) - `/v1/agents` -- 
[server/packages/sandbox-agent/src/router/support.rs](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/server/packages/sandbox-agent/src/router/support.rs) - `fallback_config_options` - -So item `8` is still open. - -### GitHub sync chunking/progress - -GitHub data sync is still a delete-and-replace flow: - -- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) - `replaceRepositories`, `replaceBranches`, `replaceMembers`, `replacePullRequests`, and full-sync flow -- [foundry/packages/backend/src/actors/github-data/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/db/schema.ts) - no generation/progress columns yet -- [foundry/packages/shared/src/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/app-shell.ts) - no structured sync progress field yet - -So item `16` is still open. - -## Recommended next order - -If another agent picks this up, this is the safest order: - -1. Finish items `21`, `22`, `24`, `19` together. - Reason: user-owned task UI state is already half-wired, and task schema cleanup depends on the same files. - -2. Finish items `9`, `13`, `15` together. - Reason: coordinator ownership, repo-owned task projections, and PR/task unification are the same refactor seam. - -3. Finish item `16`. - Reason: GitHub sync chunking is mostly isolated to `github-data` plus app-shell/shared snapshot wiring. - -4. Finish item `8`. - Reason: dynamic model/agent data is largely independent once user default model is already user-scoped. - -5. Finish items `4`, `10`, `11`, `12`, `18`, final event audit. - -6. Do item `17` last. 
- -## Concrete file hotspots for the next agent - -Backend: - -- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) -- [foundry/packages/backend/src/actors/task/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/db/schema.ts) -- [foundry/packages/backend/src/actors/task/workflow/common.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/common.ts) -- [foundry/packages/backend/src/actors/task/workflow/commands.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/commands.ts) -- [foundry/packages/backend/src/actors/task/workflow/init.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/init.ts) -- [foundry/packages/backend/src/actors/repository/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/actions.ts) -- [foundry/packages/backend/src/actors/repository/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/db/schema.ts) -- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) -- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) -- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) - -Shared/client/frontend: - -- 
[foundry/packages/shared/src/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/workspace.ts) -- [foundry/packages/shared/src/contracts.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/contracts.ts) -- [foundry/packages/shared/src/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/app-shell.ts) -- [foundry/packages/client/src/backend-client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/backend-client.ts) -- [foundry/packages/client/src/workspace-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/workspace-model.ts) -- [foundry/packages/frontend/src/components/mock-layout.tsx](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout.tsx) -- [foundry/packages/frontend/src/components/mock-layout/view-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx) -- [foundry/packages/frontend/src/features/tasks/status.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/features/tasks/status.ts) - -## Notes that matter - -- The pushed checkpoint is useful, but it is not the full current state. There are uncommitted edits in the hot-path backend files listed above. -- The current tree already contains a partially added `user_task_state` path. Do not duplicate that work; finish the migration by removing the old task-owned fields and rewiring readers/writers. -- The current task actor still reads mutable fields from `c.state` such as `repoRemote`, `branchName`, `title`, `task`, `sandboxProviderId`, and `agentType`. That is part of item `19`. -- The current frontend still synthesizes PR-only rows into fake tasks. 
That should go away as part of repo-owned task projection / PR unification. diff --git a/foundry/CLAUDE.md b/foundry/CLAUDE.md index 2d9bcbb..8af6c92 100644 --- a/foundry/CLAUDE.md +++ b/foundry/CLAUDE.md @@ -12,10 +12,10 @@ Use TypeScript for all source code. Use `pnpm` workspaces and Turborepo. -- Repository root uses `pnpm-workspace.yaml` and `turbo.json`. +- Workspace root uses `pnpm-workspace.yaml` and `turbo.json`. - Packages live in `packages/*`. - `core` is renamed to `shared`. -- `packages/cli` is disabled and excluded from active monorepo validation. +- `packages/cli` is disabled and excluded from active workspace validation. - Integrations and providers live under `packages/backend/src/{integrations,providers}`. ## CLI Status @@ -23,14 +23,14 @@ Use `pnpm` workspaces and Turborepo. - `packages/cli` is fully disabled for active development. - Do not implement new behavior in `packages/cli` unless explicitly requested. - Frontend is the primary product surface; prioritize `packages/frontend` + supporting `packages/client`/`packages/backend`. -- Monorepo `build`, `typecheck`, and `test` intentionally exclude `@sandbox-agent/foundry-cli`. -- `pnpm-workspace.yaml` excludes `packages/cli` from monorepo package resolution. +- Workspace `build`, `typecheck`, and `test` intentionally exclude `@sandbox-agent/foundry-cli`. +- `pnpm-workspace.yaml` excludes `packages/cli` from workspace package resolution. ## Common Commands - Foundry is the canonical name for this product tree. Do not introduce or preserve legacy pre-Foundry naming in code, docs, commands, or runtime paths. 
- Install deps: `pnpm install` -- Full active-monorepo validation: `pnpm -w typecheck`, `pnpm -w build`, `pnpm -w test` +- Full active-workspace validation: `pnpm -w typecheck`, `pnpm -w build`, `pnpm -w test` - Start the full dev stack (real backend + frontend): `just foundry-dev` — frontend on **port 4173**, backend on **port 7741** (Docker via `compose.dev.yaml`) - Start the mock frontend stack (no backend): `just foundry-mock` — mock frontend on **port 4174** (Docker via `compose.mock.yaml`) - Start the local production-build preview stack: `just foundry-preview` @@ -56,47 +56,12 @@ Use `pnpm` workspaces and Turborepo. - mock frontend changes: `just foundry-mock` or restart with `just foundry-mock-down && just foundry-mock` - local frontend-only work outside Docker: restart `pnpm --filter @sandbox-agent/foundry-frontend dev` or `just foundry-dev-mock` as appropriate - The backend does **not** hot reload. Bun's `--hot` flag causes the server to re-bind on a different port (e.g. 6421 instead of 6420), breaking all client connections while the container still exposes the original port. After backend code changes, restart the backend container: `just foundry-dev-down && just foundry-dev`. -- The dev server has debug logging enabled by default (`RIVET_LOG_LEVEL=debug`, `FOUNDRY_LOG_LEVEL=debug`) via `compose.dev.yaml`. Error stacks and timestamps are also enabled. -- The frontend client uses JSON encoding for RivetKit in development (`import.meta.env.DEV`) for easier debugging. Production uses the default encoding. - -## Foundry Base Sandbox Image - -Local Docker sandboxes use the `rivetdev/sandbox-agent:foundry-base-latest` image by default. This image extends the sandbox-agent runtime with sudo, git, neovim, gh, node, bun, chromium, and agent-browser. 
- -- **Dockerfile:** `docker/foundry-base.Dockerfile` (builds sandbox-agent from source, x86_64 only) -- **Publish script:** `scripts/publish-foundry-base.sh` (builds and pushes to Docker Hub `rivetdev/sandbox-agent`) -- **Tags:** `foundry-base-TZ` (timestamped) + `foundry-base-latest` (rolling) -- **Build from repo root:** `./foundry/scripts/publish-foundry-base.sh` (or `--dry-run` to skip push) -- **Override image in dev:** set `HF_LOCAL_SANDBOX_IMAGE` in `foundry/.env` or environment. The env var is passed through `compose.dev.yaml` to the backend. -- **Resolution order:** `config.sandboxProviders.local.image` (config.toml) > `HF_LOCAL_SANDBOX_IMAGE` (env var) > `DEFAULT_LOCAL_SANDBOX_IMAGE` constant in `packages/backend/src/actors/sandbox/index.ts`. -- The image must be built with `--platform linux/amd64`. The Rust build is memory-intensive; Docker Desktop needs at least 8GB RAM allocated. -- When updating the base image contents (new system packages, agent versions), rebuild and push with the publish script, then update the `foundry-base-latest` tag. - -## Production GitHub App + OAuth App - -Foundry uses two separate GitHub entities in production: - -- **OAuth App** (`GITHUB_CLIENT_ID` / `GITHUB_CLIENT_SECRET`) — handles "Sign in with GitHub" via Better Auth. This is a standard OAuth App. -- **GitHub App** (`GITHUB_APP_ID` / `GITHUB_APP_CLIENT_ID` / `GITHUB_APP_CLIENT_SECRET` / `GITHUB_APP_PRIVATE_KEY`) — handles webhooks, installation tokens for repo access, and GitHub API sync (repos, PRs). Must be manually installed on each org. - -Key env vars and where they connect: - -- `GITHUB_REDIRECT_URI` — OAuth callback, must point to `https://api.sandboxagent.dev/v1/auth/callback/github` -- `GITHUB_WEBHOOK_SECRET` — must match the secret configured on the GitHub App's Webhook settings page exactly. Mismatches cause silent 500s on webhook delivery (signature verification fails inside the actor, surfaced as a generic RivetKit `internal_error`). 
-- `BETTER_AUTH_URL` — must be the **API** URL (`https://api.sandboxagent.dev`), not the frontend URL. Better Auth uses this internally for sign-out and session management calls. -- `APP_URL` — the **frontend** URL (`https://foundry.sandboxagent.dev`). - -Troubleshooting: - -- **"GitHub App not installed"** — The GitHub App must be manually installed on each org. Sign-in does not auto-install it. Go to the GitHub App settings → Install App tab. The sign-in flow can only detect existing installations, not create them. -- **Webhooks not arriving** — Check the GitHub App → Advanced tab for delivery history. If deliveries show 500, the webhook secret likely doesn't match `GITHUB_WEBHOOK_SECRET`. Test with: `echo -n '{"test":true}' | openssl dgst -sha256 -hmac "$SECRET"` and curl the endpoint with the computed signature. -- **Deleting all actors wipes GitHub App installation state.** After a full actor reset, you must trigger a webhook (e.g. redeliver from GitHub App Advanced tab, or re-install the app) to repopulate installation records. ## Railway Logs -- Production Foundry Railway logs can be read from a linked checkout with `railway logs --deployment --lines 200` or `railway logs --deployment --lines 200`. +- Production Foundry Railway logs can be read from a linked workspace with `railway logs --deployment --lines 200`. - Production deploys should go through `git push` to the deployment branch/workflow. Do not use `railway up` for Foundry deploys.
-- If Railway logs fail because the checkout is not linked to the correct Railway project/service/environment, run: +- If Railway logs fail because the workspace is not linked to the correct project/service/environment, run: `railway link --project 33e3e2df-32c5-41c5-a4af-dca8654acb1d --environment cf387142-61fd-4668-8cf7-b3559e0983cb --service 91c7e450-d6d2-481a-b2a4-0a916f4160fc` - That links this directory to the `sandbox-agent` project, `production` environment, and `foundry-api` service. - Production proxy chain: `api.sandboxagent.dev` routes through Cloudflare → Fastly/Varnish → Railway. When debugging request duplication, timeouts, or retry behavior, check headers like `cf-ray`, `x-varnish`, `x-railway-edge`, and `cdn-loop` to identify which layer is involved. @@ -108,14 +73,13 @@ Troubleshooting: - All backend interaction (actor calls, metadata/health checks, backend HTTP endpoint access) must go through the dedicated client library in `packages/client`. - Outside `packages/client`, do not call backend endpoints directly (for example `fetch(.../v1/rivet...)`), except in black-box E2E tests that intentionally exercise raw transport behavior. - GUI state should update in realtime (no manual refresh buttons). Prefer RivetKit push reactivity and actor-driven events; do not add polling/refetch for normal product flows. -- Keep the mock workspace types and mock client in `packages/shared` + `packages/client` up to date with the frontend contract. The mock is the UI testing reference implementation while backend functionality catches up. +- Keep the mock workbench types and mock client in `packages/shared` + `packages/client` up to date with the frontend contract. The mock is the UI testing reference implementation while backend functionality catches up. - Keep frontend route/state coverage current in code and tests; there is no separate page-inventory doc to maintain. 
- If Foundry uses a shared component from `@sandbox-agent/react`, make changes in `sdks/react` instead of copying or forking that component into Foundry. - When changing shared React components in `sdks/react` for Foundry, verify they still work in the Sandbox Agent Inspector before finishing. -- When making UI changes, verify the live flow with the Chrome DevTools MCP or `agent-browser`, take screenshots of the updated UI, and offer to open those screenshots in Preview when you finish. +- When making UI changes, verify the live flow with `agent-browser`, take screenshots of the updated UI, and offer to open those screenshots in Preview when you finish. - When asked for screenshots, capture all relevant affected screens and modal states, not just a single viewport. Include empty, populated, success, and blocked/error states when they are part of the changed flow. - If a screenshot catches a transition frame, blank modal, or otherwise misleading state, retake it before reporting it. -- When verifying UI in the browser, attempt to sign in by navigating to `/signin` and clicking "Continue with GitHub". If the browser lands on the GitHub login page (github.com/login) and you don't have credentials, stop and ask the user to complete the sign-in. Do not assume the session is invalid just because you see the Foundry sign-in page — always attempt the OAuth flow first. ## Realtime Data Architecture @@ -132,19 +96,19 @@ Do not use polling (`refetchInterval`), empty "go re-fetch" broadcast events, or ### Materialized state in coordinator actors -- **Organization actor** materializes sidebar-level data in its own SQLite: repo catalog, task summaries (title, status, branch, PR, updatedAt), repo summaries (overview/branch state), and session summaries (id, name, status, unread, model — no transcript). Task actors push summary changes to the organization actor when they mutate. The organization actor broadcasts the updated entity to connected clients. 
`getOrganizationSummary` reads from local tables only — no fan-out to child actors. +- **Workspace actor** materializes sidebar-level data in its own SQLite: repo catalog, task summaries (title, status, branch, PR, updatedAt), repo summaries (overview/branch state), and session summaries (id, name, status, unread, model — no transcript). Task actors push summary changes to the workspace actor when they mutate. The workspace actor broadcasts the updated entity to connected clients. `getWorkspaceSummary` reads from local tables only — no fan-out to child actors. - **Task actor** materializes its own detail state (session summaries, sandbox info, diffs, file tree). `getTaskDetail` reads from the task actor's own SQLite. The task actor broadcasts updates directly to clients connected to it. -- **Session data** lives on the task actor but is a separate subscription topic. The task topic includes `sessions_summary` (list without content). The `session` topic provides full transcript and draft state. Clients subscribe to the `session` topic for whichever session is active, and filter `sessionUpdated` events by session ID (ignoring events for other sessions on the same actor). -- There is no fan-out on the read path. The organization actor owns all task summaries locally. +- **Session data** lives on the task actor but is a separate subscription topic. The task topic includes `sessions_summary` (list without content). The `session` topic provides full transcript and draft state. Clients subscribe to the `session` topic for whichever session tab is active, and filter `sessionUpdated` events by session ID (ignoring events for other sessions on the same actor). +- The expensive fan-out (querying every project/task actor) only exists as a background reconciliation/rebuild path, never on the hot read path. 
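The summary-push flow above can be sketched in miniature. All names here (`TaskSummary`, `WorkspaceActorSketch`, `receiveTaskSummary`) are illustrative stand-ins for the pattern, not the real actor API:

```typescript
// Illustrative sketch of the materialized-summary push pattern.
// Names are hypothetical; the real actors use per-instance SQLite,
// but the ownership and data flow are the same.
interface TaskSummary {
  taskId: string;
  title: string;
  status: string;
  updatedAt: number;
}

class WorkspaceActorSketch {
  // Stand-in for the workspace actor's local task-summaries table.
  private taskSummaries = new Map<string, TaskSummary>();
  // Stand-in for `workspaceUpdated` broadcasts to connected clients.
  readonly broadcasts: TaskSummary[] = [];

  // Called by a task actor after it mutates. The workspace actor
  // upserts its local projection and rebroadcasts the full entity.
  receiveTaskSummary(summary: TaskSummary): void {
    this.taskSummaries.set(summary.taskId, summary);
    this.broadcasts.push(summary);
  }

  // Read path: local tables only — never a fan-out to child actors.
  getWorkspaceSummary(): TaskSummary[] {
    return [...this.taskSummaries.values()];
  }
}
```

The key property is that `getWorkspaceSummary` never calls into child actors; it only reads the locally materialized rows, and each broadcast carries the full new state of the changed entity.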
-### Subscription manager +### Interest manager -The subscription manager (`packages/client`) is a global singleton that manages WebSocket connections, cached state, and subscriptions for all topics. It: +The interest manager (`packages/client`) is a global singleton that manages WebSocket connections, cached state, and subscriptions for all topics. It: - **Deduplicates** — multiple subscribers to the same topic share one connection and one cached state. - **Grace period (30s)** — when the last subscriber leaves, the connection and state stay alive for 30 seconds before teardown. This keeps data warm for back-navigation and prevents thrashing. -- **Exposes a single hook** — `useSubscription(topicKey, params)` returns `{ data, status, error }`. Null params = no subscription (conditional subscription). -- **Shared harness, separate implementations** — the `SubscriptionManager` interface is shared between mock and remote implementations. The mock implementation uses in-memory state. The remote implementation uses WebSocket connections. The API/client exposure is identical for both. +- **Exposes a single hook** — `useInterest(topicKey, params)` returns `{ data, status, error }`. Null params = no subscription (conditional interest). +- **Shared harness, separate implementations** — the `InterestManager` interface is shared between mock and remote implementations. The mock implementation uses in-memory state. The remote implementation uses WebSocket connections. The API/client exposure is identical for both. 
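The dedup and grace-period behavior above can be sketched as an in-memory stand-in. This is illustrative only; the real interest manager lives in `packages/client` and manages WebSocket connections rather than a plain set:

```typescript
// In-memory stand-in for the interest manager's dedup + 30s grace
// period. `InterestManagerSketch` is a hypothetical name for the sketch.
type Teardown = () => void;

class InterestManagerSketch {
  private refCounts = new Map<string, number>();
  private pendingTeardowns = new Map<string, ReturnType<typeof setTimeout>>();
  // Stand-in for live WebSocket connections, one per topic.
  readonly openConnections = new Set<string>();

  subscribe(topicKey: string): Teardown {
    // Back-navigation within the grace period cancels the teardown.
    const pending = this.pendingTeardowns.get(topicKey);
    if (pending !== undefined) {
      clearTimeout(pending);
      this.pendingTeardowns.delete(topicKey);
    }
    // Deduplicate: all subscribers to a topic share one connection.
    this.openConnections.add(topicKey);
    this.refCounts.set(topicKey, (this.refCounts.get(topicKey) ?? 0) + 1);

    return () => {
      const remaining = (this.refCounts.get(topicKey) ?? 1) - 1;
      this.refCounts.set(topicKey, remaining);
      if (remaining === 0) {
        // Last subscriber left: keep the connection warm for 30s.
        const timer = setTimeout(() => {
          this.openConnections.delete(topicKey);
          this.pendingTeardowns.delete(topicKey);
        }, 30_000);
        this.pendingTeardowns.set(topicKey, timer);
      }
    };
  }
}
```

`useInterest(topicKey, params)` sits on top of exactly this lifecycle: mounting subscribes, unmounting runs the teardown, and null params skip the subscribe call entirely.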
### Topics @@ -152,48 +116,23 @@ Each topic maps to one actor connection and one event stream: | Topic | Actor | Event | Data | |---|---|---|---| -| `app` | Organization `"app"` | `appUpdated` | Auth, orgs, onboarding | -| `organization` | Organization `{organizationId}` | `organizationUpdated` | Repo catalog, task summaries, repo summaries | -| `task` | Task `{organizationId, repoId, taskId}` | `taskUpdated` | Session summaries, sandbox info, diffs, file tree | -| `session` | Task `{organizationId, repoId, taskId}` (filtered by sessionId) | `sessionUpdated` | Transcript, draft state | +| `app` | Workspace `"app"` | `appUpdated` | Auth, orgs, onboarding | +| `workspace` | Workspace `{workspaceId}` | `workspaceUpdated` | Repo catalog, task summaries, repo summaries | +| `task` | Task `{workspaceId, repoId, taskId}` | `taskUpdated` | Session summaries, sandbox info, diffs, file tree | +| `session` | Task `{workspaceId, repoId, taskId}` (filtered by sessionId) | `sessionUpdated` | Transcript, draft state | | `sandboxProcesses` | SandboxInstance | `processesUpdated` | Process list | -The client subscribes to `app` always, `organization` when entering an organization, `task` when viewing a task, and `session` when viewing a specific session. At most 4 actor connections at a time (app + organization + task + sandbox if terminal is open). The `session` topic reuses the task actor connection and filters by session ID. +The client subscribes to `app` always, `workspace` when entering a workspace, `task` when viewing a task, and `session` when viewing a specific session tab. At most 4 actor connections at a time (app + workspace + task + sandbox if terminal is open). The `session` topic reuses the task actor connection and filters by session ID. ### Rules - Do not add `useQuery` with `refetchInterval` for data that should be push-based. - Do not broadcast empty notification events. Events must carry the full new state of the changed entity. 
- Do not re-fetch full snapshots after mutations. The mutation triggers a server-side broadcast with the new entity state; the client replaces it in local state. -- All event subscriptions go through the subscription manager. Do not create ad-hoc `handle.connect()` + `conn.on()` patterns. -- Backend mutations that affect sidebar data (task title, status, branch, PR state) must push the updated summary to the parent organization actor, which broadcasts to organization subscribers. +- All event subscriptions go through the interest manager. Do not create ad-hoc `handle.connect()` + `conn.on()` patterns. +- Backend mutations that affect sidebar data (task title, status, branch, PR state) must push the updated summary to the parent workspace actor, which broadcasts to workspace subscribers. - Comment architecture-related code: add doc comments explaining the materialized state pattern, why deltas flow the way they do, and the relationship between parent/child actor broadcasts. New contributors should understand the data flow from comments alone. -## Sandbox Architecture - -- Structurally, the system supports multiple sandboxes per task, but in practice there is exactly one active sandbox per task. Design features assuming one sandbox per task. If multi-sandbox is needed in the future, extend at that time. -- Each task has a **primary user** (owner) whose GitHub OAuth credentials are injected into the sandbox for git operations. The owner swaps when a different user sends a message. See `.context/proposal-task-owner-git-auth.md` for the full design. -- **Security: OAuth token scope.** The user's GitHub OAuth token has `repo` scope, granting full control of all private repositories the user has access to. When the user is the active task owner, their token is injected into the sandbox. This means the agent can read/write ANY repo the user has access to, not just the task's target repo. 
This is the standard trade-off for OAuth-based git integrations (same as GitHub Codespaces, Gitpod). The user consents to `repo` scope at sign-in time. Credential files in the sandbox are `chmod 600` and overwritten on owner swap. -- All git operations in the sandbox must be auto-authenticated. Never configure git to prompt for credentials (no interactive `GIT_ASKPASS` prompts). Use a credential store file that is pre-populated with the active owner's token. -- All git operation errors (push 401, clone failure, branch protection rejection) must surface in the UI with actionable context. Never silently swallow git errors. - -## Git State Policy - -- The backend stores zero git state. No local clones, no refs, no working trees, and no git-spice. -- Repository metadata (branches, default branch, pull requests) comes from GitHub API data and webhook events already flowing into the system. -- All git operations that require a working tree run inside the task's sandbox via `executeInSandbox()`. -- Do not add backend git clone paths, `git fetch`, `git for-each-ref`, or direct backend git CLI calls. If you need git data, either read stored GitHub metadata or run the command inside a sandbox. -- The `BackendDriver` has no `GitDriver` or `StackDriver`. Only `GithubDriver` and `TmuxDriver` remain. - -## React Hook Dependency Safety - -- **Never use unstable references as `useEffect`/`useMemo`/`useCallback` dependencies.** React compares dependencies by reference, not value. Expressions like `?? []`, `?? {}`, `.map(...)`, `.filter(...)`, or object/array literals create new references every render, causing infinite re-render loops when used as dependencies. -- If the upstream value may be `undefined`/`null` and you need a fallback, either: - - Use the raw upstream value as the dependency and apply the fallback inside the effect body: `useEffect(() => { doThing(value ?? []); }, [value]);` - - Derive a stable primitive key: `const key = JSON.stringify(value ?? 
[]);` then depend on `key` - - Memoize: `const stable = useMemo(() => value ?? [], [value]);` -- When reviewing code, treat any `?? []`, `?? {}`, or inline `.map()/.filter()` in a dependency array as a bug. - ## UI System - Foundry's base UI system is `BaseUI` with `Styletron`, plus Foundry-specific theme/tokens on top. Treat that as the default UI foundation. @@ -204,7 +143,6 @@ The client subscribes to `app` always, `organization` when entering an organizat - If a requested UI cannot be implemented cleanly with an existing `BaseUI` component, stop and ask the user whether they are sure they want to diverge from the system. - In that case, recommend the closest existing `BaseUI` components or compositions that could satisfy the need before proposing custom UI work. - Only introduce custom UI primitives when `BaseUI` and existing Foundry patterns are not sufficient, or when the user explicitly confirms they want the divergence. -- **Styletron atomic CSS rule:** Never mix CSS shorthand properties with their longhand equivalents in the same style object (including nested pseudo-selectors like `:hover`), or in a base styled component whose consumers override with longhand via `$style`. This includes `padding`/`paddingLeft`, `margin`/`marginTop`, `background`/`backgroundColor`, `border`/`borderLeft`, etc. Styletron generates independent atomic classes for shorthand and longhand, so they conflict unpredictably. Use `backgroundColor: "transparent"` instead of `background: "none"` for button resets. Always use longhand properties when any side may be overridden individually. ## Runtime Policy @@ -218,7 +156,6 @@ The client subscribes to `app` always, `organization` when entering an organizat - If the system reaches an unexpected state, raise an explicit error with actionable context. - Do not fail silently, swallow errors, or auto-ignore inconsistent data. - Prefer fail-fast behavior over hidden degradation when correctness is uncertain. 
-- **Never use bare `catch {}` or `catch { }` blocks.** Every catch must at minimum log the error with `logActorWarning` or `console.warn`. Silent catches hide bugs and make debugging impossible. If a catch is intentionally degrading (e.g. returning empty data when a sandbox is expired), it must still log so operators can see what happened. Use `catch (error) { logActorWarning(..., { error: resolveErrorMessage(error) }); }` or equivalent. ## RivetKit Dependency Policy @@ -228,10 +165,20 @@ For all Rivet/RivetKit implementation: 2. SQLite is **per actor instance** (per actor key), not a shared backend-global database: - Each actor instance gets its own SQLite DB. - Schema design should assume a single actor instance owns the entire DB. - - Do not add `organizationId`/`repoId`/`taskId` columns just to "namespace" rows for a given actor instance; use actor state and/or the actor key instead. - - Example: the `task` actor instance already represents `(organizationId, repoId, taskId)`, so its SQLite tables should not need those columns for primary keys. + - Do not add `workspaceId`/`repoId`/`taskId` columns just to "namespace" rows for a given actor instance; use actor state and/or the actor key instead. + - Example: the `task` actor instance already represents `(workspaceId, repoId, taskId)`, so its SQLite tables should not need those columns for primary keys. 3. Do not use backend-global SQLite singletons; database access must go through actor `db` providers (`c.db`). -4. The default dependency source for RivetKit is the published `rivetkit` package so monorepo installs and CI remain self-contained. +4. The default dependency source for RivetKit is the published `rivetkit` package so workspace installs and CI remain self-contained. +5. When working on coordinated RivetKit changes, you may temporarily relink to a local checkout instead of the published package. 
+ - Dedicated local checkout for this workspace: `/Users/nathan/conductor/workspaces/task/rivet-checkout` + - Preferred local link target: `../rivet-checkout/rivetkit-typescript/packages/rivetkit` + - Sub-packages (`@rivetkit/sqlite-vfs`, etc.) resolve transitively from the RivetKit workspace when using the local checkout. +6. Before using a local checkout, build RivetKit in the rivet repo: + ```bash + cd ../rivet-checkout/rivetkit-typescript + pnpm install + pnpm build -F rivetkit + ``` ## Rivet Routing @@ -239,66 +186,36 @@ For all Rivet/RivetKit implementation: - Do not add an extra proxy or manager-specific route layer in the backend. - Let RivetKit own metadata/public endpoint behavior for `/v1/rivet`. -## Organization + Actor Rules +## Workspace + Actor Rules -- Everything is scoped to an organization. -- Organization resolution order: `--organization` flag -> config default -> `"default"`. -- `ControlPlaneActor` is replaced by `OrganizationActor` (organization coordinator). -- Every actor key must be prefixed with organization namespace (`["org", organizationId, ...]`). +- Everything is scoped to a workspace. +- Workspace resolution order: `--workspace` flag -> config default -> `"default"`. +- `ControlPlaneActor` is replaced by `WorkspaceActor` (workspace coordinator). +- Every actor key must be prefixed with workspace namespace (`["ws", workspaceId, ...]`). - CLI/TUI/GUI must use `@sandbox-agent/foundry-client` (`packages/client`) for backend access; `rivetkit/client` imports are only allowed inside `packages/client`. - Do not add custom backend REST endpoints (no `/v1/*` shim layer). - We own the sandbox-agent project; treat sandbox-agent defects as first-party bugs and fix them instead of working around them. - Keep strict single-writer ownership: each table/row has exactly one actor writer. -- Parent actors (`organization`, `task`, `sandbox-instance`) use command-only loops with no timeout. 
+- Parent actors (`workspace`, `project`, `task`, `history`, `sandbox-instance`) use command-only loops with no timeout. - Periodic syncing lives in dedicated child actors with one timeout cadence each. -- **Task actors must be created lazily** — never during sync or bulk operations. PR sync writes virtual entries to the org's local `taskIndex`/`taskSummaries` tables. The task actor is created on first user interaction via `getOrCreate`. See `packages/backend/CLAUDE.md` "Lazy Task Actor Creation" for details. - Do not build blocking flows that wait on external systems to become ready or complete. Prefer push-based progression driven by actor messages, events, webhooks, or queue/workflow state changes. - Use workflows/background commands for any repo sync, sandbox provisioning, agent install, branch restack/rebase, or other multi-step external work. Do not keep user-facing actions/requests open while that work runs. - `send` policy: always `await` the `send(...)` call itself so enqueue failures surface immediately, but default to `wait: false`. +- Only use `send(..., { wait: true })` for short, bounded local mutations (e.g. a DB write that returns a result the caller needs). Never use `wait: true` for operations that depend on external readiness, polling actors, provider setup, repo/network I/O, sandbox sessions, GitHub API calls, or long-running queue drains. - Never self-send with `wait: true` from inside a workflow handler — the workflow processes one message at a time, so the handler would deadlock waiting for the new message to be dequeued. +- When an action is void-returning and triggers external work, use `wait: false` and let the UI react to state changes pushed by the workflow. +- Request/action contract: wait only until the minimum resource needed for the client's next step exists. Example: task creation may wait for task actor creation/identity, but not for sandbox provisioning or session bootstrap. - Read paths must not force refresh/sync work inline. 
Serve the latest cached projection, mark staleness explicitly, and trigger background refresh separately when needed. - If a workflow needs to resume after some external work completes, model that as workflow state plus follow-up messages/events instead of holding the original request open. - No retries: never add retry loops (`withRetries`, `setTimeout` retry, exponential backoff) anywhere in the codebase. If an operation fails, surface the error immediately. If a dependency is not ready yet, model that explicitly with workflow state and resume from a push/event instead of polling or retry loops. -- Never throw errors that expect the caller to retry (e.g. `throw new Error("... retry shortly")`). If a dependency is not ready, write the current state to the DB with an appropriate pending status, enqueue the async work, and return successfully. Let the client observe the pending → ready transition via push events. -- Action return contract: every action that creates a resource must write the resource record to the DB before returning, so the client can immediately query/render it. The record may have a pending status, but it must exist. Never return an ID that doesn't yet have a corresponding DB row. - -### Action handler responsiveness - -Action handlers must return fast. The pattern: - -1. **Creating an entity** — `wait: true` is fine. Do the DB write, return the ID/record. The caller needs the ID to proceed. The record may have a pending status; that's expected. -2. **Enqueuing work** (sending a message, triggering a sandbox operation, starting a sync) — `wait: false`. Write any precondition state to the DB synchronously, enqueue the work, and return. The client observes progress via push events on the relevant topic (session status, task status, etc.). -3. **Validating preconditions** — check state synchronously in the action handler *before* enqueuing. If a precondition isn't met (e.g. session not ready, task not initialized), throw an error immediately. 
Do not implicitly provision missing dependencies or poll for readiness inside the action handler. It is the client's responsibility to ensure preconditions are met before calling the action. - -Examples: -- `createTask` → `wait: true` (returns `{ taskId }`), then enqueue provisioning with `wait: false`. Client sees task appear immediately with pending status, observes `ready` via organization events. -- `sendWorkspaceMessage` → validate session is `ready` (throw if not), enqueue with `wait: false`. Client observes session transition to `running` → `idle` via session events. -- `createWorkspaceSession` → `wait: true` (returns `{ sessionId }`), enqueue sandbox provisioning with `wait: false`. Client observes `pending_provision` → `ready` via task events. - -Never use `wait: true` for operations that depend on external readiness, sandbox I/O, agent responses, git network operations, polling loops, or long-running queue drains. Never hold an action open while waiting for an external system to become ready — that is a polling/retry loop in disguise. - -### Timeout policy - -All `wait: true` sends must have an explicit `timeout`. Maximum timeout for any `wait: true` send is **10 seconds** (`10_000`). If an operation cannot reliably complete within 10 seconds, it must be restructured: write the initial record to the DB, return it to the caller, and continue the work asynchronously with `wait: false`. The client observes completion via push events. - -`wait: false` sends do not need a timeout (the enqueue is instant; the work runs in the workflow loop with its own step-level timeouts). - -### Task creation: resolve metadata before creating the actor - -When creating a task, all deterministic metadata (title, branch name) must be resolved synchronously in the organization actor *before* the task actor is created. The task actor must never be created with null `branchName` or `title`. 
- -- Title is derived from the task description via `deriveFallbackTitle()` — pure string manipulation, no external I/O. -- Branch name is derived from the title via `sanitizeBranchName()` + conflict checking against the repository's task index. -- The organization actor owns the task index and reads GitHub-backed default branch metadata from the github-data actor. Resolve the branch name there without local git fetches. -- Do not defer naming to a background provision workflow. Do not poll for names to become available. -- The `onBranch` path (attaching to an existing branch) and the new-task path should both produce a fully-named task record on return. - Actor handle policy: - Prefer explicit `get` or explicit `create` based on workflow intent; do not default to `getOrCreate`. - Use `get`/`getForId` when the actor is expected to already exist; if missing, surface an explicit `Actor not found` error with recovery context. - Use create semantics only on explicit provisioning/create paths where creating a new actor instance is intended. - `getOrCreate` is a last resort for create paths when an explicit create API is unavailable; never use it in read/command paths. - For long-lived cross-actor links (for example sandbox/session runtime access), persist actor identity (`actorId`) and keep a fallback lookup path by actor id. -- RivetKit actor `c.state` is durable, but in Docker it is stored under `/root/.local/share/rivetkit`. If that path is not persisted, actor state-derived indexes can be lost after container recreation even when other data still exists. +- Docker dev: `compose.dev.yaml` mounts a named volume at `/root/.local/share/foundry/repos` to persist backend-managed git clones across restarts. Code must still work if this volume is not present (create directories as needed). +- RivetKit actor `c.state` is durable, but in Docker it is stored under `/root/.local/share/rivetkit`. 
If that path is not persisted, actor state-derived indexes (for example, in `project` actor state) can be lost after container recreation even when other data still exists. - Workflow history divergence policy: - Production: never auto-delete actor state to resolve `HistoryDivergedError`; ship explicit workflow migrations (`ctx.removed(...)`, step compatibility). - Development: manual local state reset is allowed as an operator recovery path when migrations are not yet available. @@ -316,10 +233,8 @@ When creating a task, all deterministic metadata (title, branch name) must be re - For Foundry live verification, use `rivet-dev/sandbox-agent-testing` as the default testing repo unless the task explicitly says otherwise. - Secrets (e.g. `OPENAI_API_KEY`, `GITHUB_TOKEN`/`GH_TOKEN`) must be provided via environment variables, never hardcoded in the repo. - `~/misc/env.txt` and `~/misc/the-foundry.env` contain the expected local OpenAI + GitHub OAuth/App config for dev. - - For local GitHub webhook development, use the configured Smee proxy (`SMEE_URL`) to forward deliveries into `POST /v1/webhooks/github`. Check `.env` / `foundry/.env` if you need the current channel URL. - - If GitHub repos, PRs, or install state are not showing up, verify that the GitHub App is installed for the organization and that webhook delivery is enabled and healthy. Foundry depends on webhook events for GitHub-backed state; missing webhooks means the product will appear broken. - Do not assume `gh auth token` is sufficient for Foundry task provisioning against private repos. Sandbox/bootstrap git clone, push, and PR flows require a repo-capable `GITHUB_TOKEN`/`GH_TOKEN` in the backend container. - - Preferred product behavior for organizations is to mint a GitHub App installation token from the organization installation and inject it into backend/sandbox git operations. Do not rely on an operator's ambient CLI auth as the long-term solution. 
+ - Preferred product behavior for org workspaces is to mint a GitHub App installation token from the workspace installation and inject it into backend/sandbox git operations. Do not rely on an operator's ambient CLI auth as the long-term solution. - Treat client E2E tests in `packages/client/test` as the primary end-to-end source of truth for product behavior. - Keep backend tests small and targeted. Only retain backend-only tests for invariants or persistence rules that are not well-covered through client E2E. - Do not keep large browser E2E suites around in a broken state. If a frontend browser E2E is not maintained and producing signal, remove it until it can be replaced with a reliable test. @@ -365,9 +280,9 @@ Each entry must include: - Friction/issue - Attempted fix/workaround and outcome -## Audit Log Events +## History Events -Log notable workflow changes to `events` so the audit log remains complete: +Log notable workflow changes to `events` so `hf history` remains complete: - create - attach @@ -376,8 +291,6 @@ Log notable workflow changes to `events` so the audit log remains complete: - status transitions - PR state transitions -When adding new task/workspace commands, always add a corresponding audit log event. - ## Validation After Changes Always run and fix failures: diff --git a/foundry/FOUNDRY-CHANGES.md b/foundry/FOUNDRY-CHANGES.md deleted file mode 100644 index 2bd76d2..0000000 --- a/foundry/FOUNDRY-CHANGES.md +++ /dev/null @@ -1,1456 +0,0 @@ -# Foundry Planned Changes - -## How to use this document - -Work through items checking boxes as you go. Some items have dependencies — do not start an item until its dependencies are checked off. After each item, run `pnpm -w typecheck && pnpm -w build && pnpm -w test` to validate. If an item includes a "CLAUDE.md update" section, apply it in the same change. Commit after each item passes validation. - -## Progress Log - -- 2026-03-14 10: Initial architecture mapping complete. 
- - Confirmed the current hot spots match the spec: `auth-user` is still mutation-by-action, `history` is still a separate actor with an `append` action wrapper, organization still owns `taskLookup`/`taskSummaries`, and the `Workbench*` surface is still shared across backend/client/frontend. - - Started foundational rename and migration planning for items `1`, `6`, and `25` because they drive most of the later fallout. -- 2026-03-14 11: Audit-log rename slice landed. - - Renamed the backend actor from `history` to `audit-log`, switched the queue name to `auditLog.command.append`, and removed the `append` action wrapper. - - Updated task/repository/organization call sites to send directly to the audit-log queue or read through the renamed audit-log handle. -- 2026-03-14 12: Foundational naming and dead-surface cleanup landed. - - Renamed the backend auth actor surface from `authUser` to `user`, including actor registration, key helpers, handles, and Better Auth service routing. - - Deleted the dead `getTaskEnriched` / `enrichTaskRecord` fan-out path and changed organization task reads to go straight to the task actor. - - Renamed admin-only GitHub rebuild/reload actions with the `admin*` prefix across backend, client, and frontend. - - Collapsed organization realtime to full-snapshot `organizationUpdated` events and aligned task events to `type: "taskUpdated"`. -- 2026-03-14 13: Task schema migration cleanup landed. - - Removed the task actor's runtime `CREATE TABLE IF NOT EXISTS` / `ALTER TABLE` helpers from `task/workbench.ts` and `task/workflow/init.ts`. - - Updated the checked-in task migration artifacts so the schema-defined task/session/runtime columns are created directly by migrations. -- 2026-03-14 14: Item 3 blocker documented. - - The spec's requested literal singleton `CHECK (id = 1)` on the Better Auth `user` table conflicts with the existing Better Auth adapter contract, which relies on external string `user.id`. 
- - Proceeding safely will require a design adjustment for that table rather than a straight mechanical migration. -- 2026-03-14 15: Better Auth mapping comments landed. - - Added Better Auth vs custom Foundry table/action comments in the user and organization actor schema/action surfaces so the adapter-constrained paths are explicit. -- 2026-03-15 09: Branch rename surface deleted and stale organization subscription fixed. - - Removed the remaining branch-rename surface from the client, mock backend, frontend UI, and repository action layer. There are no remaining `renameBranch` / `renameWorkbenchBranch` references in Foundry. - - Fixed the remote backend client to listen for `organizationUpdated` on the organization connection instead of the dead `workspaceUpdated` event name. -- 2026-03-15 10: Backend workspace rename landed. - - Renamed the backend task UI/workflow surface from `workbench` to `workspace`, including the task actor file, queue topic family, organization proxy actions, and the task session table name (`task_workspace_sessions`). - - Backend actor code no longer contains `Workbench` / `workbench` references, so the remaining shared/client/frontend rename can align to a stable backend target. -- 2026-03-15 11: Default model moved to user-scoped app state. - - Removed `defaultModel` from the organization schema/snapshot and stored it on the user profile instead, exposed through the app snapshot as a user preference. - - Wired `setAppDefaultModel` through the backend/app clients and changed the model picker to persist the starred/default model instead of resetting local React state on reload. -- 2026-03-15 11: Workspace surface completed across Foundry packages. - - Renamed the shared/client/frontend surface from `Workbench` to `Workspace`, including `workspace.ts`, workspace client/model files, DTO/type names, backend-client method names, frontend view-model imports, and the affected e2e/test files. 
  - Verified that Foundry backend/shared/client/frontend packages no longer contain `Workbench` / `workbench` references.
- 2026-03-15 11: Singleton constraints tightened where safe.
  - Added `CHECK (id = 1)` enforcement for `github_meta`, `repo_meta`, `organization_profile`, and `user_profiles`, and updated the affected code paths/migrations to use row id `1`.
  - The Better Auth `user` table remains blocked by the adapter contract, so item `3` is still open overall.
- 2026-03-15 12: Confirmed blocker for later user-table singleton work.
  - Item `3` conflicts with the current Better Auth adapter contract for the `user` table: the adapter depends on the external string `user.id`, while the spec also asks for a literal singleton `CHECK (id = 1)` on that same table.
  - That cannot be applied mechanically without redesigning the Better Auth adapter contract or introducing a separate surrogate identity column. I have not forced that change yet.
- 2026-03-15 13: Task/repository durable-state cleanup and auth-scoped workspace reads landed.
  - Removed the remaining task/repository actor durable-state duplication: task `createState` now holds only `(organizationId, repoId, taskId)`, repository `createState` now holds only `(organizationId, repoId)`, task initialization seeds SQLite from the initialize queue payload, and task record reads fetch `repoRemote` through repository metadata instead of stale actor state.
  - Removed the repository creation-time `remoteUrl` dependency from actor handles/callers and changed repository metadata to backfill/persist `remoteUrl` from GitHub data when needed.
  - Wired Better Auth session ids through the remote client workspace/task-detail reads and through the task workflow queue handlers so user-scoped workspace state is no longer dropped on the floor by the organization/task proxy path.
- 2026-03-15 14: Coordinator routing boundary tightened.
  - Removed the organization actor's fallback `taskId -> repoId` scan across repositories; task proxy actions now require `repoId` and route directly to the repository/task coordinator path the client already uses.
  - Updated backend architecture notes to reflect the live repo-owned task projection (`tasks`) and the removal of the old organization-owned `taskLookup` / `taskSummaries` indexes.
- 2026-03-15 15: Workspace session-selection and dead task-status cleanup landed.
  - Surfaced viewer-scoped `activeSessionId` through workspace task summary/detail DTOs, threaded it through the backend/client/mock surfaces, and added a dedicated workspace `select_session` mutation so session-tab selection now persists in `user_task_state` instead of living only in frontend local state.
  - Removed dead task `diffStat` and sandbox `statusMessage` fields from the live workspace/task contracts and backend writes, and updated stale frontend/mock/e2e consumers to stop reading them.
- 2026-03-15 16: GitHub sync progress is now live on the organization topic.
  - Added persisted GitHub sync phase/generation/progress fields to the github-data actor meta row and the organization profile projection, and exposed them through `organizationUpdated` snapshots so workspace consumers no longer wait on stale app-topic state during repo imports.
  - Chunked branch and pull-request fetches by repository batches, added generation markers to imported GitHub rows, switched sync refreshes to upsert+sweep instead of delete-then-replace, and updated the workspace shell/dev panel to show live sync phase progress from the organization subscription.
- 2026-03-15 17: Foundry-local model lists now route through shared Sandbox Agent config resources.
  - Removed the remaining duplicated hardcoded model tables from the frontend/client workspace view-model layer and switched backend default-model / agent-inference fallbacks to the shared catalog helpers in `shared/src/models.ts`.
  - Updated mock/default app state to stop seeding deleted `claude-sonnet-4` / `claude-opus-4` ids, and aligned the user-profile default-model migration fallback with the shared catalog default.
- 2026-03-15 17: Shared model catalog moved off the old fixed union.
  - Replaced the shared `WorkspaceModelId` closed union with string ids, introduced a shared model catalog derived from the sandbox-agent agent-config resources, and switched the client/frontend picker label helpers to consume that catalog instead of maintaining separate hardcoded `MODEL_GROUPS` arrays.
  - Updated backend default-model and model→agent fallback logic to use the shared catalog/default id, and relaxed e2e env parsing so new sandbox-agent model ids can flow through without patching Foundry first.
- 2026-03-15 18: Workspace task status collapsed to a single live field.
  - Removed the duplicate `runtimeStatus` field from workspace task/detail DTOs and all current backend/client/frontend consumers, so workspace task `status` is now the only task-state field on that surface.
  - Removed the remaining synthetic `"new"` task status from the live workspace path; mock task creation now starts in the first concrete init state instead of exposing a frontend-only status.
- 2026-03-15 19: GitHub sync now persists branch and PR batches as they are fetched.
  - The branch and pull-request phases now upsert each fetched repository batch immediately and only sweep stale rows after the phase completes, instead of buffering the full dataset in memory until the end of the sync.
  - This aligns chunked progress reporting with chunked persistence and tightens recovery behavior for large repository imports.
- 2026-03-15 20: Repository-owned task projection artifacts are now aligned with runtime.
  - Removed the last stale `task_lookup` Drizzle artifacts from the organization actor so the checked-in schema snapshots match the live repository-owned `tasks` projection.
  - There are no remaining org/repo runtime references to the old org-side task lookup table.
- 2026-03-15 21: Legacy task/runtime fields are fully gone from the live Foundry surface.
  - Confirmed the old task-table/runtime fields from item `21` are removed across backend/shared/client/frontend, and renamed the last leftover `agentTypeForModel()` helper to the neutral `sandboxAgentIdForModel()`.
  - Deleted the final dead frontend diff-stat formatter/test that only referenced already-removed task diff state.
- 2026-03-15 22: Task status tracking is now fully collapsed to the canonical task status enum.
  - With the earlier backend `statusMessage` removal plus this turn's workspace contract cleanup, the workspace/task surface now derives all task status UI from the canonical backend `status` enum.
  - There are no remaining live workspace `runtimeStatus` or synthetic `"new"` task-state branches.
- 2026-03-15 23: Per-user workspace UI state is fully sourced from the user actor overlay.
  - Confirmed the shared task actor no longer stores per-user `activeSessionId`, unread, or draft columns; those values are persisted in `user_task_state` and only projected back into workspace DTOs for the current viewer.
  - The remaining active-session/unread/draft references in client/frontend code are consumer fields of that user-scoped overlay, not shared task-actor storage.
- 2026-03-15 24: Subscription topics are now fully normalized to single-snapshot events.
  - Confirmed the shared realtime contracts now expose one full replacement event per topic (`appUpdated`, `organizationUpdated`, `taskUpdated`, `sessionUpdated`, `processesUpdated`) with matching wire event names and type fields.
  - The client subscription manager already treats organization/task topics as full-snapshot refreshes, so there are no remaining multi-variant organization events or `taskDetailUpdated` name mismatches in live code.
- 2026-03-15 25: Sidebar PR/task split dead branches trimmed further.
  - Removed the remaining dead `pr:`-id sidebar branch and switched the workspace sidebar to the real `pullRequest.isDraft` field instead of stale `pullRequest.status` reads.
  - This does not finish item `15`, but it reduces the remaining synthetic PR/task split surface in the frontend.
- 2026-03-15 26: User-actor mutations now flow through a dedicated workflow queue.
  - Added [user/workflow.ts](/home/nathan/sandbox-agent/foundry/packages/backend/src/actors/user/workflow.ts) plus shared query helpers, wired the user actor up with explicit queue names, and moved auth/profile/session/task-state mutations behind workflow handlers instead of direct action bodies.
- 2026-03-15 27: Organization GitHub/shell/billing mutations now route through workflow queues.
  - Added shared organization queue definitions in `organization/queues.ts`, taught the organization workflow to handle the remaining GitHub projection, org-profile, and billing mutation commands, and switched the app-shell, Better Auth, GitHub-data actor, and org-isolation test to send queue messages instead of calling direct org mutation actions.
  - Deleted the dead organization shell mutation actions that no longer had callers (`applyOrganizationSyncCompleted`, `markOrganizationSyncFailed`, `applyGithubInstallationCreated`, `applyGithubInstallationRemoved`, `applyGithubRepositoryChanges`), which moves items `4`, `10`, and `12` forward even though the broader org action split is still open.
- 2026-03-15 28: Organization action split trimmed more of the monolith and removed dead event types.
  - Moved `starSandboxAgentRepo` into `organization/actions/onboarding.ts` and the admin GitHub reload actions into `organization/actions/github.ts`, so `organization/actions.ts` is carrying fewer unrelated app-shell responsibilities.
  - Deleted the dead backend-only `actors/events.ts` type file after confirming nothing in Foundry still imports those old task/PR event interfaces.
- 2026-03-15 29: Repo overview branch rows now carry a single PR object.
  - Replaced the repo-overview branch DTO's scalar PR fields (`prNumber`, `prState`, `prUrl`, `reviewStatus`, `reviewer`) with `pullRequest: WorkspacePullRequestSummary | null`, and updated repository overview assembly plus the organization dashboard to consume that unified PR shape.
  - This does not finish item `15`, but it removes another synthetic PR-only read surface and makes the repo overview align better with the task summary PR model.
- 2026-03-15 30: Repo overview stopped falling back to raw GitHub PR rows.
  - Changed repository overview assembly to read PR metadata only from the repo-owned task projection instead of rejoining live GitHub PR rows on read, so the dashboard is one step closer to treating PRs as task data rather than a separate UI entity.
- 2026-03-15 31: GitHub organization-shell repair now uses the org workflow queue.
  - Converted `syncOrganizationShellFromGithub` from a direct org action into a workflow-backed mutation command and updated the GitHub org sync path to send `organization.command.github.organization_shell.sync_from_github` instead of calling the action directly.
  - Updated Better Auth adapter writes and task user-overlay writes to send directly to the user workflow queue, which partially lands item `4` and sets up item `11` for the user actor.
- 2026-03-15 27: Workflow layout standardized and queue-only write paths expanded.
  - Split the remaining inline actor workflows into dedicated files for `audit-log`, `repository`, `github-data`, and `organization`, and moved user read actions into `user/actions/*` with Better Auth-prefixed action names.
  - Removed the task actor's public mutation action wrappers entirely, moved organization/repository/github-data/task coordination onto direct queue sends, and made repository metadata reads stop mutating `repo_meta` on cache misses.
- 2026-03-15 28: PR-only admin/UI seams trimmed and PR branches now claim real tasks.
  - Removed the remaining dedicated "reload pull requests" / "reload pull request" admin hooks from the backend/client/frontend surfaces and deleted the sidebar PR-only context action.
  - Repository PR refresh now lazily creates a branch-owned task when a pull request arrives for an unclaimed branch, so PR-only branches stop living purely as a side table in GitHub sync flows.
- 2026-03-15 29: Organization Better Auth writes now use workflow queues.
  - Split the organization actor's Better Auth routing and verification reads into `organization/actions/better-auth.ts`, moved `APP_SHELL_ORGANIZATION_ID` to `organization/constants.ts`, and renamed the org Better Auth read surface to the `betterAuth*` form.
  - Added dedicated organization workflow queue handlers for session/email/account index writes plus verification CRUD, and updated `services/better-auth.ts` to send those mutations directly to organization queues instead of calling mutation actions.
- 2026-03-15 30: Shared model routing metadata is now centralized.
  - Extended the shared model catalog with explicit `agentKind` and `sandboxAgentId` metadata, changed `WorkspaceAgentKind` to a dynamic string, and switched backend task session creation to resolve sandbox agent ids through the shared catalog instead of hardcoded `Codex` vs `Claude` branching.
  - Updated the mock app/workspace and frontend model picker/new-task flows to consume the shared catalog/default model instead of forcing stale `Claude`/`Codex` fallbacks or a baked-in `gpt-5.3-codex` create-task default.
- 2026-03-15 31: Dead GitHub-data PR reload surface removed and fixture PR shapes aligned.
  - Deleted the unused GitHub-data `reloadPullRequest` workflow command plus the dead `listOpenPullRequests` / `getPullRequestForBranch` action surface that no longer has live Foundry callers.
  - Fixed the stale client `workspace-model.ts` pull-request fixtures to use the live `WorkspacePullRequestSummary` shape, which removes the last targeted client type errors in the touched slice.
- 2026-03-15 32: Organization action splitting continued past Better Auth.
  - Moved the app snapshot/default-model/org-profile actions into `organization/actions/organization.ts`, onboarding actions into `organization/actions/onboarding.ts`, and app-level GitHub token/import actions into `organization/actions/github.ts`, then composed those files at the actor boundary.
  - `organization/app-shell.ts` now exports shared helpers for those domains and no longer directly defines the moved action handlers, shrinking the remaining monolith and advancing item `10`.
- 2026-03-15 33: Task PR detail now reads the repository-owned task projection.
  - Removed duplicate scalar PR fields from `TaskRecord` and `WorkspaceTaskDetail`, switched the remaining frontend/client consumers to the canonical `pullRequest` object, and trimmed stale mock/test scaffolding that still populated those dead fields.
  - Replaced the task actor's PR lookup path with a repository projection read (`getProjectedTaskSummary`) so task detail/summary no longer ask the repo actor to re-query GitHub PR rows by branch.
- 2026-03-15 34: Workspace model catalogs now come from the live sandbox-agent API.
  - Added a shared normalizer for `/v1/agents?config=true` payloads, exposed sandbox-scoped `listWorkspaceModelGroups()` from the task sandbox actor, and switched backend workspace session creation to resolve sandbox agent ids from the live sandbox catalog instead of only the checked-in default tables.
  - Updated the frontend workspace model picker to query the active sandbox for model groups and use that live catalog for labels/options, while keeping the shared default catalog only as a fallback when no sandbox is available yet or the sandbox-agent connection is unavailable.
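The normalize-to-groups step described in that entry can be sketched roughly as follows. This is a hedged illustration only: the payload field names (`agents`, `models`, `label`) are assumptions for the example, not the real sandbox-agent wire format, and the function name is hypothetical.

```typescript
// Hypothetical payload shape; the point is the normalize-to-groups flow, not the fields.
type AgentConfigPayload = {
  agents: { id: string; models?: { id: string; label?: string }[] }[];
};

type ModelGroup = { agentId: string; models: { id: string; label: string }[] };

// Collapse per-agent config resources into picker-ready model groups,
// dropping agents that expose no models and defaulting labels to the model id.
function normalizeModelGroups(payload: AgentConfigPayload): ModelGroup[] {
  return payload.agents
    .filter((agent) => (agent.models ?? []).length > 0)
    .map((agent) => ({
      agentId: agent.id,
      models: (agent.models ?? []).map((m) => ({ id: m.id, label: m.label ?? m.id })),
    }));
}
```

Keeping the normalizer pure like this is what lets the same helper back both the live sandbox catalog path and the checked-in default fallback.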
- 2026-03-15 35: Backend-only organization snapshot refresh is now queue-backed.
  - Added `organization.command.snapshot.broadcast` to the organization workflow, switched repository and app-import callers to send that queue message instead of calling the organization actor's `refreshOrganizationSnapshot` action directly, and removed the direct action wrapper.
  - Deleted the dead `adminReconcileWorkspaceState` organization action/interface entry after confirming nothing in Foundry still calls it.
- 2026-03-15 36: Dead backend actor export cleanup continued.
  - Removed the stale `export * from "./events.js"` line from `backend/src/actors/index.ts`, which was left behind after deleting the dead backend event type file.
  - This keeps the backend actor barrel aligned with the live file set and advances the final dead-code/event audit.
- 2026-03-15 34: Item 17 removed from this checklist; do not leave started items half-finished.
  - By request, item `17` (`Type all actor context parameters — remove c: any`) is deferred out of this Foundry task and should not block completion here.
  - Process note for the remaining checklist work: once an item is started, finish that item to completion before opening a different partial seam. Item `15` is the current priority under that rule.
- 2026-03-15 35: Task/PR unification now routes live PR changes through repository-owned task summaries only.
  - GitHub PR sync and webhook handling now send concrete PR summaries directly to the repository coordinator, which lazily creates a real branch-owned task when needed and persists PR metadata on the task projection instead of re-querying raw `github_pull_requests` rows from repository reads.
  - Cleared the last stale scalar PR test references (`prUrl`, `reviewStatus`, `reviewer`) so the remaining Foundry surfaces consistently use the canonical `pullRequest` object.
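The lazy claim behavior in that entry — a PR summary arriving for an unclaimed branch creates a real branch-owned task, and later summaries upsert onto the same task — can be sketched in memory like this. The types and coordinator class are illustrative assumptions, not the actual Foundry actor code.

```typescript
// Illustrative types; the real contracts live in the shared workspace DTOs.
type PullRequestSummary = { number: number; url: string; isDraft: boolean };
type Task = { taskId: string; branch: string; pullRequest: PullRequestSummary | null };

class RepositoryCoordinator {
  private tasksByBranch = new Map<string, Task>();
  private nextId = 1;

  // PR sync path: upsert PR metadata onto the branch's owning task,
  // lazily creating a branch-owned task the first time a PR shows up.
  applyPullRequestSummary(branch: string, pr: PullRequestSummary): Task {
    let task = this.tasksByBranch.get(branch);
    if (!task) {
      task = { taskId: `task-${this.nextId++}`, branch, pullRequest: null };
      this.tasksByBranch.set(branch, task);
    }
    task.pullRequest = pr;
    return task;
  }
}
```

The design point is that reads never need to rejoin raw PR rows: once the summary is persisted on the task, the task projection is the single source for PR metadata.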
- 2026-03-15 36: Organization action entrypoints are now fully organized under `actions/`, and the public mutation surface is queue-only.
  - Moved organization task/workspace proxy actions plus `createTaskMutation` into `organization/actions/tasks.ts`, added `organization/actions/app.ts` so every composed org action bundle now lives under `organization/actions/*`, and removed dead `app-shell` exports that no longer had external callers.
  - Audited the remaining public organization actor actions and confirmed the write paths go through organization/repository/task/github-data workflow queues instead of direct mutation actions, which closes item `4` and item `10`.
- 2026-03-15 37: Organization dead-code audit completed.
  - Removed the leftover exported-only Better Auth predicate helper from `organization/actions/better-auth.ts`; it is now module-private because nothing outside that file uses it.
  - Audited the remaining organization actor surface and confirmed the live public reads/writes still in use are the composed `actions/*` bundles plus workflow mutation helpers. There are no remaining dead org action exports from the pre-refactor monolith.
- 2026-03-15 38: Final dead-event and dead-surface audit completed for the in-scope Foundry refactor.
  - Confirmed the live Foundry realtime topics each have a single event type (`appUpdated`, `organizationUpdated`, `taskUpdated`, `sessionUpdated`), and the deleted legacy event names (`workspaceUpdated`, `taskSummaryUpdated`, `taskDetailUpdated`, `pullRequestUpdated`, `pullRequestRemoved`) no longer exist in live Foundry code.
  - Re-audited the major removed compatibility seams (`Workbench`, branch rename, PR-only sidebar ids, duplicate runtime task status, `getTaskEnriched`, organization-owned task lookup tables) and found no remaining live references beyond expected domain strings like GitHub webhook event names or CLI `pr` labels.
- 2026-03-15 39: Item 15 was finished for real by moving PR ownership into the task actor.
  - Added task-local `pull_request_json` storage, switched task detail/summary reads to the task DB, and added `task.command.pull_request.sync` so GitHub/repository flows update PR metadata through the task coordinator instead of overlaying it in the repository projection.
  - The mock right sidebar now trusts the canonical `task.pullRequest.url` field instead of rebuilding a PR URL from repo name + PR number.
- 2026-03-15 40: Better Auth user singleton constraint is now enforced without breaking the adapter contract.
  - The user actor's `user` table now uses an integer singleton primary key with `CHECK (id = 1)` plus a separate `auth_user_id` column for Better Auth's external string identity.
  - Updated the user actor query/join/mutation helpers so Better Auth still reads and writes logical `user.id` as the external string id while SQLite enforces the singleton row invariant locally.

No backwards compatibility — delete old code, don't deprecate. If something is removed, remove it everywhere (backend, client, shared types, frontend, tests, mocks).

### Suggested execution order (respects dependencies)

**Wave 1 — no dependencies, can be done in any order:**
1, 2, 3, 4, 5, 6, 13, 16, 20, 21, 23, 25

**Wave 2 — depends on wave 1:**
7 (after 1), 9 (after 13), 10 (after 1+6), 11 (after 4), 22 (after 1), 24 (after 21), 26 (after 25)

**Wave 3 — depends on wave 2:**
8 (after 7+25), 12 (after 10), 15 (after 9+13), 19 (after 21+24)

**Wave 4 — depends on wave 3:**
14 (after 15)

**Final:**
18 (after everything), final audit pass (after everything)

### Index

- [x] 1. Rename Auth User actor → User actor
- [x] 2. Add Better Auth mapping comments to user/org actor tables
- [x] 3. Enforce `id = 1` CHECK constraint on single-row tables
- [x] 4. Move all mutation actions to queue messages
- [x] 5. Migrate task actor raw SQL to Drizzle migrations
- [x] 6. Rename History actor → Audit Log actor
- [x] 7. Move starred/default model to user actor settings *(depends on: 1)*
- [x] 8. Replace hardcoded model/agent lists with sandbox-agent API data *(depends on: 7, 25)*
- [x] 9. Flatten `taskLookup` + `taskSummaries` into single `tasks` table *(depends on: 13)*
- [x] 10. Reorganize user and org actor actions into `actions/` folders *(depends on: 1, 6)*
- [x] 11. Standardize workflow file structure across all actors *(depends on: 4)*
- [x] 12. Audit and remove dead code in organization actor *(depends on: 10)*
- [x] 13. Enforce coordinator pattern and fix ownership violations
- [x] 14. Standardize one event per subscription topic *(depends on: 15)*
- [x] 15. Unify tasks and pull requests — PRs are just task data *(depends on: 9, 13)*
- [x] 16. Chunk GitHub data sync and publish progress
- [x] 18. Final pass: remove all dead code *(depends on: all other items)*
- [x] 19. Remove duplicate data between `c.state` and SQLite *(depends on: 21, 24)*
- [x] 20. Prefix admin/recovery actions with `admin`
- [x] 21. Remove legacy/session-scoped fields from task table
- [x] 22. Move per-user UI state from task actor to user actor *(depends on: 1)*
- [x] 23. Delete `getTaskEnriched` and `enrichTaskRecord` (dead code)
- [x] 24. Clean up task status tracking *(depends on: 21)*
- [x] 25. Remove "Workbench" prefix from all types, functions, files, tables
- [x] 26. Delete branch rename (branches immutable after creation) *(depends on: 25)*
- [x] Final audit pass: dead events scan *(depends on: all other items)*

Deferred follow-up outside this checklist:

- 17. Type all actor context parameters — remove `c: any` *(removed from this task's scope by request)*

---

## [x] 1. Rename Auth User actor → User actor

**Rationale:** The actor is already a single per-user actor storing all user data. The "Auth" prefix is unnecessary.

### Files to change

- **`foundry/packages/backend/src/actors/auth-user/`** → rename directory to `user/`
  - `index.ts` — rename export `authUser` → `user`, display name `"Auth User"` → `"User"`
  - `db/schema.ts`, `db/db.ts`, `db/migrations.ts`, `db/drizzle.config.ts` — update any auth-prefixed references
- **`foundry/packages/backend/src/actors/keys.ts`** — `authUserKey()` → `userKey()`
- **`foundry/packages/backend/src/actors/handles.ts`** — `getOrCreateAuthUser` → `getOrCreateUser`, `getAuthUser` → `getUser`, `selfAuthUser` → `selfUser`
- **`foundry/packages/backend/src/actors/index.ts`** — update import path and registration
- **`foundry/packages/backend/src/services/better-auth.ts`** — update all `authUser` references
- **Action names** — consider dropping "Auth" prefix from `createAuthRecord`, `findOneAuthRecord`, `updateAuthRecord`, `deleteAuthRecord`, `countAuthRecords`, etc.

---

## [x] 2. Add Better Auth mapping comments to user/org actor tables, actions, and queues

**Rationale:** The user and organization actors contain a mix of Better Auth-driven and custom Foundry code. Tables, actions, and queues that exist to serve Better Auth's adapter need comments so developers know which pieces are constrained by Better Auth's schema/contract and which are ours to change freely.

### Table mapping

| Actor | Table | Better Auth? | Notes |
|---|---|---|---|
| user | `user` | Yes — 1:1 `user` model | All fields from Better Auth |
| user | `session` | Yes — 1:1 `session` model | All fields from Better Auth |
| user | `account` | Yes — 1:1 `account` model | All fields from Better Auth |
| user | `user_profiles` | No — custom Foundry | GitHub login, role, eligible orgs, starter repo status |
| user | `session_state` | No — custom Foundry | Active organization per session |
| org | `auth_verification` | Yes — Better Auth `verification` model | Lives on org actor because verification happens before user exists |
| org | `auth_session_index` | No — custom routing index | Maps session tokens → user actor IDs for Better Auth adapter routing |
| org | `auth_email_index` | No — custom routing index | Maps emails → user actor IDs for Better Auth adapter routing |
| org | `auth_account_index` | No — custom routing index | Maps OAuth accounts → user actor IDs for Better Auth adapter routing |

### Action/queue mapping (user actor)

| Action/Queue | Better Auth? | Notes |
|---|---|---|
| `createAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to create user/session/account records |
| `findOneAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter for single-record lookups with joins |
| `findManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for multi-record queries |
| `updateAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to update records |
| `updateManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for bulk updates |
| `deleteAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to delete records |
| `deleteManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for bulk deletes |
| `countAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for count queries |
| `getAppAuthState` | No — custom Foundry | Aggregates auth state for frontend consumption |
| `upsertUserProfile` | No — custom Foundry | Manages Foundry-specific user profile data |
| `upsertSessionState` | No — custom Foundry | Manages Foundry-specific session state |

### Action/queue mapping (organization actor app-shell)

| Action/Queue | Better Auth? | Notes |
|---|---|---|
| App-shell auth index CRUD actions | Yes — Better Auth adapter routing | Maintain lookup indexes so the adapter can route by session/email/account to the correct user actor |
| `auth_verification` CRUD | Yes — Better Auth `verification` model | Used for email verification and password resets |

### Files to change

- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — add doc comments to each table:
  - `user`, `session`, `account`: "Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database"
  - `user_profiles`, `session_state`: "Custom Foundry table — not part of Better Auth"
- **`foundry/packages/backend/src/actors/auth-user/index.ts`** — add doc comments to each action/queue:
  - Better Auth adapter actions: "Better Auth adapter — called by the Better Auth adapter in better-auth.ts. Schema constrained by Better Auth."
  - Custom actions: "Custom Foundry action — not part of Better Auth"
- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — add doc comments to `auth_verification` (Better Auth core), and the three index tables (Better Auth adapter routing)
- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — add doc comments to auth index actions marking them as Better Auth adapter routing infrastructure

---

## [x] 3. Enforce `id = 1` CHECK constraint on all single-row actor tables

**Rationale:** When an actor instance represents a single entity, tables that hold exactly one row should enforce this at the DB level with a `CHECK (id = 1)` constraint. The task actor already does this correctly; other actors don't.
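In raw SQLite terms the target shape looks like the DDL below. The helper and its column lists are illustrative only (the real schemas live in the per-actor Drizzle files); the `auth_user_id` column mirrors the surrogate-identity design the log's 2026-03-15 40 entry describes for the Better Auth `user` table.

```typescript
// Illustrative helper only: emits the singleton-table DDL shape this item asks for.
// Table and column names here are examples, not the real Foundry schemas.
function singletonTableDdl(table: string, columns: string[]): string {
  const cols = [
    // Integer PK + CHECK (id = 1) makes "exactly one row" a database-level invariant.
    "id INTEGER PRIMARY KEY CHECK (id = 1)",
    ...columns,
  ];
  return `CREATE TABLE IF NOT EXISTS ${table} (\n  ${cols.join(",\n  ")}\n);`;
}

// e.g. the Better Auth `user` table keeps its external string identity in a
// separate column while the row itself stays a singleton.
const userDdl = singletonTableDdl("user", ["auth_user_id TEXT NOT NULL"]);
```

With this pattern, a second insert with any other id fails at the database rather than relying on hardcoded `id=1` discipline in application code.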

### Tables needing the constraint

| Actor | Table | Current enforcement | Fix needed |
|---|---|---|---|
| auth-user (→ user) | `user` | None | Add `CHECK (id = 1)`, use integer PK |
| auth-user (→ user) | `user_profiles` | None | Add `CHECK (id = 1)`, use integer PK |
| github-data | `github_meta` | Hardcoded `id=1` in code only | Add `CHECK (id = 1)` in schema |
| organization | `organization_profile` | None | Add `CHECK (id = 1)`, use integer PK |
| repository | `repo_meta` | Hardcoded `id=1` in code only | Add `CHECK (id = 1)` in schema |
| task | `task` | CHECK constraint | Already correct |
| task | `task_runtime` | CHECK constraint | Already correct |

### Files to change

- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — change `user` and `user_profiles` tables to integer PK with CHECK constraint
- **`foundry/packages/backend/src/actors/auth-user/index.ts`** — update queries to use `id = 1` pattern
- **`foundry/packages/backend/src/services/better-auth.ts`** — update adapter to use fixed `id = 1`
- **`foundry/packages/backend/src/actors/github-data/db/schema.ts`** — add CHECK constraint to `github_meta` (already uses `id=1` in code)
- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — change `organization_profile` to integer PK with CHECK constraint
- **`foundry/packages/backend/src/actors/organization/actions.ts`** — update queries to use `id = 1`
- **`foundry/packages/backend/src/actors/repository/db/schema.ts`** — add CHECK constraint to `repo_meta` (already uses `id=1` in code)
- All affected actors — regenerate `db/migrations.ts`

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Single-row tables (tables that hold exactly one record per actor instance, e.g. metadata or profile tables) must use an integer primary key with a `CHECK (id = 1)` constraint to enforce the singleton invariant at the database level. Follow the pattern established in the task actor's `task` and `task_runtime` tables."

---

## [x] 4. Move all mutation actions to queue messages

**Rationale:** Actions should be read-only (queries). All mutations (INSERT/UPDATE/DELETE) should go through queue messages processed by workflow handlers. This ensures single-writer consistency and aligns with the actor model. No actor currently does this correctly — the history actor has the mutation in the workflow handler, but the `append` action wraps a `wait: true` queue send, which is the same anti-pattern (callers should send to the queue directly).

### Violations by actor

**User actor (auth-user)** — `auth-user/index.ts` — 7 mutation actions:
- `createAuthRecord` (INSERT, line 164)
- `updateAuthRecord` (UPDATE, line 205)
- `updateManyAuthRecords` (UPDATE, line 219)
- `deleteAuthRecord` (DELETE, line 234)
- `deleteManyAuthRecords` (DELETE, line 243)
- `upsertUserProfile` (UPSERT, line 283)
- `upsertSessionState` (UPSERT, line 331)

**GitHub Data actor** — `github-data/index.ts` — 7 mutation actions:
- `fullSync` (batch INSERT/DELETE/UPDATE, line 686)
- `reloadOrganization` (batch, line 690)
- `reloadAllPullRequests` (batch, line 694)
- `reloadRepository` (INSERT/UPDATE, line 698)
- `reloadPullRequest` (INSERT/DELETE/UPDATE, line 763)
- `clearState` (batch DELETE, line 851)
- `handlePullRequestWebhook` (INSERT/UPDATE/DELETE, line 879)

**Organization actor — `actions.ts`** — 5 mutation actions:
- `applyTaskSummaryUpdate` (UPSERT, line 464)
- `removeTaskSummary` (DELETE, line 476)
- `applyGithubRepositoryProjection` (UPSERT, line 521)
- `applyGithubDataProjection` (INSERT/UPDATE/DELETE, line 547)
- `recordGithubWebhookReceipt` (UPDATE, line 620)

**Organization actor — `app-shell.ts`** — 38 mutation actions:

Better Auth index mutations (11):
- `authUpsertSessionIndex` (UPSERT)
- `authDeleteSessionIndex` (DELETE)
- `authUpsertEmailIndex` (UPSERT)
- `authDeleteEmailIndex` (DELETE)
- `authUpsertAccountIndex` (UPSERT)
- `authDeleteAccountIndex` (DELETE)
- `authCreateVerification` (INSERT)
- `authUpdateVerification` (UPDATE)
- `authUpdateManyVerification` (UPDATE)
- `authDeleteVerification` (DELETE)
- `authDeleteManyVerification` (DELETE)

Organization profile/state mutations (13):
- `updateOrganizationShellProfile` (UPDATE on organizationProfile)
- `markOrganizationSyncStarted` (UPDATE on organizationProfile)
- `applyOrganizationSyncCompleted` (UPDATE on organizationProfile)
- `markOrganizationSyncFailed` (UPDATE on organizationProfile)
- `applyOrganizationStripeCustomer` (UPDATE on organizationProfile)
- `applyOrganizationStripeSubscription` (UPSERT on organizationProfile)
- `applyOrganizationFreePlan` (UPDATE on organizationProfile)
- `setOrganizationBillingPaymentMethod` (UPDATE on organizationProfile)
- `setOrganizationBillingStatus` (UPDATE on organizationProfile)
- `upsertOrganizationInvoice` (UPSERT on invoices)
- `recordOrganizationSeatUsage` (UPSERT on seatAssignments)
- `applyGithubInstallationCreated` (UPDATE on organizationProfile)
- `applyGithubInstallationRemoved` (UPDATE on organizationProfile)

App-level mutations that delegate + mutate (8):
- `skipAppStarterRepo` (calls upsertUserProfile)
- `starAppStarterRepo` (calls upsertUserProfile + child mutation)
- `selectAppOrganization` (calls setActiveOrganization)
- `triggerAppRepoImport` (calls markOrganizationSyncStarted)
- `createAppCheckoutSession` (calls applyOrganizationFreePlan + applyOrganizationStripeCustomer)
- `finalizeAppCheckoutSession` (calls applyOrganizationStripeCustomer)
- `cancelAppScheduledRenewal` (calls setOrganizationBillingStatus)
- `resumeAppSubscription` (calls setOrganizationBillingStatus)
- `recordAppSeatUsage` (calls recordOrganizationSeatUsage)
- `handleAppStripeWebhook` (calls multiple org mutations)
- `handleAppGithubWebhook` (calls org mutations + github-data mutations)
- `syncOrganizationShellFromGithub`
(multiple DB operations) -- `applyGithubRepositoryChanges` (calls applyGithubRepositoryProjection) - -**Task actor workbench** — `task/workbench.ts` — 14 mutation actions: -- `renameWorkbenchTask` (UPDATE, line 970) -- `renameWorkbenchBranch` (UPDATE, line 988) -- `createWorkbenchSession` (INSERT, line 1039) -- `renameWorkbenchSession` (UPDATE, line 1125) -- `setWorkbenchSessionUnread` (UPDATE, line 1136) -- `updateWorkbenchDraft` (UPDATE, line 1143) -- `changeWorkbenchModel` (UPDATE, line 1152) -- `sendWorkbenchMessage` (UPDATE, line 1205) -- `stopWorkbenchSession` (UPDATE, line 1255) -- `syncWorkbenchSessionStatus` (UPDATE, line 1265) -- `closeWorkbenchSession` (UPDATE, line 1331) -- `markWorkbenchUnread` (UPDATE, line 1363) -- `publishWorkbenchPr` (UPDATE, line 1375) -- `revertWorkbenchFile` (UPDATE, line 1403) - -**Repository actor** — `repository/actions.ts` — 5 mutation actions/helpers: -- `createTask` → calls `createTaskMutation()` (INSERT on taskIndex + creates task actor) -- `registerTaskBranch` → calls `registerTaskBranchMutation()` (INSERT/UPDATE on taskIndex) -- `reinsertTaskIndexRow()` (INSERT/UPDATE, called from `getTaskEnriched`) -- `deleteStaleTaskIndexRow()` (DELETE) -- `persistRemoteUrl()` (INSERT/UPDATE on repoMeta, called from `getRepoOverview`) - -### History (audit log) actor — `append` action must also be removed - -The history actor's workflow handler is correct (mutation in queue handler), but the `append` action (line 77) is a `wait: true` wrapper around the queue send — same anti-pattern. Delete the `append` action. Callers (the `appendHistory()` helper in `task/workflow/common.ts`) should send directly to the `auditLog.command.append` queue with `wait: false` (audit log writes are fire-and-forget, no need to block the caller). 
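The target calling pattern can be sketched as follows. This is a hedged sketch: the queue `send` signature, the `Queue` interface, and the exact command fields are assumptions standing in for the real actor framework contract, which is not shown here.

```typescript
// Hypothetical command shape for the audit log queue; field names are
// illustrative, not the final contract.
interface AppendAuditLogCommand {
  taskId: string;
  event: string; // e.g. "task.session.created"
  occurredAtMs: number;
}

// Minimal stand-in for the framework's queue handle.
interface Queue<T> {
  send(message: T, opts: { wait: boolean }): void;
}

// Callers send directly to the queue — there is no `append` action wrapper.
function appendAuditLog(
  queue: Queue<AppendAuditLogCommand>,
  taskId: string,
  event: string,
): void {
  // wait: false — audit log writes are fire-and-forget; the caller never blocks.
  queue.send({ taskId, event, occurredAtMs: Date.now() }, { wait: false });
}
```

The same shape applies to every mutation in the lists above: define a command type, let the workflow handler perform the write, and have callers send the command directly.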
### Reference patterns (queue handlers only, no action wrappers)

- **Task actor core** — initialize, attach, push, sync, merge, archive, kill all use queue messages directly

### Migration approach

This is NOT about wrapping queue sends inside actions. The mutation actions must be **removed entirely** and replaced with queue messages that callers (including `packages/client`) send directly.

Each actor needs:
1. Define queue message types for each mutation
2. Move mutation logic from action handlers into workflow/queue handlers
3. **Delete the mutation actions** — do not wrap them
4. Update `packages/client` to send queue messages directly to the actor instead of calling the old action
5. Update any inter-actor callers (e.g. `better-auth.ts`, `app-shell.ts`, other actors) to send queue messages instead of calling actions

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Actions must be read-only. All database mutations (INSERT, UPDATE, DELETE, UPSERT) must be queue messages processed by workflow handlers. Callers (client, other actors, services) send messages directly to the queue — do not wrap queue sends inside actions. Follow the pattern established in the task workflow actor's queue handlers."

---

## [ ] 5. Migrate task actor raw SQL to Drizzle migrations

**Rationale:** The task actor uses raw `db.execute()` with `ALTER TABLE ... ADD COLUMN` in `workbench.ts` and `workflow/init.ts` instead of proper Drizzle migrations. All actor DBs should use the standard Drizzle migration pattern.
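The target shape can be sketched as below. Note this is illustrative only: real `migrations.ts` files are generated by drizzle-kit from `schema.ts`, so the entry format, index field, and column names here are assumptions, not the generated output.

```typescript
// Hypothetical shape of a generated migration entry. The point is that the
// DDL lives in an ordered, versioned list applied once at startup — not in
// ad-hoc db.execute("ALTER TABLE ...") calls scattered through workbench.ts.
interface Migration {
  idx: number;
  sql: string[];
}

export const taskMigrations: Migration[] = [
  {
    idx: 2,
    sql: [
      // Column previously bolted on via raw ALTER TABLE (name is a placeholder):
      "ALTER TABLE `task` ADD COLUMN `example_column` text;",
    ],
  },
];

// Minimal sketch of an ordered migration runner: apply only what is newer
// than the last applied index, in order.
export function pendingMigrations(all: Migration[], appliedUpTo: number): Migration[] {
  return all
    .filter((m) => m.idx > appliedUpTo)
    .sort((a, b) => a.idx - b.idx);
}
```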
### Files to change

- **`foundry/packages/backend/src/actors/task/workbench.ts`** (lines 24-56) — remove `ALTER TABLE` raw SQL, add columns to `db/schema.ts` and generate a proper migration
- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** (lines 12-15) — same treatment
- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — add the missing columns that are currently added via `ALTER TABLE`
- **`foundry/packages/backend/src/actors/task/db/migrations.ts`** — regenerate with new migration

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "All actor databases must use Drizzle ORM with proper schema definitions and generated migrations. No raw SQL (`db.execute()`, `ALTER TABLE`, etc.). Schema changes must go through `schema.ts` + migration generation."

---

## [ ] 6. Rename History actor → Audit Log actor

**Rationale:** The actor functions as a comprehensive audit log tracking task lifecycle events. "Audit Log" better describes its purpose.

### Files to change

- **`foundry/packages/backend/src/actors/history/`** → rename directory to `audit-log/`
  - `index.ts` — rename export `history` → `auditLog`, display name `"History"` → `"Audit Log"`, queue `history.command.append` → `auditLog.command.append`
  - Internal types: `HistoryInput` → `AuditLogInput`, `AppendHistoryCommand` → `AppendAuditLogCommand`, `ListHistoryParams` → `ListAuditLogParams`
- **`foundry/packages/backend/src/actors/keys.ts`** — `historyKey()` → `auditLogKey()`
- **`foundry/packages/backend/src/actors/handles.ts`** — `getOrCreateHistory` → `getOrCreateAuditLog`, `selfHistory` → `selfAuditLog`
- **`foundry/packages/backend/src/actors/index.ts`** — update import path and registration
- **`foundry/packages/shared/src/contracts.ts`** — `HistoryEvent` → `AuditLogEvent`
- **`foundry/packages/backend/src/actors/organization/actions.ts`** — `history()` action → `auditLog()`, update imports
- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update `getOrCreateHistory` calls
- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — `appendHistory()` → `appendAuditLog()`
- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — update imports and calls
- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — update imports and calls
- **`foundry/packages/backend/src/actors/task/workflow/push.ts`** — update imports and calls

### Coverage gaps to fix

The audit log only covers 9 of ~24 significant events (37.5%). The entire `task/workbench.ts` file has zero logging. Add audit log calls for:

**High priority (missing lifecycle events):**
- `task.switch` — in `task/workflow/index.ts` handleSwitchActivity
- `task.session.created` — in `task/workbench.ts` createWorkbenchSession
- `task.session.closed` — in `task/workbench.ts` closeWorkbenchSession
- `task.session.stopped` — in `task/workbench.ts` stopWorkbenchSession

**Medium priority (missing user actions):**
- `task.session.renamed` — renameWorkbenchSession
- `task.message.sent` — sendWorkbenchMessage
- `task.model.changed` — changeWorkbenchModel
- `task.title.changed` — renameWorkbenchTask
- `task.branch.renamed` — renameWorkbenchBranch
- `task.pr.published` — publishWorkbenchPr
- `task.file.reverted` — revertWorkbenchFile

**Low priority / debatable:**
- `task.draft.updated`, `task.session.unread`, `task.derived.refreshed`, `task.transcript.refreshed`

### CLAUDE.md updates needed

- **`foundry/packages/backend/CLAUDE.md`** — rename `HistoryActor` → `AuditLogActor` in actor hierarchy, add maintenance rule: "Every new action or command handler that represents a user-visible or workflow-significant event must append to the audit log actor. The audit log must remain a comprehensive record of all significant operations."
- **`foundry/CLAUDE.md`** — rename "History Events" section → "Audit Log Events", update the list to include all events above, add note: "When adding new task/workbench commands, always add a corresponding audit log event."

---

## [ ] 7. Move starred/default model to user actor settings

**Dependencies:** item 1

**Rationale:** The starred/default model preference is currently broken — the frontend stores it in local React state that resets on reload. The org actor's `organizationProfile` table has a `defaultModel` column but there's no action to update it, and it's the wrong scope anyway. This is a per-user preference, not an org setting.
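The intended flow can be sketched as a user-actor queue command (per item 4, mutations are queue messages). The command and settings shapes below are hypothetical — the real schema and message contract are defined in the backend, not here.

```typescript
// Hypothetical per-user settings row, persisted in the user actor's DB
// (user_settings table or a column on user_profiles) — not React state.
interface UserSettings {
  defaultModel: string | null;
}

// Hypothetical queue command; the user actor's workflow handler is the
// single writer of this setting.
interface SetDefaultModelCommand {
  type: "user.command.setDefaultModel";
  defaultModel: string;
}

// Pure state transition applied by the workflow handler.
function applySetDefaultModel(
  settings: UserSettings,
  cmd: SetDefaultModelCommand,
): UserSettings {
  return { ...settings, defaultModel: cmd.defaultModel };
}
```

The frontend's star click sends this command and then reads the persisted value back via the `app` subscription, so the preference survives reloads.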
### Current state (broken)

- **Frontend** (`mock-layout.tsx` line 313) — `useState("claude-sonnet-4")` — local state, lost on reload
- **Model picker UI** (`model-picker.tsx`) — has star icons + `onSetDefault` callback, but it only updates local state
- **Org actor** (`organization/db/schema.ts` line 43) — `defaultModel` column exists but nothing writes to it
- **No backend persistence** — starred model is not saved anywhere

### Changes needed

1. **Add `user_settings` table to user actor** (or add `defaultModel` column to `user_profiles`):
   - `defaultModel` (text) — the user's starred/preferred model
   - File: `foundry/packages/backend/src/actors/auth-user/db/schema.ts`
2. **Add queue message to user actor** to update the default model:
   - File: `foundry/packages/backend/src/actors/auth-user/index.ts`
3. **Remove `defaultModel` from org actor** `organizationProfile` table (wrong scope):
   - File: `foundry/packages/backend/src/actors/organization/db/schema.ts`
4. **Update frontend** to read starred model from user settings (via `app` subscription) and send queue message on star click:
   - File: `foundry/packages/frontend/src/components/mock-layout/model-picker.tsx`
   - File: `foundry/packages/frontend/src/components/mock-layout.tsx`
5. **Update shared types** — move `defaultModel` from `FoundryOrganizationSettings` to user settings type:
   - File: `foundry/packages/shared/src/app-shell.ts`
6. **Update client** to send the queue message to user actor:
   - File: `foundry/packages/client/`

---

## [ ] 8. Replace hardcoded model/agent lists with sandbox-agent API data

**Dependencies:** items 7, 25

**Rationale:** The frontend hardcodes 8 models in a static list and ignores the sandbox-agent API's `GET /v1/agents` endpoint, which already exposes the full agent config — models, modes, and reasoning/thought levels per agent. The frontend should consume this API 1:1 instead of maintaining its own stale copy.
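The replacement for the hardcoded `MODEL_GROUPS` constants can be sketched as a pure mapping over the API payload. Caveat: the property names below (`models`, `modes`, `thought_level`, `installed`) follow the field list in this section, but the exact `GET /v1/agents` response shape is owned by the sandbox-agent API, so treat this as an assumption.

```typescript
// Assumed per-agent config returned by GET /v1/agents?config=true.
interface AgentConfig {
  name: string;
  installed: boolean;
  models: { id: string; displayName: string }[];
  modes: string[];
  thought_level?: string[];
}

interface ModelGroup {
  agent: string;
  models: { id: string; displayName: string }[];
}

// Build model-picker groups straight from API data, grouped by agent as the
// API does — no hardcoded list to drift out of date.
function buildModelGroups(agents: AgentConfig[]): ModelGroup[] {
  return agents
    .filter((a) => a.installed && a.models.length > 0)
    .map((a) => ({ agent: a.name, models: a.models }));
}
```

The backend caches the fetched payload and pushes it through the existing subscription system; the frontend only ever sees the mapped result.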
### Current state (hardcoded)

- **`foundry/packages/frontend/src/components/mock-layout/view-model.ts`** (lines 20-39) — hardcoded `MODEL_GROUPS` with 8 models
- **`foundry/packages/client/src/workbench-model.ts`** (lines 18-37) — identical hardcoded `MODEL_GROUPS` copy
- **`foundry/packages/shared/src/workbench.ts`** (lines 5-13) — `WorkbenchModelId` hardcoded union type
- No modes or thought/reasoning levels exposed in UI at all
- No API calls to discover available models

### What the sandbox-agent API already provides (`GET /v1/agents`)

Per agent, the API returns:
- **models** — full list with display names (Claude: 4, Codex: 6, Cursor: 35+, OpenCode: 239)
- **modes** — execution modes (Claude: 5, Codex: 3, OpenCode: 2)
- **thought_level** — reasoning levels (Codex: low/medium/high/xhigh, Mock: low/medium/high)
- **capabilities** — plan_mode, reasoning, status support
- **credentialsAvailable** / **installed** — agent availability

### Changes needed

1. **Remove hardcoded model lists** from:
   - `foundry/packages/frontend/src/components/mock-layout/view-model.ts` — delete `MODEL_GROUPS`
   - `foundry/packages/client/src/workbench-model.ts` — delete `MODEL_GROUPS`
   - `foundry/packages/shared/src/workbench.ts` — replace `WorkbenchModelId` union type with `string` (dynamic from API)
2. **Backend: fetch and cache agent config from sandbox-agent API**
   - Add an action or startup flow that calls `GET /v1/agents?config=true` on the sandbox-agent API
   - Cache the result (agent list + models + modes + thought levels) in the appropriate actor
   - Expose it to the frontend via the existing subscription/event system
3. **Frontend: consume API-driven config**
   - Model picker reads available models from backend-provided agent config, not hardcoded list
   - Expose modes selector per agent
   - Expose thought/reasoning level selector for agents that support it (Codex, Mock)
   - Group models by agent as the API does (not by arbitrary provider grouping)
4. **Update shared types** — make model/mode/thought_level types dynamic strings rather than hardcoded unions:
   - `foundry/packages/shared/src/workbench.ts`
5. **No backwards compatibility needed** — we're cleaning up, not preserving old behavior

---

## [ ] 9. Flatten `taskLookup` + `taskSummaries` into single `tasks` table on org actor

**Dependencies:** item 13

**Rationale:** `taskLookup` (taskId → repoId) is a strict subset of `taskSummaries` (which also has repoId + title, status, branch, PR, sessions). There's no reason for two tables with the same primary key. Flatten into one `tasks` table.

### Current state

- **`taskLookup`** — `taskId` (PK), `repoId` — used only for taskId → repoId resolution
- **`taskSummaries`** — `taskId` (PK), `repoId`, `title`, `status`, `repoName`, `updatedAtMs`, `branch`, `pullRequestJson`, `sessionsSummaryJson` — materialized sidebar data

### Changes needed

1. **Merge into single `tasks` table** in `foundry/packages/backend/src/actors/organization/db/schema.ts`:
   - Drop `taskLookup` table
   - Rename `taskSummaries` → `tasks`
   - Keep all columns from `taskSummaries` (already includes `repoId`)
2. **Update all references**:
   - `foundry/packages/backend/src/actors/organization/actions.ts` — replace `taskLookup` queries with `tasks` table lookups
   - `foundry/packages/backend/src/actors/organization/app-shell.ts` — if it references either table
   - Any imports of the old table names from schema
3. **Regenerate migrations** — `foundry/packages/backend/src/actors/organization/db/migrations.ts`

---

## [x] 10. Reorganize user and organization actor actions into `actions/` folders

**Dependencies:** items 1, 6

**Rationale:** Both actors cram too many concerns into single files. The organization actor has `app-shell.ts` (1,947 lines) + `actions.ts` mixing Better Auth, Stripe, GitHub, onboarding, workbench proxying, and org state. The user actor mixes Better Auth adapter CRUD with custom Foundry actions. Split into `actions/` folders grouped by domain, with `betterAuth` prefix on all Better Auth actions.

### User actor → `user/actions/`

| File | Actions | Source |
|---|---|---|
| `actions/better-auth.ts` | `betterAuthCreateRecord`, `betterAuthFindOneRecord`, `betterAuthFindManyRecords`, `betterAuthUpdateRecord`, `betterAuthUpdateManyRecords`, `betterAuthDeleteRecord`, `betterAuthDeleteManyRecords`, `betterAuthCountRecords` + all helper functions (`tableFor`, `columnFor`, `normalizeValue`, `clauseToExpr`, `buildWhere`, `applyJoinToRow`, `applyJoinToRows`) | Currently in `index.ts` |
| `actions/user.ts` | `getAppAuthState`, `upsertUserProfile`, `upsertSessionState` | Currently in `index.ts` |

### Organization actor → `organization/actions/`

**Delete `app-shell.ts`** — split its ~50 actions + helpers across these files:

| File | Actions | Source |
|---|---|---|
| `actions/better-auth.ts` | `betterAuthFindSessionIndex`, `betterAuthUpsertSessionIndex`, `betterAuthDeleteSessionIndex`, `betterAuthFindEmailIndex`, `betterAuthUpsertEmailIndex`, `betterAuthDeleteEmailIndex`, `betterAuthFindAccountIndex`, `betterAuthUpsertAccountIndex`, `betterAuthDeleteAccountIndex`, `betterAuthCreateVerification`, `betterAuthFindOneVerification`, `betterAuthFindManyVerification`, `betterAuthUpdateVerification`, `betterAuthUpdateManyVerification`, `betterAuthDeleteVerification`, `betterAuthDeleteManyVerification`, `betterAuthCountVerification` + auth clause builder helpers | Currently in `app-shell.ts` |
| `actions/stripe.ts` | `createAppCheckoutSession`, `finalizeAppCheckoutSession`, `createAppBillingPortalSession`, `cancelAppScheduledRenewal`, `resumeAppSubscription`, `recordAppSeatUsage`, `handleAppStripeWebhook`, `applyOrganizationStripeCustomer`, `applyOrganizationStripeSubscription`, `applyOrganizationFreePlan`, `setOrganizationBillingPaymentMethod`, `setOrganizationBillingStatus`, `upsertOrganizationInvoice`, `recordOrganizationSeatUsage` | Currently in `app-shell.ts` |
| `actions/github.ts` | `resolveAppGithubToken`, `beginAppGithubInstall`, `triggerAppRepoImport`, `handleAppGithubWebhook`, `syncOrganizationShellFromGithub`, `syncGithubOrganizations`, `applyGithubInstallationCreated`, `applyGithubInstallationRemoved`, `applyGithubRepositoryChanges`, `reloadGithubOrganization`, `reloadGithubPullRequests`, `reloadGithubRepository`, `reloadGithubPullRequest`, `applyGithubRepositoryProjection`, `applyGithubDataProjection`, `recordGithubWebhookReceipt`, `refreshTaskSummaryForGithubBranch` | Currently split across `app-shell.ts` and `actions.ts` |
| `actions/onboarding.ts` | `skipAppStarterRepo`, `starAppStarterRepo`, `starSandboxAgentRepo`, `selectAppOrganization` | Currently in `app-shell.ts` |
| `actions/organization.ts` | `getAppSnapshot`, `getOrganizationShellState`, `getOrganizationShellStateIfInitialized`, `updateOrganizationShellProfile`, `updateAppOrganizationProfile`, `markOrganizationSyncStarted`, `applyOrganizationSyncCompleted`, `markOrganizationSyncFailed`, `useOrganization`, `getOrganizationSummary`, `reconcileWorkbenchState` | Currently split across `app-shell.ts` and `actions.ts` |
| `actions/tasks.ts` | `createTask`, `createWorkbenchTask`, `listTasks`, `getTask`, `switchTask`, `applyTaskSummaryUpdate`, `removeTaskSummary`, `findTaskForGithubBranch`, `applyOpenPullRequestUpdate`, `removeOpenPullRequest`, `attachTask`, `pushTask`, `syncTask`, `mergeTask`, `archiveTask`, `killTask` | Currently in `actions.ts` |
| `actions/workbench.ts` | `markWorkbenchUnread`, `renameWorkbenchTask`, `renameWorkbenchBranch`, `createWorkbenchSession`, `renameWorkbenchSession`, `setWorkbenchSessionUnread`, `updateWorkbenchDraft`, `changeWorkbenchModel`, `sendWorkbenchMessage`, `stopWorkbenchSession`, `closeWorkbenchSession`, `publishWorkbenchPr`, `revertWorkbenchFile` | Currently in `actions.ts` (proxy calls to task actor) |
| `actions/repos.ts` | `listRepos`, `getRepoOverview` | Currently in `actions.ts` |
| `actions/history.ts` | `history` (→ `auditLog` after rename) | Currently in `actions.ts` |

Also move:
- `APP_SHELL_ORGANIZATION_ID` constant → `organization/constants.ts`
- `runOrganizationWorkflow` → `organization/workflow.ts`
- Private helpers (`buildAppSnapshot`, `assertAppOrganization`, `collectAllTaskSummaries`, etc.) → colocate with the action file that uses them

### Files to update

- **`foundry/packages/backend/src/services/better-auth.ts`** — update all action name references to use `betterAuth` prefix
- **`foundry/packages/backend/src/actors/organization/index.ts`** — import and spread action objects from `actions/` files instead of `app-shell.ts` + `actions.ts`
- **`foundry/packages/backend/src/actors/auth-user/index.ts`** (or `user/index.ts`) — import actions from `actions/` files

---

## [ ] 11. Standardize workflow file structure across all actors

**Dependencies:** item 4

**Rationale:** Workflow logic is inconsistently placed — inline in `index.ts`, in `actions.ts`, or in a `workflow/` directory. Standardize: every actor with a workflow gets a `workflow.ts` file. If the workflow is large, use `workflow/{index,...}.ts`.
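A small `workflow.ts` might follow the shape below. This is a hypothetical skeleton — the real framework's queue and handler APIs differ; the point is only the separation: the workflow file owns the command dispatch and the mutations, while actions stay read-only.

```typescript
// Hypothetical command union for a small actor's queue.
type Command =
  | { type: "append"; row: string }
  | { type: "clear" };

// Each handler performs the mutation; nothing else in the actor writes state.
function handleCommand(state: string[], cmd: Command): string[] {
  switch (cmd.type) {
    case "append":
      return [...state, cmd.row];
    case "clear":
      return [];
  }
}

// Main loop sketch: drain queued commands in order.
function runWorkflow(commands: Command[]): string[] {
  return commands.reduce(handleCommand, [] as string[]);
}
```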
### Changes per actor

| Actor | Current location | New location | Notes |
|---|---|---|---|
| user (auth-user) | None | `workflow.ts` (new) | Needs a workflow for mutations (item 4) |
| github-data | Inline in `index.ts` (~57 lines) | `workflow.ts` | Extract `runGithubDataWorkflow` + handler |
| history (→ audit-log) | Inline in `index.ts` (~18 lines) | `workflow.ts` | Extract `runHistoryWorkflow` + `appendHistoryRow` |
| organization | In `actions.ts` (~51 lines) | `workflow.ts` | Extract `runOrganizationWorkflow` + queue handlers |
| repository | In `actions.ts` (~42 lines) | `workflow.ts` | Extract `runRepositoryWorkflow` + queue handlers |
| task | `workflow/` directory (926 lines) | `workflow/` directory — already correct | Keep as-is: `workflow/index.ts`, `workflow/queue.ts`, `workflow/common.ts`, `workflow/init.ts`, `workflow/commands.ts`, `workflow/push.ts` |
| sandbox | None (wrapper) | N/A | No custom workflow needed |

### Pattern

- **Small workflows** (< ~200 lines): single `workflow.ts` file
- **Large workflows** (> ~200 lines): `workflow/index.ts` holds the main loop, other files hold step groups:
  - `workflow/index.ts` — main loop + handler dispatch
  - `workflow/queue.ts` — queue name definitions (if many)
  - `workflow/{group}.ts` — step/activity functions grouped by domain

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Every actor with a message queue must have its workflow logic in a dedicated `workflow.ts` file (or `workflow/index.ts` for complex actors). Do not inline workflow logic in `index.ts` or `actions.ts`. Actions are read-only handlers; workflow handlers process queue messages and perform mutations."

---

## [ ] 12. Audit and remove dead code in organization actor

**Dependencies:** item 10

**Rationale:** The organization actor has ~50+ actions across `app-shell.ts` and `actions.ts`. Likely some are unused or vestigial. Audit all actions and queues for dead code and remove anything that has no callers.

### Scope

- All actions in `organization/actions.ts` and `organization/app-shell.ts`
- All queue message types and their handlers
- Helper functions that may no longer be called
- Shared types in `packages/shared` that only served removed actions

### Approach

- Trace each action/queue from caller → handler to confirm it's live
- Remove any action with no callers (client, other actors, services, HTTP endpoints)
- Remove any queue handler with no senders
- Remove associated types and helpers

---

## [ ] 13. Enforce coordinator pattern and fix ownership violations

**Rationale:** The actor hierarchy follows a coordinator pattern: org → repo → task → session. The coordinator owns the index/summary of its children, handles create/destroy, and children push updates up to their coordinator. Several violations exist where levels are skipped.

### Coordinator hierarchy (add to CLAUDE.md)

```
Organization (coordinator for repos)
├── Repository (coordinator for tasks)
│   └── Task (coordinator for sessions)
│       └── Session
```

**Rules:**
- The coordinator owns the index/summary table for its direct children
- The coordinator handles create/destroy of its direct children
- Children push summary updates UP to their direct coordinator (not skipping levels)
- Read paths go through the coordinator, not direct cross-level access
- No backwards compatibility needed — we're cleaning up

### Violations to fix

#### V1: Task index tables on wrong actor (HIGH)

`taskLookup` and `taskSummaries` (item 9 merges these into `tasks`) are on the **organization** actor but should be on the **repository** actor, since repo is the coordinator for tasks.
**Fix:**
- Move the merged `tasks` table (from item 9) to `repository/db/schema.ts`
- Repository owns task summaries, not organization
- Organization gets a `repoSummaries` table instead (repo count, latest activity, etc.) — the repo pushes its summary up to org

#### V2: Tasks push summaries directly to org, skipping repo (HIGH)

Task actors call `organization.applyTaskSummaryUpdate()` directly (line 464 in `actions.ts`), bypassing the repository coordinator.

**Fix:**
- Task pushes summary to `repository.applyTaskSummaryUpdate()` instead
- Repository updates its `tasks` table, then pushes a repo summary up to organization
- Organization never receives task-level updates directly

#### V3: Org resolves taskId → repoId from its own table (MEDIUM)

`resolveRepoId(c, taskId)` in `organization/actions.ts` queries `taskLookup` directly. Used by `switchTask`, `attachTask`, `pushTask`, `syncTask`, `mergeTask`, `archiveTask`, `killTask` (7 actions).

**Fix:**
- Remove `resolveRepoId()` from org actor
- Org must know the `repoId` from the caller (the frontend already knows which repo a task belongs to) or query the repo actor
- Update all 7 proxy actions to require `repoId` in their input instead of looking it up

#### V4: Duplicate task creation bookkeeping at org level (MEDIUM)

`createTaskMutation` in the org actor calls `repository.createTask()`, then independently inserts `taskLookup` and seeds `taskSummaries`. Repository already inserts its own `taskIndex` row.
**Fix:**
- Org calls `repository.createTask()` — that's it
- Repository handles all task index bookkeeping internally
- Repository pushes the new task summary back up to org as part of its repo summary update

### Files to change

- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — remove `taskLookup` and `taskSummaries`, add `repoSummaries` if needed
- **`foundry/packages/backend/src/actors/repository/db/schema.ts`** — add merged `tasks` table (task summaries)
- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove `resolveRepoId()`, `applyTaskSummaryUpdate`, `removeTaskSummary`, `findTaskForGithubBranch`, `refreshTaskSummaryForGithubBranch`; update proxy actions to require `repoId` in input
- **`foundry/packages/backend/src/actors/repository/actions.ts`** — add `applyTaskSummaryUpdate` action (receives from task), push repo summary to org
- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — change summary push target from org → repo
- **`foundry/packages/shared/src/contracts.ts`** — update input types to include `repoId` where needed
- **`foundry/packages/client/`** — update calls to pass `repoId`

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add coordinator pattern rules:
  ```
  ## Coordinator Pattern

  The actor hierarchy follows a strict coordinator pattern:
  - Organization = coordinator for repositories
  - Repository = coordinator for tasks
  - Task = coordinator for sessions

  Rules:
  - Each coordinator owns the index/summary table for its direct children.
  - Only the coordinator handles create/destroy of its direct children.
  - Children push summary updates to their direct coordinator only (never skip levels).
  - Cross-level access (e.g. org directly querying task state) is not allowed — go through the coordinator.
  - Proxy actions at higher levels (e.g. org.pushTask) must delegate to the correct coordinator, not bypass it.
  ```

---

## [ ] 14. Standardize one event per subscription topic across all actors

**Dependencies:** item 15

**Rationale:** Each subscription topic should have exactly one event type carrying the full replacement snapshot. The organization topic currently violates this with 7 subtypes. Additionally, event naming is inconsistent across actors. Standardize all of them.

### Current state

| Topic | Wire event name | Event type field | Subtypes | Issue |
|---|---|---|---|---|
| `app` | `appUpdated` | `type: "appUpdated"` | 1 | Name is fine |
| `organization` | `organizationUpdated` | 7 variants | **7** | Needs consolidation |
| `task` | `taskUpdated` | `type: "taskDetailUpdated"` | 1 | Wire name ≠ type name |
| `session` | `sessionUpdated` | `type: "sessionUpdated"` | 1 | Fine |
| `sandboxProcesses` | `processesUpdated` | `type: "processesUpdated"` | 1 | Fine |

### Target state

Every topic gets exactly one event. Wire event name = type field = `{topic}Updated`. Each carries the full snapshot for that topic.

| Topic | Event name | Payload |
|---|---|---|
| `app` | `appUpdated` | `FoundryAppSnapshot` |
| `organization` | `organizationUpdated` | `OrganizationSummarySnapshot` |
| `task` | `taskUpdated` | `WorkbenchTaskDetail` |
| `session` | `sessionUpdated` | `WorkbenchSessionDetail` |
| `sandboxProcesses` | `processesUpdated` | `SandboxProcessSnapshot[]` |

### Organization — consolidate 7 subtypes into 1

Remove the discriminated union. Replace all 7 subtypes:
- `taskSummaryUpdated`, `taskRemoved`, `repoAdded`, `repoUpdated`, `repoRemoved`, `pullRequestUpdated`, `pullRequestRemoved`

With a single `organizationUpdated` event carrying the full `OrganizationSummarySnapshot`. The client replaces its cached state — same pattern as every other topic.

### Task — fix event type name mismatch

The wire event is `taskUpdated` but the type field says `taskDetailUpdated`. Rename to `taskUpdated` everywhere for consistency.
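The consolidated contract can be sketched as follows. The snapshot fields are placeholders — the real `OrganizationSummarySnapshot` lives in `packages/shared` — but the shape of the event and of the client-side `applyEvent` is the point.

```typescript
// Placeholder snapshot; the real type has more fields.
interface OrganizationSummarySnapshot {
  tasks: { taskId: string; title: string }[];
  repos: { repoId: string; name: string }[];
}

// One event for the whole topic: wire name === type field === "organizationUpdated".
interface OrganizationUpdatedEvent {
  type: "organizationUpdated";
  snapshot: OrganizationSummarySnapshot;
}

// Client side: applyEvent becomes a plain replacement — no discriminated
// union, no partial patches to merge.
function applyEvent(
  _cached: OrganizationSummarySnapshot | null,
  event: OrganizationUpdatedEvent,
): OrganizationSummarySnapshot {
  return event.snapshot;
}
```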
### Files to change

- **`foundry/packages/shared/src/realtime-events.ts`** — replace `OrganizationEvent` union with single event type; rename `TaskEvent.type` from `taskDetailUpdated` → `taskUpdated`
- **`foundry/packages/backend/src/actors/organization/actions.ts`** — update all 7 `c.broadcast("organizationUpdated", { type: "taskSummaryUpdated", ... })` calls to emit single event with full snapshot
- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — same for any broadcasts here
- **`foundry/packages/backend/src/actors/task/workbench.ts`** — rename `taskDetailUpdated` → `taskUpdated` in broadcast calls
- **`foundry/packages/client/src/subscription/topics.ts`** — simplify `applyEvent` for organization topic (no more discriminated union handling); update task event type name
- **`foundry/packages/client/src/subscription/mock-manager.ts`** — update mock event handling
- **`foundry/packages/frontend/`** — update any direct references to event type names

### CLAUDE.md update

- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Each subscription topic must have exactly one event type. The event carries the full replacement snapshot for that topic — no discriminated unions, no partial patches, no subtypes. Event name must match the pattern `{topic}Updated` (e.g. `organizationUpdated`, `taskUpdated`). When state changes, broadcast the full snapshot; the client replaces its cached state."

---

## [x] 15. Unify tasks and pull requests — PRs are just task data

**Dependencies:** items 9, 13

**Rationale:** From the client's perspective, tasks and PRs are the same thing — a branch with work on it. The frontend already merges them into one sorted list, converting PRs to synthetic task objects with `pr:{prId}` IDs. The distinction is artificial. A "task" should represent any branch, and the task actor lazily wraps it. PR metadata is just data the task holds.
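The unified shape can be sketched as one summary type with a nullable PR field. The field list follows the plan in this item but is illustrative, not the final shared contract.

```typescript
// Full PR metadata held by the task (field list follows this item's plan).
interface PullRequestMeta {
  number: number;
  title: string;
  state: string;
  url: string;
  headRefName: string;
  baseRefName: string;
  isDraft: boolean;
  authorLogin: string;
  updatedAtMs: number;
}

// One type for every sidebar row — no separate WorkbenchOpenPrSummary.
interface WorkbenchTaskSummary {
  taskId: string; // always a real task id — no synthetic "pr:{prId}" ids
  repoId: string;
  title: string;
  branch: string;
  pullRequest: PullRequestMeta | null; // null until a PR exists for the branch
}

// A PR-backed row is just a task whose pullRequest field is set.
function hasPullRequest(task: WorkbenchTaskSummary): boolean {
  return task.pullRequest !== null;
}
```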
- -### Current state (separate entities) - -- **Tasks**: stored in task actor SQLite, surfaced via `WorkbenchTaskSummary`, events via `taskSummaryUpdated` -- **PRs**: stored in GitHub data actor (`githubPullRequests` table), surfaced via `WorkbenchOpenPrSummary`, events via `pullRequestUpdated`/`pullRequestRemoved` -- **Frontend hack**: converts PRs to fake task objects with `pr:{prId}` IDs, merges into one list -- **Filtering logic**: org actor silently swallows `pullRequestUpdated` if a task claims the same branch — fragile coupling -- **Two separate types**: `WorkbenchTaskSummary` and `WorkbenchOpenPrSummary` with overlapping fields - -### Target state (unified) - -- **One entity**: a "task" represents a branch. Task actors are lazily created when needed (user creates one, or a PR arrives for an unclaimed branch). -- **PR data lives on the task**: the task actor stores PR metadata (number, title, state, url, isDraft, authorLogin, etc.) as part of its state, not as a separate entity -- **One type**: `WorkbenchTaskSummary` includes full PR fields (nullable). No separate `WorkbenchOpenPrSummary`. -- **One event**: `organizationUpdated` carries task summaries that include PR data. No separate PR events. -- **No synthetic IDs**: every item in the sidebar is a real task with a real taskId - -### Changes needed - -1. **Remove `WorkbenchOpenPrSummary` type** from `packages/shared/src/workbench.ts` — merge its fields into `WorkbenchTaskSummary` -2. **Expand task's `pullRequest` field** from `{ number, status }` to full PR metadata (number, title, state, url, headRefName, baseRefName, isDraft, authorLogin, updatedAtMs) -3. **Remove `openPullRequests` from `OrganizationSummarySnapshot`** — all items are tasks now -4. **Remove PR-specific events** from `realtime-events.ts`: `pullRequestUpdated`, `pullRequestRemoved` -5. **Remove PR-specific actions** from organization actor: `applyOpenPullRequestUpdate`, `removeOpenPullRequest` -6. 
**Remove branch-claiming filter logic** in org actor (the `if task claims branch, skip PR` check) -7. **GitHub data actor PR sync**: when PRs arrive (webhook or sync), create/update a task for that branch lazily via the repository coordinator -8. **Task actor**: store PR metadata in its DB (new columns or table), update when GitHub data pushes changes -9. **Frontend**: remove `toOpenPrTaskModel` conversion, remove `pr:` ID prefix hack, remove separate `openPullRequests` state — sidebar is just tasks -10. **Repository actor**: when a PR arrives for a branch with no task, lazily create a task actor for it (lightweight, no sandbox needed) - -### Implications for coordinator pattern (item 13) - -This reinforces: repo is the coordinator for tasks. When GitHub data detects a new PR for a branch, it tells the repo coordinator, which creates/updates the task. The task holds the PR data and pushes its summary to the repo coordinator. - -### No backwards compatibility needed - -The `authSessionIndex`, `authEmailIndex`, `authAccountIndex`, and `authVerification` tables stay on the org actor. They're routing indexes needed by the Better Auth adapter to resolve user identity before the user actor can be accessed (e.g. session token → userId lookup). Already covered in item 2 for adding comments explaining this. - ---- - -## [ ] 16. Chunk GitHub data sync and publish progress - -**Rationale:** `runFullSync` in the github-data actor fetches everything at once (all repos, branches, members, PRs), replaces all tables atomically, and has a 5-minute timeout. For large orgs this will timeout or lose all data mid-sync (replace pattern deletes everything first). Needs to be chunked with incremental progress. - -### Current state (broken for large orgs) - -- `runFullSync()` (`github-data/index.ts` line 486-538): - 1. Fetches ALL repos, branches, members, PRs in 4 sequential calls - 2. `replaceRepositories/Branches/Members/PullRequests` — deletes all rows then inserts all new rows - 3. 
Single 5-minute timeout wraps the entire operation - 4. No progress reporting to the client — just "Syncing GitHub data..." → "Synced N repositories" - 5. If it fails mid-sync, data is partially deleted with no recovery - -### Changes needed - -1. **Chunk the sync by repository** — sync repos first (paginated from GitHub API), then for each repo chunk, sync its branches and PRs. Members can be a separate chunk. - -2. **Incremental upsert, not replace** — don't delete-then-insert. Use upsert per row so partial sync doesn't lose data. Mark rows with a sync generation ID; after full sync completes, delete rows from previous generations. - -3. **Run in a loop, not a single step** — each chunk is a separate workflow step with its own timeout. If one chunk fails, previous chunks are persisted. - -4. **Publish progress per chunk** — after each chunk completes: - - Update `github_meta` with progress (e.g. `syncedRepos: 15/42`) - - Push progress to the organization actor - - Organization broadcasts to clients so the UI shows progress (e.g. "Syncing repositories... 15/42") - -5. **Initial sync uses the same chunked approach** — `github-data-initial-sync` step should kick off the chunked loop, not call `runFullSync` directly - -### Files to change - -- **`foundry/packages/backend/src/actors/github-data/index.ts`**: - - Refactor `runFullSync` into chunked loop - - Replace `replaceRepositories/Branches/Members/PullRequests` with upsert + generation sweep - - Add progress metadata to `github_meta` table - - Publish progress to org actor after each chunk -- **`foundry/packages/backend/src/actors/github-data/db/schema.ts`** — add sync generation column to all tables, add progress fields to `github_meta` -- **`foundry/packages/backend/src/actors/organization/actions.ts`** (or `app-shell.ts`) — handle sync progress updates and broadcast to clients -- **`foundry/packages/shared/src/app-shell.ts`** — add sync progress fields to `FoundryGithubState` (e.g. 
`syncProgress: { current: number; total: number } | null`) -- **`foundry/packages/frontend/`** — show sync progress in UI (e.g. "Syncing repositories... 15/42") - ---- - ---- - -# Deferred follow-up outside this task - -## 17. Type all actor context parameters — remove `c: any` - -**Rationale:** 272+ instances of `c: any`, `ctx: any`, `loopCtx: any` across all actor code. This eliminates type safety for DB access, state access, broadcasts, and queue operations. All context parameters should use RivetKit's proper context types. - -### Scope (by file, approximate count) - -| File | `any` contexts | -|---|---| -| `organization/app-shell.ts` | ~108 | -| `organization/actions.ts` | ~56 | -| `task/workbench.ts` | ~53 | -| `github-data/index.ts` | ~23 | -| `repository/actions.ts` | ~22 | -| `sandbox/index.ts` | ~21 | -| `handles.ts` | ~19 | -| `task/workflow/commands.ts` | ~10 | -| `task/workflow/init.ts` | ~4 | -| `auth-user/index.ts` | ~2 | -| `history/index.ts` | ~2 | -| `task/workflow/index.ts` | ~2 | -| `task/workflow/common.ts` | ~2 | -| `task/workflow/push.ts` | ~1 | -| `polling.ts` | ~1 | - -### Changes needed - -1. **Determine correct RivetKit context types** — check RivetKit exports for `ActionContext`, `ActorContextOf`, `WorkflowContext`, `LoopContext`, or equivalent. Reference `polling.ts` which already defines typed contexts (`PollingActorContext`, `WorkflowPollingActorContext`). - -2. **Define per-actor context types** — each actor has its own state shape and DB schema, so the context type should be specific (e.g. `ActionContext` or similar). - -3. **Replace all `c: any`** with the proper typed context across every file listed above. - -4. **Type workflow/loop contexts** — `ctx: any` in workflow functions and `loopCtx: any` in loop callbacks need proper types too. - -### CLAUDE.md update - -- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "All actor context parameters (`c`, `ctx`, `loopCtx`) must be properly typed using RivetKit's context types. 
Never use `any` for actor contexts. Each actor should define or derive its context type from the actor definition." - ---- - -## [ ] 18. Final pass: remove all dead code - -**Dependencies:** all other items (do this last, after 17) - -**Rationale:** After completing all changes above, many actions, queues, SQLite tables, workflow steps, shared types, and helper functions will be orphaned. Do a full scan to find and remove everything that's dead. - -### Scope - -Scan the entire foundry codebase for: -- **Dead actions** — actions with no callers (client, other actors, services, HTTP endpoints) -- **Dead queues** — queue message types with no senders -- **Dead SQLite tables** — tables with no reads or writes -- **Dead workflow steps** — step names that are no longer referenced -- **Dead shared types** — types in `packages/shared` that are no longer imported -- **Dead helper functions** — private functions with no callers -- **Dead imports** — unused imports across all files - -### When to do this - -After all items 1–17 are complete. Not before — removing code while other items are in progress will create conflicts. - ---- - -## [ ] 19. Remove duplicate data between `c.state` and SQLite - -**Dependencies:** items 21, 24 - -**Rationale:** Several actors store the same data in both `c.state` (RivetKit durable state) and their SQLite tables. Mutable fields that exist in both can silently diverge — `c.state` becomes stale when the SQLite copy is updated. Per the existing CLAUDE.md rule, `c.state` should hold only small scalars/identifiers; anything queryable or mutable belongs in SQLite. - -### Duplicates found - -**Task actor** — `c.state` (`createState` in `task/index.ts` lines 124-139) vs `task`/`taskRuntime` tables: - -| Field | In SQLite? | Mutable? 
| Verdict | -|---|---|---|---| -| `organizationId` | No | No | **KEEP** — identity field | -| `repoId` | No | No | **KEEP** — identity field | -| `taskId` | No | No | **KEEP** — identity field | -| `repoRemote` | No (but org `repos` table has it) | No | **DELETE** — not needed on task, read from repo/org | -| `branchName` | Yes (`task.branch_name`) | Yes | **REMOVE from c.state** — HIGH risk, goes stale on rename | -| `title` | Yes (`task.title`) | Yes | **REMOVE from c.state** — HIGH risk, goes stale on rename | -| `task` (description) | Yes (`task.task`) | No | **REMOVE from c.state** — redundant | -| `sandboxProviderId` | Yes (`task.sandbox_provider_id`) | No | **REMOVE from c.state** — redundant | -| `agentType` | Yes (`task.agent_type`) | Yes | **DELETE entirely** — session-specific (item 21) | -| `explicitTitle` | No | No | **MOVE to SQLite** — creation metadata | -| `explicitBranchName` | No | No | **MOVE to SQLite** — creation metadata | -| `initialPrompt` | No | No | **DELETE entirely** — dead code, session-specific (item 21) | -| `initialized` | No | Yes | **DELETE entirely** — dead code, `status` already tracks init progress | -| `previousStatus` | No | No | **DELETE entirely** — never set, never read | - -**Repository actor** — `c.state` (`createState` in `repository/index.ts`) vs `repoMeta` table: - -| Field | Mutable? | Risk | -|---|---|---| -| `remoteUrl` | No | Low — redundant but safe | - -### Fix - -Remove all duplicated fields from `c.state`. Keep only identity fields needed for actor key resolution (e.g. `organizationId`, `repoId`, `taskId`). Read mutable data from SQLite. - -**Task actor `c.state` should become:** -```typescript -createState: (_c, input) => ({ - organizationId: input.organizationId, - repoId: input.repoId, - taskId: input.taskId, -}) -``` - -Fields already in SQLite (`branchName`, `title`, `task`, `sandboxProviderId`) — remove from `c.state`, read from SQLite only. 
Fields not yet in SQLite (`explicitTitle`, `explicitBranchName`) — add to `task` table, remove from `c.state`. Dead code to delete entirely: `agentType`, `initialPrompt` (item 21), `initialized`, `previousStatus`, `repoRemote`. - -**Repository actor `c.state` should become:** -```typescript -createState: (_c, input) => ({ - organizationId: input.organizationId, - repoId: input.repoId, -}) -``` - -`remoteUrl` is removed from repo actor `c.state` entirely. The repo actor reads `remoteUrl` from its own `repoMeta` SQLite table when needed. The org actor already stores `remoteUrl` in its `repos` table (source of truth from GitHub data). The `getOrCreateRepository()` helper in `handles.ts` currently requires `remoteUrl` as a parameter and passes it as `createWithInput` — this parameter must be removed. Every call site in `organization/actions.ts` and `organization/app-shell.ts` currently does a DB lookup for `remoteUrl` just to pass it to `getOrCreateRepository()` — all of those lookups go away. On actor creation, the repo actor should populate its `repoMeta.remoteUrl` by querying the org actor or github-data actor, not by receiving it as a create input. 
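The lazy `remoteUrl` resolution can be sketched as follows. All names here (`OrgReposView`, `ensureRemoteUrl`, the in-memory `repoMeta` stand-in) are hypothetical; the real code goes through RivetKit actor handles and Drizzle queries, and the lookup is asynchronous in practice:

```typescript
// Stand-in for the org actor's `repos` table lookup (async in reality).
interface OrgReposView {
  remoteUrlForRepo(repoId: string): string;
}

// Stand-in for the repo actor's repoMeta SQLite row.
const repoMeta: { remoteUrl: string | null } = { remoteUrl: null };

// Instead of receiving remoteUrl as createWithInput, the repo actor resolves
// it from the org actor on first use and caches it in its own repoMeta table.
function ensureRemoteUrl(org: OrgReposView, repoId: string): string {
  if (repoMeta.remoteUrl !== null) return repoMeta.remoteUrl; // already persisted
  const url = org.remoteUrlForRepo(repoId); // populate once, on demand
  repoMeta.remoteUrl = url;
  return url;
}
```

With this shape, `getOrCreateRepository(organizationId, repoId)` needs no `remoteUrl` parameter, and call sites stop pre-fetching the URL just to pass it through.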
- -### Files to change - -- **`foundry/packages/backend/src/actors/task/index.ts`** — trim `createState`, update all `c.state.*` reads for removed fields to read from SQLite instead -- **`foundry/packages/backend/src/actors/task/workbench.ts`** — update `c.state.*` reads -- **`foundry/packages/backend/src/actors/task/workflow/*.ts`** — update `c.state.*` reads -- **`foundry/packages/backend/src/actors/repository/index.ts`** — trim `createState`, remove `remoteUrl` from input type -- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update all `c.state.remoteUrl` reads to query `repoMeta` table; remove `persistRemoteUrl()` helper -- **`foundry/packages/backend/src/actors/handles.ts`** — remove `remoteUrl` parameter from `getOrCreateRepository()` -- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove all `remoteUrl` lookups done solely to pass to `getOrCreateRepository()` (~10 call sites) -- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — same cleanup for app-shell call sites - -### CLAUDE.md update - -- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Never duplicate data between `c.state` and SQLite. `c.state` holds only immutable identity fields needed for actor key resolution (e.g. `organizationId`, `repoId`, `taskId`). All mutable data and anything queryable must live exclusively in SQLite. If a field can change after actor creation, it must not be in `c.state`." - ---- - -## [ ] 20. Prefix all admin/recovery actions with `admin` - -**Rationale:** Several actions are admin-only recovery/rebuild operations but their names don't distinguish them from normal product flows. Prefix with `admin` so it's immediately clear these are not part of regular user flows. 
- -### Actions to rename - -**Organization actor:** - -| Current name | New name | Why it's admin | -|---|---|---| -| `reconcileWorkbenchState` | `adminReconcileWorkbenchState` | Full fan-out rebuild of task summary projection | -| `reloadGithubOrganization` | `adminReloadGithubOrganization` | Manual trigger to refetch all org GitHub data | -| `reloadGithubPullRequests` | `adminReloadGithubPullRequests` | Manual trigger to refetch all PR data | -| `reloadGithubRepository` | `adminReloadGithubRepository` | Manual trigger to refetch single repo | -| `reloadGithubPullRequest` | `adminReloadGithubPullRequest` | Manual trigger to refetch single PR | - -**GitHub Data actor:** - -| Current name | New name | Why it's admin | -|---|---|---| -| `fullSync` | `adminFullSync` | Full replace of all GitHub data — recovery operation | -| `reloadOrganization` | `adminReloadOrganization` | Triggers full sync manually | -| `reloadAllPullRequests` | `adminReloadAllPullRequests` | Triggers full sync manually | -| `clearState` | `adminClearState` | Deletes all GitHub data — recovery from lost access | - -**NOT renamed** (these are triggered by webhooks/normal flows, not manual admin actions): -- `reloadRepository` — called by push/create/delete webhooks (incremental, normal flow) -- `reloadPullRequest` — called by PR webhooks (incremental, normal flow) -- `handlePullRequestWebhook` — webhook handler (normal flow) -- `syncGithubOrganizations` — called during OAuth callback (normal flow, though also used for repair) - -### Files to change - -- **`foundry/packages/backend/src/actors/github-data/index.ts`** — rename actions -- **`foundry/packages/backend/src/actors/organization/actions.ts`** — rename actions -- **`foundry/packages/client/src/backend-client.ts`** — update method names -- **`foundry/packages/frontend/`** — update any references to renamed actions - -### CLAUDE.md update - -- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Admin-only actions (recovery, rebuild, 
manual resync, state reset) must be prefixed with `admin` (e.g. `adminReconcileState`, `adminClearState`). This makes it clear they are not part of normal product flows and should not be called from regular client code paths."
-
----
-
-## [ ] 21. Remove legacy/session-scoped fields from task table
-
-**Rationale:** The `task` table has fields that either belong on the session, are redundant with data from other actors, or are dead code from the removed local git clone. These should be cleaned up.
-
-### Fields to remove from `task` table and `c.state`
-
-**`agentType`** — Legacy from when task = 1 session. Only used for `defaultModelForAgent(c.state.agentType)` to pick the default model when creating a new session. Sessions already have their own `model` column in `taskWorkbenchSessions`. The default model for new sessions should come from user settings (the starred model stored on the user actor; see the user-settings item). Remove `agentType` from task table, `c.state`, `createState`, `TaskRecord`, and all `defaultModelForAgent()` call sites. Replace with user settings lookup.
-
-**`initialPrompt`** — Stored on `c.state` at task creation but **never read anywhere**. Completely dead code. This is also session-specific, not task-specific — the initial prompt belongs on the first session, not the task. Remove from `c.state`, `createState` input type, and `CreateTaskCommand`/`CreateTaskInput` types. Remove from `repository/actions.ts` create flow.
-
-**`prSubmitted`** — Redundant boolean set when `submitPullRequest` runs. PR state already flows from GitHub webhooks → github-data actor → branch name lookup. This boolean can go stale (PR closed and reopened, PR deleted, etc.). Remove entirely — PR existence is derivable from github-data by branch name (already how `enrichTaskRecord` and `buildTaskSummary` work).
-
-### Dead fields on `taskRuntime` table
-
-**`provisionStage`** — Values: `"queued"`, `"ready"`, `"error"`. 
Redundant with `status` — `init_complete` implies ready, `error` implies error. Never read in business logic. Delete. - -**`provisionStageUpdatedAt`** — Timestamp for `provisionStage` changes. Never read anywhere. Delete. - -### Dead fields on `TaskRecord` (in `workflow/common.ts`) - -These are always hardcoded to `null` — remnants of the removed local git clone: - -- `diffStat` — was populated from `branches` table (deleted) -- `hasUnpushed` — was populated from `branches` table (deleted) -- `conflictsWithMain` — was populated from `branches` table (deleted) -- `parentBranch` — was populated from `branches` table (deleted) - -Remove from `TaskRecord` type, `getCurrentRecord()`, and all consumers (contracts, mock client, tests, frontend). - -### Files to change - -- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `agentType` and `prSubmitted` columns from `task` table; remove `provisionStage` and `provisionStageUpdatedAt` from `taskRuntime` table -- **`foundry/packages/backend/src/actors/task/index.ts`** — remove `agentType`, `initialPrompt`, `initialized`, `previousStatus`, `repoRemote` from `createState` and input type -- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove `defaultModelForAgent()`, `agentTypeForModel()`, update session creation to use user settings for default model; remove `prSubmitted` set in `submitPullRequest` -- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `agentType`, `prSubmitted`, `diffStat`, `hasUnpushed`, `conflictsWithMain`, `parentBranch` from `getCurrentRecord()` and `TaskRecord` construction -- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `agentType` from task row inserts -- **`foundry/packages/shared/src/contracts.ts`** — remove `agentType`, `prSubmitted`, `diffStat`, `prUrl`, `hasUnpushed`, `conflictsWithMain`, `parentBranch` from `TaskRecord` schema (note: `prUrl` and `prAuthor` should stay if still populated by `enrichTaskRecord`, or 
move to the unified task/PR model from item 15) -- **`foundry/packages/client/src/mock/backend-client.ts`** — update mock to remove dead fields -- **`foundry/packages/client/test/view-model.test.ts`** — update test fixtures -- **`foundry/packages/frontend/src/features/tasks/model.test.ts`** — update test fixtures -- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove any references to `agentType` in task creation input -- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update `enrichTaskRecord()` to stop setting dead fields - ---- - -## [ ] 22. Move per-user UI state from task actor to user actor - -**Dependencies:** item 1 - -**Rationale:** The task actor stores UI-facing state that is user-specific, not task-global. With multiplayer (multiple users viewing the same task), this breaks — each user has their own active session, their own unread state, their own drafts. These must live on the user actor, keyed by `(taskId, sessionId)`, not on the shared task actor. - -### Per-user state currently on the task actor (wrong) - -**`taskRuntime.activeSessionId`** — Which session the user is "looking at." Used to: -- Determine which session's status drives the task-level status (running/idle) — this is wrong, the task status should reflect ALL sessions, not one user's active tab -- Return a "current" session in `attachTask` responses — this is per-user -- Migration path for legacy single-session tasks in `ensureWorkbenchSeeded` - -This should move to the user actor as `activeSessionId` per `(userId, taskId)`. - -**`taskWorkbenchSessions.unread`** — Per-user unread state stored globally on the session. If user A reads a session, user B's unread state is also cleared. Move to user actor keyed by `(userId, taskId, sessionId)`. - -**`taskWorkbenchSessions.draftText` / `draftAttachmentsJson` / `draftUpdatedAt`** — Per-user draft state stored globally. If user A starts typing a draft, it overwrites user B's draft. 
Move to user actor keyed by `(userId, taskId, sessionId)`. - -### What stays on the task actor (correct — task-global state) - -- `taskRuntime.activeSandboxId` — which sandbox is running (global to the task) -- `taskRuntime.activeSwitchTarget` / `activeCwd` — sandbox connection state (global) -- `taskRuntime.statusMessage` — provisioning/runtime status (global) -- `taskWorkbenchSessions.model` — which model the session uses (global) -- `taskWorkbenchSessions.status` — session runtime status (global) -- `taskWorkbenchSessions.transcriptJson` — session transcript (global) - -### Fix - -Add a `userTaskState` table to the user actor: - -```typescript -export const userTaskState = sqliteTable("user_task_state", { - taskId: text("task_id").notNull(), - sessionId: text("session_id").notNull(), - activeSessionId: text("active_session_id"), // per-user active tab - unread: integer("unread").notNull().default(0), - draftText: text("draft_text").notNull().default(""), - draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"), - draftUpdatedAt: integer("draft_updated_at"), - updatedAt: integer("updated_at").notNull(), -}, (table) => ({ - pk: primaryKey(table.taskId, table.sessionId), -})); -``` - -Remove `activeSessionId` from `taskRuntime`. Remove `unread`, `draftText`, `draftAttachmentsJson`, `draftUpdatedAt` from `taskWorkbenchSessions`. - -The task-level status should be derived from ALL sessions (e.g., task is "running" if ANY session is running), not from one user's `activeSessionId`. 
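The all-sessions derivation can be sketched as a small pure function. The status values and the precedence of `error` over `idle` are assumptions for illustration, not settled behavior:

```typescript
type SessionStatus = "running" | "idle" | "error";

// Task status reflects ALL sessions, not the session in one user's active tab.
// Precedence running > error > idle is an assumption in this sketch.
function deriveTaskStatus(sessionStatuses: SessionStatus[]): SessionStatus {
  if (sessionStatuses.includes("running")) return "running"; // any running session wins
  if (sessionStatuses.includes("error")) return "error";
  return "idle"; // includes the no-sessions case
}
```

Because the input is just the session rows, this works identically for every user viewing the task, which is the multiplayer property this item is after.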
- -### Files to change - -- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — add `userTaskState` table -- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `activeSessionId` from `taskRuntime`; remove `unread`, `draftText`, `draftAttachmentsJson`, `draftUpdatedAt` from `taskWorkbenchSessions` -- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove all `activeSessionId` reads/writes; remove draft/unread mutation functions; task status derivation should check all sessions -- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `activeSessionId` from `getCurrentRecord()` -- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — remove `activeSessionId` references in `attachTask` -- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `activeSessionId` initialization -- **`foundry/packages/client/`** — draft/unread/activeSession operations route to user actor instead of task actor -- **`foundry/packages/frontend/`** — update subscription to fetch per-user state from user actor - -### CLAUDE.md update - -- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Per-user UI state (active session tab, unread counts, draft text, draft attachments) must live on the user actor, not on shared task/session actors. Task actors hold only task-global state visible to all users. This is critical for multiplayer correctness — multiple users may view the same task simultaneously with different active sessions, unread states, and in-progress drafts." - ---- - -## [ ] 23. Delete `getTaskEnriched` and `enrichTaskRecord` (dead code) - -**Rationale:** `getTaskEnriched` is dead code with zero callers from the client. It's also the worst fan-out pattern in the codebase: org → repo actor → task actor (`.get()`) → github-data actor (`listPullRequestsForRepository` fetches ALL PRs, then `.find()`s by branch name). 
This is exactly the pattern the coordinator model eliminates — task detail comes from `getTaskDetail` on the task actor, sidebar data comes from materialized `taskSummaries` on the org actor. - -### What to delete - -- **`enrichTaskRecord()`** — `repository/actions.ts:117-143`. Fetches all PRs for a repo to find one by branch name. Dead code. -- **`getTaskEnriched` action** — `repository/actions.ts:432-450`. Only caller of `enrichTaskRecord`. Dead code. -- **`getTaskEnriched` org proxy** — `organization/actions.ts:838-849`. Only caller of the repo action. Dead code. -- **`GetTaskEnrichedCommand` type** — wherever defined. - -### Files to change - -- **`foundry/packages/backend/src/actors/repository/actions.ts`** — delete `enrichTaskRecord()` and `getTaskEnriched` action -- **`foundry/packages/backend/src/actors/organization/actions.ts`** — delete `getTaskEnriched` proxy action - ---- - -## [ ] 24. Clean up task status tracking - -**Dependencies:** item 21 - -**Rationale:** Task status tracking is spread across `c.state`, the `task` SQLite table, and the `taskRuntime` table with redundant and dead fields. Consolidate to a single `status` enum on the `task` table. Remove `statusMessage` — human-readable status text should be derived on the client from the `status` enum, not stored on the backend. - -### Fields to delete - -| Field | Location | Why | -|---|---|---| -| `initialized` | `c.state` | Dead code — never read. `status` already tracks init progress. | -| `previousStatus` | `c.state` | Dead code — never set, never read. | -| `statusMessage` | `taskRuntime` table | Client concern — the client should derive display text from the `status` enum. The backend should not store UI copy. | -| `provisionStage` | `taskRuntime` table | Redundant — `status` already encodes provision progress (`init_bootstrap_db` → `init_enqueue_provision` → `init_complete`). | -| `provisionStageUpdatedAt` | `taskRuntime` table | Dead — never read. 
| - -### What remains - -- **`status`** on the `task` table — the single canonical state machine enum. Values: `init_bootstrap_db`, `init_enqueue_provision`, `init_complete`, `running`, `idle`, `error`, `archive_*`, `kill_*`, `archived`, `killed`. - -### Files to change - -- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `statusMessage`, `provisionStage`, `provisionStageUpdatedAt` from `taskRuntime` table -- **`foundry/packages/backend/src/actors/task/index.ts`** — remove `initialized`, `previousStatus` from `createState` -- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `statusMessage` parameter from `setTaskState()`, remove it from `getCurrentRecord()` query -- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `statusMessage`, `provisionStage`, `provisionStageUpdatedAt` from taskRuntime inserts/updates; remove `ensureTaskRuntimeCacheColumns()` raw ALTER TABLE for these columns -- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — remove `statusMessage` from handler updates -- **`foundry/packages/backend/src/actors/task/workflow/push.ts`** — remove `statusMessage` updates -- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove `statusMessage` from `buildTaskDetail()`, remove `ensureTaskRuntimeCacheColumns()` for these columns -- **`foundry/packages/shared/src/workbench.ts`** — remove `statusMessage` from `WorkbenchTaskDetail` -- **`foundry/packages/frontend/`** — derive display text from `status` enum instead of reading `statusMessage` - ---- - -## [ ] 25. Remove "Workbench" prefix from all types, functions, files, and tables - -**Rationale:** "Workbench" is not a real concept in the system. It's a namespace prefix applied to every type, function, file, and table name. The actual entities are Task, Session, Repository, Sandbox, Transcript, Draft, etc. — "Workbench" adds zero information and obscures what things actually are. 
- -### Rename strategy - -Drop "Workbench" everywhere. If the result collides with an existing name (e.g., auth `Session`), use the domain prefix (e.g., `TaskSession` vs auth `Session`). - -### Type renames (`shared/src/workbench.ts`) - -| Before | After | -|---|---| -| `WorkbenchTaskStatus` | `TaskStatus` (already exists as base, merge) | -| `WorkbenchAgentKind` | `AgentKind` | -| `WorkbenchModelId` | `ModelId` | -| `WorkbenchSessionStatus` | `SessionStatus` | -| `WorkbenchTranscriptEvent` | `TranscriptEvent` | -| `WorkbenchComposerDraft` | `ComposerDraft` | -| `WorkbenchSessionSummary` | `SessionSummary` | -| `WorkbenchSessionDetail` | `SessionDetail` | -| `WorkbenchFileChange` | `FileChange` | -| `WorkbenchFileTreeNode` | `FileTreeNode` | -| `WorkbenchLineAttachment` | `LineAttachment` | -| `WorkbenchHistoryEvent` | `HistoryEvent` | -| `WorkbenchDiffLineKind` | `DiffLineKind` | -| `WorkbenchParsedDiffLine` | `ParsedDiffLine` | -| `WorkbenchPullRequestSummary` | `PullRequestSummary` | -| `WorkbenchOpenPrSummary` | `OpenPrSummary` | -| `WorkbenchSandboxSummary` | `SandboxSummary` | -| `WorkbenchTaskSummary` | `TaskSummary` | -| `WorkbenchTaskDetail` | `TaskDetail` | -| `WorkbenchRepositorySummary` | `RepositorySummary` | -| `WorkbenchSession` | `TaskSession` (avoids auth `Session` collision) | -| `WorkbenchTask` | `TaskSnapshot` (avoids `task` table collision) | -| `WorkbenchRepo` | `RepoSnapshot` | -| `WorkbenchRepositorySection` | `RepositorySection` | -| `TaskWorkbenchSnapshot` | `DashboardSnapshot` | -| `WorkbenchModelOption` | `ModelOption` | -| `WorkbenchModelGroup` | `ModelGroup` | -| `TaskWorkbenchSelectInput` | `SelectTaskInput` | -| `TaskWorkbenchCreateTaskInput` | `CreateTaskInput` | -| `TaskWorkbenchRenameInput` | `RenameTaskInput` | -| `TaskWorkbenchSendMessageInput` | `SendMessageInput` | -| `TaskWorkbenchSessionInput` | `SessionInput` | -| `TaskWorkbenchRenameSessionInput` | `RenameSessionInput` | -| `TaskWorkbenchChangeModelInput` | 
`ChangeModelInput` | -| `TaskWorkbenchUpdateDraftInput` | `UpdateDraftInput` | -| `TaskWorkbenchSetSessionUnreadInput` | `SetSessionUnreadInput` | -| `TaskWorkbenchDiffInput` | `DiffInput` | -| `TaskWorkbenchCreateTaskResponse` | `CreateTaskResponse` | -| `TaskWorkbenchAddSessionResponse` | `AddSessionResponse` | - -### File renames - -| Before | After | -|---|---| -| `shared/src/workbench.ts` | `shared/src/types.ts` (or split into `task.ts`, `session.ts`, etc.) | -| `backend/src/actors/task/workbench.ts` | `backend/src/actors/task/sessions.ts` (already planned in item 7) | -| `client/src/workbench-client.ts` | `client/src/task-client.ts` | -| `client/src/workbench-model.ts` | `client/src/model.ts` | -| `client/src/remote/workbench-client.ts` | `client/src/remote/task-client.ts` | -| `client/src/mock/workbench-client.ts` | `client/src/mock/task-client.ts` | - -### Table rename - -| Before | After | -|---|---| -| `task_workbench_sessions` | `task_sessions` | - -### Function renames (backend — drop "Workbench" infix) - -All functions in `backend/src/actors/task/workbench.ts`: -- `createWorkbenchSession` → `createSession` -- `closeWorkbenchSession` → `closeSession` -- `changeWorkbenchModel` → `changeModel` -- `sendWorkbenchMessage` → `sendMessage` -- `stopWorkbenchSession` → `stopSession` -- `renameWorkbenchBranch` → deleted (see item 26) -- `renameWorkbenchTask` → `renameTask` -- `renameWorkbenchSession` → `renameSession` -- `revertWorkbenchFile` → `revertFile` -- `publishWorkbenchPr` → `publishPr` -- `updateWorkbenchDraft` → `updateDraft` -- `setWorkbenchSessionUnread` → `setSessionUnread` -- `markWorkbenchUnread` → `markUnread` -- `syncWorkbenchSessionStatus` → `syncSessionStatus` -- `ensureWorkbenchSeeded` → `ensureSessionSeeded` - -### Queue/command type renames (backend) - -- `TaskWorkbenchValueCommand` → `TaskValueCommand` -- `TaskWorkbenchSessionTitleCommand` → `SessionTitleCommand` -- `TaskWorkbenchSessionUnreadCommand` → `SessionUnreadCommand` - -### Scope - 
-~420 occurrences across shared (35+ types), backend (200+ refs), client (324 refs), frontend (96 refs). Mechanical find-and-replace once the rename map is settled. - -### Files to change - -- **`foundry/packages/shared/src/workbench.ts`** — rename file, rename all exported types -- **`foundry/packages/shared/src/index.ts`** — update re-export path -- **`foundry/packages/shared/src/app-shell.ts`** — update `WorkbenchModelId` → `ModelId` import -- **`foundry/packages/shared/src/realtime-events.ts`** — update all `Workbench*` type imports -- **`foundry/packages/backend/src/actors/task/workbench.ts`** — rename file + all functions -- **`foundry/packages/backend/src/actors/task/index.ts`** — update imports and action registrations -- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — rename `taskWorkbenchSessions` → `taskSessions` -- **`foundry/packages/backend/src/actors/task/workflow/`** — update all workbench references -- **`foundry/packages/backend/src/actors/organization/`** — update type imports and action names -- **`foundry/packages/backend/src/actors/repository/`** — update type imports -- **`foundry/packages/client/src/`** — rename files + update all type/function references -- **`foundry/packages/frontend/src/`** — update all type imports - -### CLAUDE.md update - -Update `foundry/packages/backend/CLAUDE.md` coordinator hierarchy diagram: `taskWorkbenchSessions` → `taskSessions`. - ---- - -## [ ] 26. Delete branch rename (branches immutable after creation) - -**Dependencies:** item 25 - -**Rationale:** Branch name is assigned once at task creation and never changes. Branch rename is unused in the frontend UI and SDK, adds ~80 lines of code, and creates a transactional consistency risk (git rename succeeds but index update fails). 
- -### Delete - -- **`task/workbench.ts`** — delete `renameWorkbenchBranch()` (~50 lines) -- **`task/index.ts`** — delete `renameWorkbenchBranch` action -- **`task/workflow/queue.ts`** — remove `"task.command.workbench.rename_branch"` queue type -- **`task/workflow/index.ts`** — remove `"task.command.workbench.rename_branch"` handler -- **`organization/actions.ts`** — delete `renameWorkbenchBranch` proxy action -- **`repository/actions.ts`** — delete `registerTaskBranch` action (only caller was rename flow) -- **`client/src/workbench-client.ts`** — remove `renameBranch` from interface -- **`client/src/remote/workbench-client.ts`** — delete `renameBranch()` method -- **`client/src/mock/workbench-client.ts`** — delete `renameBranch()` method -- **`client/src/backend-client.ts`** — delete `renameWorkbenchBranch` from interface + implementation -- **`client/src/mock/backend-client.ts`** — delete `renameWorkbenchBranch` implementation -- **`frontend/src/components/mock-layout.tsx`** — remove `renameBranch` from client interface, delete `onRenameBranch` callbacks and all `renameBranch` wiring (~8 refs) -- **`shared/src/workbench.ts`** — delete `TaskWorkbenchRenameInput` (if only used by branch rename; check if task title rename shares it) - -### Keep - -- `deriveFallbackTitle()` + `sanitizeBranchName()` + `resolveCreateFlowDecision()` — initial branch derivation at creation -- `registerTaskBranchMutation()` — used during task creation for `onBranch` path -- `renameWorkbenchTask()` — title rename is independent, stays -- `taskIndex` table — still the coordinator index for branch→task mapping - ---- - -## [ ] Final audit pass (run after all items above are complete) - -### Dead code scan - -Already tracked in item 18: once all changes are complete, do a full scan to find dead actions, queues, SQLite tables, and workflow steps that need to be removed. 
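Mechanically, such a scan reduces to a set difference between what is defined or emitted and what is still referenced. A toy sketch with made-up names:

```typescript
// Toy dead-code scan: anything defined but never referenced is a removal
// candidate. The names here are hypothetical, not taken from the codebase.
const defined = new Set(["createSession", "closeSession", "renameBranch"]);
const referenced = new Set(["createSession", "closeSession"]);

// Set iteration preserves insertion order, so the result is deterministic.
const dead = [...defined].filter((name) => !referenced.has(name));
console.log(dead); // ["renameBranch"]
```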
- -### Dead events audit - -Scan all event types emitted by actors (in `packages/shared/src/realtime-events.ts` and anywhere actors call `c.broadcast()` or similar). Cross-reference against all client subscribers (in `packages/client/` and `packages/frontend/`). Remove any events that are emitted but never subscribed to by any client. This includes events that may have been superseded by the consolidated single-topic-per-actor pattern (item 14). diff --git a/foundry/README.md b/foundry/README.md index 47501ef..f65d93e 100644 --- a/foundry/README.md +++ b/foundry/README.md @@ -1,6 +1,6 @@ # Foundry -TypeScript organization task system powered by RivetKit actors, SQLite/Drizzle state, and OpenTUI. +TypeScript workspace task system powered by RivetKit actors, SQLite/Drizzle state, and OpenTUI. **Documentation**: see `../docs/` in the repository root @@ -12,12 +12,12 @@ pnpm install pnpm -w build ``` -## Repository Goals +## Project Goals - **Simple**: There's one screen. It has everything you need. You can use it blindfolded. - **Fast**: No waiting around. - **Collaborative**: Built for fast moving teams that need code reviewed & shipped fast. -- **Pluggable**: Works for small side repositories to enterprise teams. +- **Pluggable**: Works for small side projects to enterprise teams. ## License diff --git a/foundry/compose.dev.yaml b/foundry/compose.dev.yaml index 7fa492d..a66a8c6 100644 --- a/foundry/compose.dev.yaml +++ b/foundry/compose.dev.yaml @@ -14,10 +14,6 @@ services: HF_BACKEND_HOST: "0.0.0.0" HF_BACKEND_PORT: "7741" RIVETKIT_STORAGE_PATH: "/root/.local/share/foundry/rivetkit" - RIVET_LOG_ERROR_STACK: "${RIVET_LOG_ERROR_STACK:-1}" - RIVET_LOG_LEVEL: "${RIVET_LOG_LEVEL:-debug}" - RIVET_LOG_TIMESTAMP: "${RIVET_LOG_TIMESTAMP:-1}" - FOUNDRY_LOG_LEVEL: "${FOUNDRY_LOG_LEVEL:-debug}" # Pass through credentials needed for agent execution + PR creation in dev/e2e. # Do not hardcode secrets; set these in your environment when starting compose. 
ANTHROPIC_API_KEY: "${ANTHROPIC_API_KEY:-}" @@ -43,11 +39,6 @@ services: STRIPE_SECRET_KEY: "${STRIPE_SECRET_KEY:-}" STRIPE_WEBHOOK_SECRET: "${STRIPE_WEBHOOK_SECRET:-}" STRIPE_PRICE_TEAM: "${STRIPE_PRICE_TEAM:-}" - FOUNDRY_SANDBOX_PROVIDER: "${FOUNDRY_SANDBOX_PROVIDER:-local}" - HF_LOCAL_SANDBOX_IMAGE: "${HF_LOCAL_SANDBOX_IMAGE:-rivetdev/sandbox-agent:foundry-base-latest}" - E2B_API_KEY: "${E2B_API_KEY:-}" - E2B_TEMPLATE: "${E2B_TEMPLATE:-}" - HF_E2B_TEMPLATE: "${HF_E2B_TEMPLATE:-${E2B_TEMPLATE:-}}" DAYTONA_ENDPOINT: "${DAYTONA_ENDPOINT:-}" DAYTONA_API_KEY: "${DAYTONA_API_KEY:-}" HF_DAYTONA_ENDPOINT: "${HF_DAYTONA_ENDPOINT:-}" @@ -57,15 +48,19 @@ services: - "7741:7741" volumes: - "..:/app" + # The linked RivetKit checkout resolves from Foundry packages to /task/rivet-checkout in-container. + - "../../../task/rivet-checkout:/task/rivet-checkout:ro" # Reuse the host Codex auth profile for local sandbox-agent Codex sessions in dev. - "${HOME}/.codex:/root/.codex" - - "/var/run/docker.sock:/var/run/docker.sock" # Keep backend dependency installs Linux-native instead of using host node_modules. - "foundry_backend_root_node_modules:/app/node_modules" - "foundry_backend_backend_node_modules:/app/foundry/packages/backend/node_modules" - "foundry_backend_shared_node_modules:/app/foundry/packages/shared/node_modules" + - "foundry_backend_persist_rivet_node_modules:/app/sdks/persist-rivet/node_modules" - "foundry_backend_typescript_node_modules:/app/sdks/typescript/node_modules" - "foundry_backend_pnpm_store:/root/.local/share/pnpm/store" + # Persist backend-managed local git clones across container restarts. + - "foundry_git_repos:/root/.local/share/foundry/repos" # Persist RivetKit local storage across container restarts. - "foundry_rivetkit_storage:/root/.local/share/foundry/rivetkit" @@ -85,40 +80,22 @@ services: - "..:/app" # Ensure logs in .foundry/ persist on the host even if we change source mounts later. 
- "./.foundry:/app/foundry/.foundry" - # Use Linux-native repo dependencies inside the container instead of host node_modules. + - "../../../task/rivet-checkout:/task/rivet-checkout:ro" + # Use Linux-native workspace dependencies inside the container instead of host node_modules. - "foundry_node_modules:/app/node_modules" - "foundry_client_node_modules:/app/foundry/packages/client/node_modules" - "foundry_frontend_node_modules:/app/foundry/packages/frontend/node_modules" - "foundry_shared_node_modules:/app/foundry/packages/shared/node_modules" - "foundry_pnpm_store:/tmp/.local/share/pnpm/store" - smee: - image: node:20-alpine - depends_on: - - backend - env_file: - - path: .env - required: false - environment: - SMEE_URL: "${SMEE_URL:-}" - SMEE_TARGET: "${SMEE_TARGET:-http://backend:7741/v1/webhooks/github}" - command: - - /bin/sh - - -lc - - | - if [ -z "$SMEE_URL" ]; then - echo "SMEE_URL is required for local GitHub webhook forwarding" >&2 - exit 1 - fi - exec npx --yes smee-client --url "$SMEE_URL" --target "$SMEE_TARGET" - restart: unless-stopped - volumes: foundry_backend_root_node_modules: {} foundry_backend_backend_node_modules: {} foundry_backend_shared_node_modules: {} + foundry_backend_persist_rivet_node_modules: {} foundry_backend_typescript_node_modules: {} foundry_backend_pnpm_store: {} + foundry_git_repos: {} foundry_rivetkit_storage: {} foundry_node_modules: {} foundry_client_node_modules: {} diff --git a/foundry/compose.mock.yaml b/foundry/compose.mock.yaml index 6c57875..c4a06ff 100644 --- a/foundry/compose.mock.yaml +++ b/foundry/compose.mock.yaml @@ -15,6 +15,7 @@ services: volumes: - "..:/app" - "./.foundry:/app/foundry/.foundry" + - "../../../task/rivet-checkout:/task/rivet-checkout:ro" - "mock_node_modules:/app/node_modules" - "mock_client_node_modules:/app/foundry/packages/client/node_modules" - "mock_frontend_node_modules:/app/foundry/packages/frontend/node_modules" diff --git a/foundry/compose.preview.yaml b/foundry/compose.preview.yaml 
index aa43b52..6213885 100644 --- a/foundry/compose.preview.yaml +++ b/foundry/compose.preview.yaml @@ -24,6 +24,7 @@ services: - "7841:7841" volumes: - "${HOME}/.codex:/root/.codex" + - "foundry_preview_git_repos:/root/.local/share/foundry/repos" - "foundry_preview_rivetkit_storage:/root/.local/share/foundry/rivetkit" frontend: @@ -37,4 +38,5 @@ services: - "4273:4273" volumes: + foundry_preview_git_repos: {} foundry_preview_rivetkit_storage: {} diff --git a/foundry/docker/backend.Dockerfile b/foundry/docker/backend.Dockerfile index ae14ddf..c41fd1f 100644 --- a/foundry/docker/backend.Dockerfile +++ b/foundry/docker/backend.Dockerfile @@ -13,13 +13,13 @@ RUN pnpm --filter @sandbox-agent/foundry-shared build RUN pnpm --filter acp-http-client build RUN pnpm --filter @sandbox-agent/cli-shared build RUN SKIP_OPENAPI_GEN=1 pnpm --filter sandbox-agent build +RUN pnpm --filter @sandbox-agent/persist-rivet build RUN pnpm --filter @sandbox-agent/foundry-backend build RUN pnpm --filter @sandbox-agent/foundry-backend deploy --prod /out FROM oven/bun:1.2 AS runtime ENV NODE_ENV=production ENV HOME=/home/task -ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version WORKDIR /app RUN apt-get update \ && apt-get install -y --no-install-recommends \ @@ -32,8 +32,6 @@ RUN addgroup --system --gid 1001 task \ && adduser --system --uid 1001 --home /home/task --ingroup task task \ && mkdir -p /home/task \ && chown -R task:task /home/task /app -RUN mkdir -p /etc/foundry \ - && date +%s > /etc/foundry/rivet-runner-version COPY --from=build /out ./ USER task EXPOSE 7741 diff --git a/foundry/docker/backend.dev.Dockerfile b/foundry/docker/backend.dev.Dockerfile index c4b6c3a..0182aa5 100644 --- a/foundry/docker/backend.dev.Dockerfile +++ b/foundry/docker/backend.dev.Dockerfile @@ -2,6 +2,7 @@ FROM oven/bun:1.3 +ARG GIT_SPICE_VERSION=v0.23.0 ARG SANDBOX_AGENT_VERSION=0.3.0 RUN apt-get update \ @@ -17,13 +18,24 @@ RUN apt-get update \ RUN npm install -g pnpm@10.28.2 +RUN set -eux; \ + 
arch="$(dpkg --print-architecture)"; \ + case "$arch" in \ + amd64) spice_arch="x86_64" ;; \ + arm64) spice_arch="aarch64" ;; \ + *) echo "Unsupported architecture for git-spice: $arch" >&2; exit 1 ;; \ + esac; \ + tmpdir="$(mktemp -d)"; \ + curl -fsSL "https://github.com/abhinav/git-spice/releases/download/${GIT_SPICE_VERSION}/git-spice.Linux-${spice_arch}.tar.gz" -o "${tmpdir}/git-spice.tgz"; \ + tar -xzf "${tmpdir}/git-spice.tgz" -C "${tmpdir}"; \ + install -m 0755 "${tmpdir}/gs" /usr/local/bin/gs; \ + ln -sf /usr/local/bin/gs /usr/local/bin/git-spice; \ + rm -rf "${tmpdir}" + RUN curl -fsSL "https://releases.rivet.dev/sandbox-agent/${SANDBOX_AGENT_VERSION}/install.sh" | sh ENV PATH="/root/.local/bin:${PATH}" ENV SANDBOX_AGENT_BIN="/root/.local/bin/sandbox-agent" -ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version -RUN mkdir -p /etc/foundry \ - && date +%s > /etc/foundry/rivet-runner-version WORKDIR /app diff --git a/foundry/docker/backend.preview.Dockerfile b/foundry/docker/backend.preview.Dockerfile index 91cd7c7..8c30ae0 100644 --- a/foundry/docker/backend.preview.Dockerfile +++ b/foundry/docker/backend.preview.Dockerfile @@ -2,6 +2,7 @@ FROM oven/bun:1.3 +ARG GIT_SPICE_VERSION=v0.23.0 ARG SANDBOX_AGENT_VERSION=0.3.0 RUN apt-get update \ @@ -16,17 +17,29 @@ RUN apt-get update \ && npm install -g pnpm@10.28.2 \ && rm -rf /var/lib/apt/lists/* +RUN set -eux; \ + arch="$(dpkg --print-architecture)"; \ + case "$arch" in \ + amd64) spice_arch="x86_64" ;; \ + arm64) spice_arch="aarch64" ;; \ + *) echo "Unsupported architecture for git-spice: $arch" >&2; exit 1 ;; \ + esac; \ + tmpdir="$(mktemp -d)"; \ + curl -fsSL "https://github.com/abhinav/git-spice/releases/download/${GIT_SPICE_VERSION}/git-spice.Linux-${spice_arch}.tar.gz" -o "${tmpdir}/git-spice.tgz"; \ + tar -xzf "${tmpdir}/git-spice.tgz" -C "${tmpdir}"; \ + install -m 0755 "${tmpdir}/gs" /usr/local/bin/gs; \ + ln -sf /usr/local/bin/gs /usr/local/bin/git-spice; \ + rm -rf "${tmpdir}" + RUN curl 
-fsSL "https://releases.rivet.dev/sandbox-agent/${SANDBOX_AGENT_VERSION}/install.sh" | sh ENV PATH="/root/.local/bin:${PATH}" ENV SANDBOX_AGENT_BIN="/root/.local/bin/sandbox-agent" -ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version -RUN mkdir -p /etc/foundry \ - && date +%s > /etc/foundry/rivet-runner-version WORKDIR /workspace/quebec COPY quebec /workspace/quebec +COPY rivet-checkout /workspace/rivet-checkout RUN pnpm install --frozen-lockfile RUN pnpm --filter @sandbox-agent/foundry-shared build diff --git a/foundry/docker/foundry-base.Dockerfile b/foundry/docker/foundry-base.Dockerfile deleted file mode 100644 index b4b9e26..0000000 --- a/foundry/docker/foundry-base.Dockerfile +++ /dev/null @@ -1,190 +0,0 @@ -# syntax=docker/dockerfile:1.10.0 -# -# Foundry base sandbox image. -# -# Builds sandbox-agent from source (reusing the upstream Dockerfile.full build -# stages) and layers Foundry-specific tooling on top: sudo, git, neovim, gh, -# node, bun, chromium, and agent-browser. -# -# Build: -# docker build --platform linux/amd64 \ -# -f foundry/docker/foundry-base.Dockerfile \ -# -t rivetdev/sandbox-agent:foundry-base- . -# -# Must be invoked from the repository root so the COPY . picks up the full -# source tree for the Rust + inspector build stages. - -# ============================================================================ -# Build inspector frontend -# ============================================================================ -FROM --platform=linux/amd64 node:22-alpine AS inspector-build -WORKDIR /app -RUN npm install -g pnpm - -COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ -COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ -COPY sdks/cli-shared/package.json ./sdks/cli-shared/ -COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ -COPY sdks/react/package.json ./sdks/react/ -COPY sdks/typescript/package.json ./sdks/typescript/ - -RUN pnpm install --filter @sandbox-agent/inspector... 
- -COPY docs/openapi.json ./docs/ -COPY sdks/cli-shared ./sdks/cli-shared -COPY sdks/acp-http-client ./sdks/acp-http-client -COPY sdks/react ./sdks/react -COPY sdks/typescript ./sdks/typescript - -RUN cd sdks/cli-shared && pnpm exec tsup -RUN cd sdks/acp-http-client && pnpm exec tsup -RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup -RUN cd sdks/react && pnpm exec tsup - -COPY frontend/packages/inspector ./frontend/packages/inspector -RUN cd frontend/packages/inspector && pnpm exec vite build - -# ============================================================================ -# AMD64 Builder - sandbox-agent static binary -# ============================================================================ -FROM --platform=linux/amd64 rust:1.88.0 AS builder - -ENV DEBIAN_FRONTEND=noninteractive - -RUN apt-get update && apt-get install -y \ - musl-tools \ - musl-dev \ - llvm-14-dev \ - libclang-14-dev \ - clang-14 \ - libssl-dev \ - pkg-config \ - ca-certificates \ - g++ \ - g++-multilib \ - git \ - curl \ - wget && \ - rm -rf /var/lib/apt/lists/* - -RUN wget -q https://github.com/cross-tools/musl-cross/releases/latest/download/x86_64-unknown-linux-musl.tar.xz && \ - tar -xf x86_64-unknown-linux-musl.tar.xz -C /opt/ && \ - rm x86_64-unknown-linux-musl.tar.xz && \ - rustup target add x86_64-unknown-linux-musl - -ENV PATH="/opt/x86_64-unknown-linux-musl/bin:$PATH" \ - LIBCLANG_PATH=/usr/lib/llvm-14/lib \ - CLANG_PATH=/usr/bin/clang-14 \ - CC_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-gcc \ - CXX_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-g++ \ - AR_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-ar \ - CARGO_TARGET_X86_64_UNKNOWN_LINUX_MUSL_LINKER=x86_64-unknown-linux-musl-gcc \ - CARGO_INCREMENTAL=0 \ - CARGO_NET_GIT_FETCH_WITH_CLI=true - -ENV SSL_VER=1.1.1w -RUN wget https://www.openssl.org/source/openssl-$SSL_VER.tar.gz && \ - tar -xzf openssl-$SSL_VER.tar.gz && \ - cd openssl-$SSL_VER && \ - ./Configure no-shared no-async --prefix=/musl 
--openssldir=/musl/ssl linux-x86_64 && \ - make -j$(nproc) && \ - make install_sw && \ - cd .. && \ - rm -rf openssl-$SSL_VER* - -ENV OPENSSL_DIR=/musl \ - OPENSSL_INCLUDE_DIR=/musl/include \ - OPENSSL_LIB_DIR=/musl/lib \ - PKG_CONFIG_ALLOW_CROSS=1 \ - RUSTFLAGS="-C target-feature=+crt-static -C link-arg=-static-libgcc" - -WORKDIR /build -COPY . . - -COPY --from=inspector-build /app/frontend/packages/inspector/dist ./frontend/packages/inspector/dist - -RUN --mount=type=cache,target=/usr/local/cargo/registry \ - --mount=type=cache,target=/usr/local/cargo/git \ - --mount=type=cache,target=/build/target \ - cargo build -p sandbox-agent --release --target x86_64-unknown-linux-musl -j4 && \ - cp target/x86_64-unknown-linux-musl/release/sandbox-agent /sandbox-agent - -# ============================================================================ -# Runtime - Foundry base sandbox image -# ============================================================================ -FROM --platform=linux/amd64 node:22-bookworm-slim - -ENV DEBIAN_FRONTEND=noninteractive - -# --- System packages -------------------------------------------------------- -RUN apt-get update && apt-get install -y --no-install-recommends \ - bash \ - ca-certificates \ - curl \ - git \ - gnupg \ - neovim \ - sudo \ - unzip \ - wget \ - # Chromium and its runtime deps - chromium \ - fonts-liberation \ - libasound2 \ - libatk-bridge2.0-0 \ - libatk1.0-0 \ - libcups2 \ - libdbus-1-3 \ - libdrm2 \ - libgbm1 \ - libgtk-3-0 \ - libnspr4 \ - libnss3 \ - libx11-xcb1 \ - libxcomposite1 \ - libxdamage1 \ - libxrandr2 \ - xdg-utils \ - && rm -rf /var/lib/apt/lists/* - -# --- GitHub CLI (gh) ------------------------------------------------------- -RUN curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg \ - | dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg \ - && chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg \ - && echo "deb [arch=$(dpkg --print-architecture) 
signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" \ - > /etc/apt/sources.list.d/github-cli.list \ - && apt-get update && apt-get install -y gh \ - && rm -rf /var/lib/apt/lists/* - -# --- Bun -------------------------------------------------------------------- -RUN curl -fsSL https://bun.sh/install | bash \ - && mv /root/.bun/bin/bun /usr/local/bin/bun \ - && ln -sf /usr/local/bin/bun /usr/local/bin/bunx \ - && rm -rf /root/.bun - -# --- sandbox-agent binary (from local build) -------------------------------- -COPY --from=builder /sandbox-agent /usr/local/bin/sandbox-agent -RUN chmod +x /usr/local/bin/sandbox-agent - -# --- sandbox user with passwordless sudo ------------------------------------ -RUN useradd -m -s /bin/bash sandbox \ - && echo "sandbox ALL=(ALL) NOPASSWD:ALL" > /etc/sudoers.d/sandbox \ - && chmod 0440 /etc/sudoers.d/sandbox - -USER sandbox -WORKDIR /home/sandbox - -# Point Chromium/Playwright at the system binary -ENV CHROME_PATH=/usr/bin/chromium -ENV CHROMIUM_PATH=/usr/bin/chromium -ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium -ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true - -# --- Install all sandbox-agent agents + agent-browser ----------------------- -RUN sandbox-agent install-agent --all -RUN sudo npm install -g agent-browser - -EXPOSE 2468 - -ENTRYPOINT ["sandbox-agent"] -CMD ["server", "--host", "0.0.0.0", "--port", "2468"] diff --git a/foundry/docker/frontend.dev.Dockerfile b/foundry/docker/frontend.dev.Dockerfile index dd74dd0..3b0d8e4 100644 --- a/foundry/docker/frontend.dev.Dockerfile +++ b/foundry/docker/frontend.dev.Dockerfile @@ -8,4 +8,4 @@ RUN npm install -g pnpm@10.28.2 WORKDIR /app -CMD ["bash", "-lc", "pnpm install --frozen-lockfile --filter @sandbox-agent/foundry-frontend... && cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"] +CMD ["bash", "-lc", "pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-frontend... 
&& cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"] diff --git a/foundry/docker/frontend.preview.Dockerfile b/foundry/docker/frontend.preview.Dockerfile index dd10422..05cbba7 100644 --- a/foundry/docker/frontend.preview.Dockerfile +++ b/foundry/docker/frontend.preview.Dockerfile @@ -7,6 +7,7 @@ RUN npm install -g pnpm@10.28.2 WORKDIR /workspace/quebec COPY quebec /workspace/quebec +COPY rivet-checkout /workspace/rivet-checkout RUN pnpm install --frozen-lockfile RUN pnpm --filter @sandbox-agent/foundry-shared build diff --git a/foundry/packages/backend/CLAUDE.md b/foundry/packages/backend/CLAUDE.md index f7e054d..949db90 100644 --- a/foundry/packages/backend/CLAUDE.md +++ b/foundry/packages/backend/CLAUDE.md @@ -5,334 +5,33 @@ Keep the backend actor tree aligned with this shape unless we explicitly decide to change it: ```text -OrganizationActor (direct coordinator for tasks) -├─ AuditLogActor (organization-scoped global feed) -├─ GithubDataActor -├─ TaskActor(task) -│ ├─ taskSessions → session metadata/transcripts -│ └─ taskSandboxes → sandbox instance index -└─ SandboxInstanceActor(sandboxProviderId, sandboxId) × N +WorkspaceActor +├─ HistoryActor(workspace-scoped global feed) +├─ ProjectActor(repo) +│ ├─ ProjectBranchSyncActor +│ ├─ ProjectPrSyncActor +│ └─ TaskActor(task) +│ ├─ TaskSessionActor(session) × N +│ │ └─ SessionStatusSyncActor(session) × 0..1 +│ └─ Task-local workbench state +└─ SandboxInstanceActor(providerId, sandboxId) × N ``` -## Coordinator Pattern - -Actors follow a coordinator pattern where each coordinator is responsible for: -1. **Index tables** — keeping a local SQLite index/summary of its child actors' data -2. **Create/destroy** — handling lifecycle of child actors -3. **Routing** — resolving lookups to the correct child actor - -Children push updates **up** to their direct coordinator only. Coordinators broadcast changes to connected clients. This keeps the read path local (no fan-out to children). 
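The coordinator pattern above can be sketched framework-free; `Coordinator`, `TaskSummary`, and the in-memory `Map` standing in for the SQLite index table are all illustrative, not the real RivetKit API:

```typescript
// Illustrative coordinator: children push summaries up, the coordinator
// updates its local index and broadcasts. Reads never fan out to children.
interface TaskSummary {
  taskId: string;
  title: string;
  status: "open" | "done";
}

class Coordinator {
  // Stands in for the coordinator's local SQLite index table.
  private index = new Map<string, TaskSummary>();
  private subscribers: Array<(s: TaskSummary) => void> = [];

  // Children push updates up; the coordinator never reads child state directly.
  applySummaryUpdate(summary: TaskSummary): void {
    this.index.set(summary.taskId, summary);
    for (const notify of this.subscribers) notify(summary); // broadcast to clients
  }

  // Read path stays local: served from the index, no fan-out to children.
  listSummaries(): TaskSummary[] {
    return [...this.index.values()];
  }

  subscribe(fn: (s: TaskSummary) => void): void {
    this.subscribers.push(fn);
  }
}

const org = new Coordinator();
org.applySummaryUpdate({ taskId: "t1", title: "Fix login", status: "open" });
```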
- -### Coordinator hierarchy and index tables - -```text -OrganizationActor (coordinator for tasks + auth users) -│ -│ Index tables: -│ ├─ taskIndex → TaskActor index (taskId → repoId + branchName) -│ ├─ taskSummaries → TaskActor materialized sidebar projection -│ ├─ authSessionIndex → UserActor index (session token → userId) -│ ├─ authEmailIndex → UserActor index (email → userId) -│ └─ authAccountIndex → UserActor index (OAuth account → userId) -│ -├─ TaskActor (coordinator for sessions + sandboxes) -│ │ -│ │ Index tables: -│ │ ├─ taskWorkspaceSessions → Session index (session metadata + transcript) -│ │ └─ taskSandboxes → SandboxInstanceActor index (sandbox history) -│ │ -│ └─ SandboxInstanceActor (leaf) -│ -├─ AuditLogActor (organization-scoped audit log, not a coordinator) -└─ GithubDataActor (GitHub API cache, not a coordinator) -``` - -When adding a new index table, annotate it in the schema file with a doc comment identifying it as a coordinator index and which child actor it indexes (see existing examples). - -## GitHub Sync Data Model - -The GithubDataActor syncs **repositories** and **pull requests** from GitHub, not branches. We only need repos (to know which repos exist and their metadata) and PRs (to lazily populate virtual tasks in the sidebar). Branch data is not synced because we only create tasks from PRs or fresh user-initiated creation, never from bare branches. Generated branch names for new tasks are treated as unique enough to skip conflict detection against remote branches. - -Tasks are either: -1. **Created fresh** by the user (no PR yet, branch name generated from task description) -2. **Lazily populated from pull requests** during PR sync (virtual task entries in org tables, no actor spawned) - -## Lazy Task Actor Creation — CRITICAL - -**Task actors must NEVER be created during GitHub sync or bulk operations.** Creating hundreds of task actors simultaneously causes OOM crashes. 
An org can have 200+ PRs; spawning an actor per PR kills the process. - -### The two creation points - -There are exactly **two** places that may create a task actor: - -1. **`createTaskMutation`** in `task-mutations.ts` — the only backend code that calls `getOrCreateTask`. Triggered by explicit user action ("New Task" button). One actor at a time. - -2. **`backend-client.ts` client helper** — calls `client.task.getOrCreate(...)`. This is the lazy materialization point: when a user clicks a virtual task in the sidebar, the client creates the actor, and it self-initializes in `getCurrentRecord()` (`workflow/common.ts`) by reading branch/title from the org's `getTaskIndexEntry` action. - -### The rule - -**Never use `getOrCreateTask` inside a sync loop, webhook handler, or any bulk operation.** That's what caused the OOM — 186 actors spawned simultaneously during PR sync. - -`getOrCreateTask` IS allowed in: -- `createTaskMutation` — explicit user "New Task" action -- `requireWorkspaceTask` — user-initiated actions (createSession, sendMessage, etc.) that may hit a virtual task -- `getTask` action on the org — called by sandbox actor and client, needs to materialize virtual tasks -- `backend-client.ts` client helper — lazy materialization when user views a task - -### Virtual tasks (PR-driven) - -During PR sync, `refreshTaskSummaryForBranchMutation` is called for every changed PR (via github-data's `emitPullRequestChangeEvents`). It writes **virtual task entries** to the org actor's local `taskIndex` + `taskSummaries` tables only. No task actor is spawned. No cross-actor calls to task actors. - -When the user interacts with a virtual task (clicks it, creates a session): -1. Client or org actor calls `getOrCreate` on the task actor key → actor is created with empty DB -2.
Any action on the actor calls `getCurrentRecord()` → sees empty DB → reads branch/title from org's `getTaskIndexEntry` → calls `initBootstrapDbActivity` + `initCompleteActivity` → task is now real - -### Call sites to watch - -- `refreshTaskSummaryForBranchMutation` — called in bulk during sync. Must ONLY write to org local tables. Never create task actors or call task actor actions. -- `emitPullRequestChangeEvents` in github-data — iterates all changed PRs. Must remain fire-and-forget with no actor fan-out. - -## Queue vs Action Decision Framework - -The default is a direct action. Use a queue only if the answer to one or more of these questions is **yes**. - -Actions are pure RPCs with no DB overhead on send — fast, but if the call fails the operation is lost. Queues persist the message to the database on send, guaranteeing it will be processed even if the target actor is busy, slow, or recovering. The tradeoff: queues add write overhead and serialize processing. - -### 1. Does this operation coordinate multi-step work? - -Does it involve external I/O (sandbox API, GitHub API, agent process management) or state machine transitions where interleaving would corrupt state? This is different from database-level serialization — a simple read-then-write on SQLite can use a transaction. The queue is for ordering operations that span DB writes + external I/O. - -**Queue examples:** -- `workspace.send_message` — sends to sandbox agent, writes session status, does owner-swap. Multi-step with external I/O. -- `push` / `sync` / `merge` — git operations in sandbox that must not interleave. -- `createTask` — read-then-write across task index + actor creation. Returns result, so `wait: true`. - -**Action examples:** -- `billing.stripe_customer.apply` — single column upsert, no external I/O. -- `workspace.update_draft` — writes draft text, no coordination with sandbox ops. -- `workspace.rename_task` — updates title column, queue handlers don't touch title. - -### 2. 
Must this message be processed no matter what?
-
-Is this a cross-actor fire-and-forget where the caller won't retry and data loss is unacceptable? A queue persists the message — if the target is down, it waits. An action RPC that fails is gone.
-
-**Queue examples:**
-- `audit.append` — caller must never be affected by audit failures, and audit entries must not be lost.
-- `applyTaskSummaryUpdate` — task actor pushes summary to org and moves on. Won't retry if org is busy.
-- `refreshTaskSummaryForBranch` — webhook-driven, won't be redelivered for the same event.
-
-**Action examples:**
-- `billing.invoice.upsert` — Stripe retries handle failures externally. No durability need on our side.
-- `workspace.mark_unread` — UI convenience state. Acceptable to lose on transient failure.
-- `github.webhook_receipt.record` — timestamp columns with no downstream effects.
-
-### Once on a queue: wait or fire-and-forget?
-
-If the caller needs a return value, use `wait: true`. If the UI updates via push events, use `wait: false`.
-
-Full migration plan: `QUEUE_TO_ACTION_MIGRATION.md`.
-
 ## Ownership Rules
-- `OrganizationActor` is the organization coordinator, direct coordinator for tasks, and lookup/index owner. It owns the task index, task summaries, and repo catalog.
-- `AuditLogActor` is organization-scoped. There is one organization-level audit log feed.
+- `WorkspaceActor` is the workspace coordinator and lookup/index owner.
+- `HistoryActor` is workspace-scoped. There is one workspace-level history feed.
+- `ProjectActor` is the repo coordinator and owns repo-local caches/indexes.
 - `TaskActor` is one branch. Treat `1 task = 1 branch` once branch assignment is finalized.
 - `TaskActor` can have many sessions.
 - `TaskActor` can reference many sandbox instances historically, but should have only one active sandbox/session at a time.
-- Session unread state and draft prompts are backend-owned workspace state, not frontend-local state.
-- Branch names are immutable after task creation. Do not implement branch-rename flows.
+- Session unread state and draft prompts are backend-owned workbench state, not frontend-local state.
+- Branch rename is a real git operation, not just metadata.
 - `SandboxInstanceActor` stays separate from `TaskActor`; tasks/sessions reference it by identity.
-- The backend stores no local git state. No clones, no refs, no working trees, and no git-spice. Repository metadata comes from GitHub API data and webhook events. Any working-tree git operation runs inside a sandbox via `executeInSandbox()`.
+- Sync actors are polling workers only. They feed parent actors and should not become the source of truth.
 - When a backend request path must aggregate multiple independent actor calls or reads, prefer bounded parallelism over sequential fan-out when correctness permits. Do not serialize independent work by default.
-- Only a coordinator creates/destroys its children. Do not create child actors from outside the coordinator.
-- Children push state changes up to their direct coordinator only. Task actors push summary updates directly to the organization actor.
-- Read paths must use the coordinator's local index tables. Do not fan out to child actors on the hot read path.
-- Never build "enriched" read actions that chain through multiple actors (e.g., coordinator → child actor → sibling actor). If data from multiple actors is needed for a read, it should already be materialized in the coordinator's index tables via push updates. If it's not there, fix the write path to push it — do not add a fan-out read path.
-
-## Drizzle Migration Maintenance
-
-After changing any actor's `db/schema.ts`, you **must** regenerate the corresponding migration so the runtime creates the tables that match the schema. Forgetting this step causes `no such table` errors at runtime.
-
-1. **Generate a new drizzle migration.** Run from `packages/backend`:
-   ```bash
-   npx drizzle-kit generate --config=./src/actors//db/drizzle.config.ts
-   ```
-   If the interactive prompt is unavailable (e.g. in a non-TTY), manually create a new `.sql` file under `./src/actors//db/drizzle/` and add the corresponding entry to `meta/_journal.json`.
-
-2. **Regenerate the compiled `migrations.ts`.** Run from the foundry root:
-   ```bash
-   npx tsx packages/backend/src/actors/_scripts/generate-actor-migrations.ts
-   ```
-
-3. **Verify insert/upsert calls.** Every column with `.notNull()` (and no `.default(...)`) must be provided a value in all `insert()` and `onConflictDoUpdate()` calls. Missing a NOT NULL column causes a runtime constraint violation, not a type error.
-
-4. **Nuke RivetKit state in dev** after migration changes to start fresh:
-   ```bash
-   docker compose -f compose.dev.yaml down
-   docker volume rm foundry_foundry_rivetkit_storage
-   docker compose -f compose.dev.yaml up -d
-   ```
-
-Actors with drizzle migrations: `organization`, `audit-log`, `task`. Other actors (`user`, `github-data`) use inline migrations without drizzle.
-
-## Workflow Step Nesting — FORBIDDEN
-
-**Never call `c.step()` / `ctx.step()` from inside another step's `run` callback.** RivetKit workflow steps cannot be nested. Doing so causes the runtime error: *"Cannot start a new workflow entry while another is in progress."*
-
-This means:
-- Functions called from within a step `run` callback must NOT use `c.step()`, `c.loop()`, `c.sleep()`, or `c.queue.next()`.
-- If a mutation function needs to be called both from a step and standalone, it must only do plain DB/API work — no workflow primitives. The workflow step wrapping belongs in the workflow file, not in the mutation.
-- Helper wrappers that conditionally call `c.step()` (like a `runSyncStep` pattern) are dangerous — if the caller is already inside a step, the nested `c.step()` will crash at runtime with no compile-time warning.
-
-**Rule of thumb:** Workflow primitives (`step`, `loop`, `sleep`, `queue.next`) may only appear at the top level of a workflow function or inside a `loop` callback — never inside a step's `run`.
-
-## SQLite Constraints
-
-- Single-row tables must use an integer primary key with `CHECK (id = 1)` to enforce the singleton invariant at the database level.
-- Follow the task actor pattern for metadata/profile rows and keep the fixed row id in code as `1`, not a string sentinel.
-
-## Multiplayer Correctness
-
-Per-user UI state must live on the user actor, not on shared task/session actors. This is critical for multiplayer — multiple users may view the same task simultaneously with different active sessions, unread states, and in-progress drafts.
-
-**Per-user state (user actor):** active session tab, unread counts, draft text, draft attachments. Keyed by `(userId, taskId, sessionId)`.
-
-**Task-global state (task actor):** session transcript, session model, session runtime status, sandbox identity, task status, branch name, PR state. These are shared across all users viewing the task — that is correct behavior.
-
-Do not store per-user preferences, selections, or ephemeral UI state on shared actors. If a field's value should differ between two users looking at the same task, it belongs on the user actor.
-
-## Audit Log Maintenance
-
-Every new action or command handler that represents a user-visible or workflow-significant event must append to the audit log actor. The audit log must remain a comprehensive record of significant operations.
-
-## Debugging Actors
-
-### RivetKit Inspector UI
-
-The RivetKit inspector UI at `http://localhost:6420/ui/` is the most reliable way to debug actor state in local development. The inspector HTTP API (`/inspector/workflow-history`) has a known bug where it returns empty `{}` even when the workflow has entries — always cross-check with the UI.
-
-**Useful inspector URL pattern:**
-```
-http://localhost:6420/ui/?u=http%3A%2F%2F127.0.0.1%3A6420&ns=default&r=default&n=[%22%22]&actorId=&tab=
-```
-
-Tabs: `workflow`, `database`, `state`, `queue`, `connections`, `metadata`.
-
-**To find actor IDs:**
-```bash
-curl -s 'http://127.0.0.1:6420/actors?name=organization'
-```
-
-**To query actor DB via bun (inside container):**
-```bash
-docker compose -f compose.dev.yaml exec -T backend bun -e '
-  var Database = require("bun:sqlite");
-  var db = new Database("/root/.local/share/foundry/rivetkit/databases/.db", { readonly: true });
-  console.log(JSON.stringify(db.query("SELECT name FROM sqlite_master WHERE type=?").all("table")));
-'
-```
-
-**To call actor actions via inspector:**
-```bash
-curl -s -X POST 'http://127.0.0.1:6420/gateway//inspector/action/' \
-  -H 'Content-Type: application/json' -d '{"args":[{}]}'
-```
-
-### Known inspector API bugs
-
-- `GET /inspector/workflow-history` may return `{"history":{}}` even when workflow has run. Use the UI's Workflow tab instead.
-- `GET /inspector/queue` is reliable for checking pending messages.
-- `GET /inspector/state` is reliable for checking actor state.
-
-## Inbox & Notification System
-
-The user actor owns two per-user systems: a **task feed** (sidebar ordering) and **notifications** (discrete events). These are distinct concepts that share a common "bump" mechanism.
-
-### Core distinction: bumps vs. notifications
-
-A **bump** updates the task's position in the user's sidebar feed. A **notification** is a discrete event entry shown in the notification panel. Every notification also triggers a bump, but not every bump creates a notification.
-
-| Event | Bumps task? | Creates notification? |
-|-------|-------------|----------------------|
-| User sends a message | Yes | No |
-| User opens/clicks a task | Yes | No |
-| User creates a session | Yes | No |
-| Agent finishes responding | Yes | Yes |
-| PR review requested | Yes | Yes |
-| PR merged | Yes | Yes |
-| PR comment added | Yes | Yes |
-| Agent error/needs input | Yes | Yes |
-
-### Recipient resolution
-
-Notifications and bumps go to the **task owner** only. Each task has exactly one owner at a time (the user who last sent a message or explicitly took ownership). This is an acceptable race condition — it rarely makes sense for two users to work on the same task simultaneously, and ownership transfer is explicit.
-
-The system supports multiplayer (multiple users can view the same task), but the notification/bump target is always the single current owner. Each user has their own independent notification and unread state on their own user actor.
-
-### Tables (on user actor)
-
-Two new tables:
-
-- **`userTaskFeed`** — one row per task. Tracks `bumpedAtMs` and `bumpReason` for sidebar sort order. Does NOT denormalize task content (title, repo, etc.) — the frontend queries the org actor for task content and uses the feed only for ordering/filtering.
-- **`userNotifications`** — discrete notification entries with `type`, `message`, `read` state, and optional `sessionId`. Retention: notifications are retained for a configurable number of days after being marked read, then cleaned up.
-
-### Queue commands (user actor workflow)
-
-- `user.bump_task` — upserts `userTaskFeed` row, no notification created. Used for user-initiated actions (send message, open task, create session).
-- `user.notify` — inserts `userNotifications` row AND upserts `userTaskFeed` (auto-bump). Used for system events (agent finished, PR review requested).
-- `user.mark_read` — marks notifications read for a given `(taskId, sessionId?)`. Also updates `userTaskState.unread` for the session.
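The `user.bump_task` / `user.notify` relationship described above can be sketched in a few lines of TypeScript. This is an illustrative, in-memory model only — `FeedRow`, `bumpTask`, and `notify` are hypothetical names, not the real actor schema or queue handlers:

```typescript
// Hypothetical in-memory model of the bump vs. notification distinction.
type FeedRow = { taskId: string; bumpedAtMs: number; bumpReason: string };
type Notification = { taskId: string; message: string; read: boolean };

const feed = new Map<string, FeedRow>(); // sidebar ordering, one row per task
const notifications: Notification[] = []; // discrete notification entries

// A bump only upserts the feed row (reorders the sidebar); no notification.
function bumpTask(taskId: string, reason: string, nowMs: number): void {
  feed.set(taskId, { taskId, bumpedAtMs: nowMs, bumpReason: reason });
}

// A notification inserts a discrete entry AND auto-bumps the feed.
function notify(taskId: string, message: string, nowMs: number): void {
  notifications.push({ taskId, message, read: false });
  bumpTask(taskId, "notification", nowMs);
}

bumpTask("task-1", "user_sent_message", 1000); // user action: bump only
notify("task-1", "Agent finished responding", 2000); // system event: both
```

The key property the sketch shows: every `notify` moves the task in the feed, but user-initiated bumps never create notification entries.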
-
-### Data flow
-
-Task actor (or org actor) resolves the current task owner, then sends to the owner's user actor queue:
-1. `user.notify(...)` for notification-worthy events (auto-bumps the feed)
-2. `user.bump_task(...)` for non-notification bumps (send message, open task)
-
-The user actor processes the queue message, writes to its local tables, and broadcasts a `userFeedUpdated` event to connected clients.
-
-### Sidebar architecture change
-
-The left sidebar changes from showing the repo/PR tree to showing **recent tasks** ordered by `userTaskFeed.bumpedAtMs`. Two new buttons at the top of the sidebar:
-- **All Repositories** — navigates to a page showing the current repo + PR list (preserving existing functionality)
-- **Notifications** — navigates to a page showing the full notification list
-
-The sidebar reads from two sources:
-- **User actor** (`userTaskFeed`) — provides sort order and "which tasks are relevant to this user"
-- **Org actor** (`taskSummaries`) — provides task content (title, status, branch, PR state, session summaries)
-
-The frontend merges these: org snapshot gives task data, user feed gives sort order. Uses the existing subscription system (`useSubscription`) for both initial state fetch and streaming updates.
-
-### `updatedAtMs` column semantics
-
-The org actor's `taskSummaries.updatedAtMs` and the user actor's `userTaskFeed.bumpedAtMs` serve different purposes:
-- `taskSummaries.updatedAtMs` — updated by task actor push. Reflects the last time the task's global state changed (any mutation, any user). Used for "All Repositories" / "All Tasks" views.
-- `userTaskFeed.bumpedAtMs` — updated by bump/notify commands. Reflects the last time this specific user's attention was drawn to this task. Used for the per-user sidebar sort.
-
-Add doc comments on both columns clarifying the update source.
-
-### Unread semantics
-
-Each user has independent unread state. The existing `userTaskState` table tracks per-`(taskId, sessionId)` unread state. When the user clicks a session:
-1. `userTaskState.unread` is set to 0 for that session
-2. All `userNotifications` rows matching `(taskId, sessionId)` are marked `read = 1`
-
-These two unread systems must stay in sync via the `user.mark_read` queue command.
-
-## Better Auth: Actions, Not Queues
-
-All Better Auth adapter operations (verification CRUD, session/email/account index mutations, and user-actor auth record mutations) are exposed as **actions**, not queue commands. This is an intentional exception to the normal pattern of using queues for mutations.
-
-**Why:** The org actor's workflow queue is shared with GitHub sync, webhook processing, task mutations, and billing — 20+ queue names processed sequentially. During the OAuth callback, Better Auth needs to read/write verification records and upsert session/account indexes. If any long-running queue handler (e.g., a GitHub sync step) is ahead in the queue, auth operations time out (10s), `expectQueueResponse` throws a regular `Error`, and Better Auth's `parseState` catches it as a non-`StateError` → redirects to `?error=please_restart_the_process`.
-
-**Why it's safe:** Auth operations are simple SQLite reads/writes scoped to a single actor instance with no cross-actor side effects. They don't need workflow replay semantics or sequential ordering guarantees relative to other queue commands.
-
-**Rule:** Never move Better Auth operations back to queue commands. If new auth-related mutations are added, expose them as actions on the relevant actor.
 
 ## Maintenance
 
 - Keep this file up to date whenever actor ownership, hierarchy, or lifecycle responsibilities change.
 - If the real actor tree diverges from this document, update this document in the same change.
-- When adding, removing, or renaming coordinator index tables, update the hierarchy diagram above in the same change.
-- When adding a new coordinator index table in a schema file, add a doc comment identifying which child actor it indexes (pattern: `/** Coordinator index of {ChildActor} instances. ... */`).
diff --git a/foundry/packages/backend/package.json b/foundry/packages/backend/package.json
index 562bab7..aec80a0 100644
--- a/foundry/packages/backend/package.json
+++ b/foundry/packages/backend/package.json
@@ -13,18 +13,18 @@
     "start": "bun dist/index.js start"
   },
   "dependencies": {
-    "@e2b/code-interpreter": "^2.3.3",
+    "@daytonaio/sdk": "0.141.0",
     "@hono/node-server": "^1.19.7",
     "@hono/node-ws": "^1.3.0",
     "@iarna/toml": "^2.2.5",
     "@sandbox-agent/foundry-shared": "workspace:*",
+    "@sandbox-agent/persist-rivet": "workspace:*",
     "better-auth": "^1.5.5",
-    "dockerode": "^4.0.9",
     "drizzle-kit": "^0.31.8",
     "drizzle-orm": "^0.44.5",
     "hono": "^4.11.9",
     "pino": "^10.3.1",
-    "rivetkit": "https://pkg.pr.new/rivet-dev/rivet/rivetkit@791500a",
+    "rivetkit": "2.1.6",
     "sandbox-agent": "workspace:*",
     "uuid": "^13.0.0",
     "ws": "^8.19.0",
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts b/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts
deleted file mode 100644
index da5e904..0000000
--- a/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts
+++ /dev/null
@@ -1,6 +0,0 @@
-import { defineConfig } from "rivetkit/db/drizzle";
-
-export default defineConfig({
-  out: "./src/actors/audit-log/db/drizzle",
-  schema: "./src/actors/audit-log/db/schema.ts",
-});
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql b/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql
deleted file mode 100644
index 9ada559..0000000
--- a/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql
+++ /dev/null
@@ -1 +0,0 @@
-ALTER TABLE `events` ADD COLUMN `repo_id` text;
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json
deleted file mode 100644
index cf2910c..0000000
--- a/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json
+++ /dev/null
@@ -1,77 +0,0 @@
-{
-  "version": "6",
-  "dialect": "sqlite",
-  "id": "a1b2c3d4-0001-4000-8000-000000000001",
-  "prevId": "e592c829-141f-4740-88b7-09cf957a4405",
-  "tables": {
-    "events": {
-      "name": "events",
-      "columns": {
-        "id": {
-          "name": "id",
-          "type": "integer",
-          "primaryKey": true,
-          "notNull": true,
-          "autoincrement": true
-        },
-        "repo_id": {
-          "name": "repo_id",
-          "type": "text",
-          "primaryKey": false,
-          "notNull": false,
-          "autoincrement": false
-        },
-        "task_id": {
-          "name": "task_id",
-          "type": "text",
-          "primaryKey": false,
-          "notNull": false,
-          "autoincrement": false
-        },
-        "branch_name": {
-          "name": "branch_name",
-          "type": "text",
-          "primaryKey": false,
-          "notNull": false,
-          "autoincrement": false
-        },
-        "kind": {
-          "name": "kind",
-          "type": "text",
-          "primaryKey": false,
-          "notNull": true,
-          "autoincrement": false
-        },
-        "payload_json": {
-          "name": "payload_json",
-          "type": "text",
-          "primaryKey": false,
-          "notNull": true,
-          "autoincrement": false
-        },
-        "created_at": {
-          "name": "created_at",
-          "type": "integer",
-          "primaryKey": false,
-          "notNull": true,
-          "autoincrement": false
-        }
-      },
-      "indexes": {},
-      "foreignKeys": {},
-      "compositePrimaryKeys": {},
-      "uniqueConstraints": {},
-      "checkConstraints": {}
-    }
-  },
-  "views": {},
-  "enums": {},
-  "_meta": {
-    "schemas": {},
-    "tables": {},
-    "columns": {}
-  },
-  "internal": {
-    "indexes": {}
-  }
-}
diff --git a/foundry/packages/backend/src/actors/audit-log/index.ts b/foundry/packages/backend/src/actors/audit-log/index.ts
deleted file mode 100644
index db32829..0000000
--- a/foundry/packages/backend/src/actors/audit-log/index.ts
+++ /dev/null
@@ -1,180 +0,0 @@
-// @ts-nocheck
-import { and, desc, eq } from "drizzle-orm";
-import { actor, queue } from "rivetkit";
-import { workflow, Loop } from "rivetkit/workflow";
-import type { AuditLogEvent } from "@sandbox-agent/foundry-shared";
-import { selfAuditLog } from "../handles.js";
-import { logActorWarning, resolveErrorMessage } from "../logging.js";
-import { auditLogDb } from "./db/db.js";
-import { events } from "./db/schema.js";
-
-export interface AuditLogInput {
-  organizationId: string;
-}
-
-export interface AppendAuditLogCommand {
-  kind: string;
-  repoId?: string;
-  taskId?: string;
-  branchName?: string;
-  payload: Record;
-}
-
-export interface ListAuditLogParams {
-  repoId?: string;
-  branch?: string;
-  taskId?: string;
-  limit?: number;
-}
-
-// ---------------------------------------------------------------------------
-// Queue names
-// ---------------------------------------------------------------------------
-
-const AUDIT_LOG_QUEUE_NAMES = ["auditLog.command.append"] as const;
-
-type AuditLogQueueName = (typeof AUDIT_LOG_QUEUE_NAMES)[number];
-
-function auditLogWorkflowQueueName(name: AuditLogQueueName): AuditLogQueueName {
-  return name;
-}
-
-// ---------------------------------------------------------------------------
-// Mutation functions
-// ---------------------------------------------------------------------------
-
-async function appendMutation(c: any, body: AppendAuditLogCommand): Promise<{ ok: true }> {
-  const now = Date.now();
-  await c.db
-    .insert(events)
-    .values({
-      repoId: body.repoId ?? null,
-      taskId: body.taskId ?? null,
-      branchName: body.branchName ?? null,
-      kind: body.kind,
-      payloadJson: JSON.stringify(body.payload),
-      createdAt: now,
-    })
-    .run();
-  return { ok: true };
-}
-
-// ---------------------------------------------------------------------------
-// Workflow command loop
-// ---------------------------------------------------------------------------
-
-type AuditLogWorkflowHandler = (loopCtx: any, body: any) => Promise;
-
-const AUDIT_LOG_COMMAND_HANDLERS: Record = {
-  "auditLog.command.append": async (c, body) => appendMutation(c, body),
-};
-
-async function runAuditLogWorkflow(ctx: any): Promise {
-  await ctx.loop("audit-log-command-loop", async (loopCtx: any) => {
-    const msg = await loopCtx.queue.next("next-audit-log-command", {
-      names: [...AUDIT_LOG_QUEUE_NAMES],
-      completable: true,
-    });
-
-    if (!msg) {
-      return Loop.continue(undefined);
-    }
-
-    const handler = AUDIT_LOG_COMMAND_HANDLERS[msg.name as AuditLogQueueName];
-    if (!handler) {
-      logActorWarning("auditLog", "unknown audit-log command", { command: msg.name });
-      await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
-      return Loop.continue(undefined);
-    }
-
-    try {
-      // Wrap in a step so c.state and c.db are accessible inside mutation functions.
-      const result = await loopCtx.step({
-        name: msg.name,
-        timeout: 60_000,
-        run: async () => handler(loopCtx, msg.body),
-      });
-      await msg.complete(result);
-    } catch (error) {
-      const message = resolveErrorMessage(error);
-      logActorWarning("auditLog", "audit-log workflow command failed", {
-        command: msg.name,
-        error: message,
-      });
-      await msg.complete({ error: message }).catch(() => {});
-    }
-
-    return Loop.continue(undefined);
-  });
-}
-
-// ---------------------------------------------------------------------------
-// Actor definition
-// ---------------------------------------------------------------------------
-
-/**
- * Organization-scoped audit log. One per org, not one per repo.
- *
- * The org is the coordinator for all tasks across repos, and we frequently need
- * to query the full audit trail across repos (e.g. org-wide activity feed,
- * compliance). A per-repo audit log would require fan-out reads every time.
- * Keeping it org-scoped gives us a single queryable feed with optional repoId
- * filtering when callers want a narrower view.
- */
-export const auditLog = actor({
-  db: auditLogDb,
-  queues: Object.fromEntries(AUDIT_LOG_QUEUE_NAMES.map((name) => [name, queue()])),
-  options: {
-    name: "Audit Log",
-    icon: "database",
-  },
-  createState: (_c, input: AuditLogInput) => ({
-    organizationId: input.organizationId,
-  }),
-  actions: {
-    // Mutation — self-send to queue for workflow history
-    async append(c: any, body: AppendAuditLogCommand): Promise<{ ok: true }> {
-      const self = selfAuditLog(c);
-      await self.send(auditLogWorkflowQueueName("auditLog.command.append"), body, { wait: false });
-      return { ok: true };
-    },
-
-    // Read — direct action (no queue)
-    async list(c, params?: ListAuditLogParams): Promise {
-      const whereParts = [];
-      if (params?.repoId) {
-        whereParts.push(eq(events.repoId, params.repoId));
-      }
-      if (params?.taskId) {
-        whereParts.push(eq(events.taskId, params.taskId));
-      }
-      if (params?.branch) {
-        whereParts.push(eq(events.branchName, params.branch));
-      }
-
-      const base = c.db
-        .select({
-          id: events.id,
-          repoId: events.repoId,
-          taskId: events.taskId,
-          branchName: events.branchName,
-          kind: events.kind,
-          payloadJson: events.payloadJson,
-          createdAt: events.createdAt,
-        })
-        .from(events);
-
-      const rows = await (whereParts.length > 0 ? base.where(and(...whereParts)) : base)
-        .orderBy(desc(events.createdAt))
-        .limit(params?.limit ?? 100)
-        .all();
-
-      return rows.map((row) => ({
-        ...row,
-        organizationId: c.state.organizationId,
-        repoId: row.repoId ?? null,
-      }));
-    },
-  },
-  run: workflow(runAuditLogWorkflow),
-});
diff --git a/foundry/packages/backend/src/actors/audit-log/db/db.ts b/foundry/packages/backend/src/actors/auth-user/db/db.ts
similarity index 69%
rename from foundry/packages/backend/src/actors/audit-log/db/db.ts
rename to foundry/packages/backend/src/actors/auth-user/db/db.ts
index d808ec0..b434338 100644
--- a/foundry/packages/backend/src/actors/audit-log/db/db.ts
+++ b/foundry/packages/backend/src/actors/auth-user/db/db.ts
@@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle";
 import * as schema from "./schema.js";
 import migrations from "./migrations.js";
 
-export const auditLogDb = db({ schema, migrations });
+export const authUserDb = db({ schema, migrations });
diff --git a/foundry/packages/backend/src/actors/user/db/migrations.ts b/foundry/packages/backend/src/actors/auth-user/db/migrations.ts
similarity index 65%
rename from foundry/packages/backend/src/actors/user/db/migrations.ts
rename to foundry/packages/backend/src/actors/auth-user/db/migrations.ts
index da92bdc..be7cb17 100644
--- a/foundry/packages/backend/src/actors/user/db/migrations.ts
+++ b/foundry/packages/backend/src/actors/auth-user/db/migrations.ts
@@ -10,12 +10,6 @@ const journal = {
       tag: "0000_auth_user",
       breakpoints: true,
     },
-    {
-      idx: 1,
-      when: 1773532800000,
-      tag: "0001_user_task_state",
-      breakpoints: true,
-    },
   ],
 } as const;
@@ -23,19 +17,15 @@ export default {
   journal,
   migrations: {
     m0000: `CREATE TABLE \`user\` (
-  \`id\` integer PRIMARY KEY NOT NULL,
-  \`auth_user_id\` text NOT NULL,
+  \`id\` text PRIMARY KEY NOT NULL,
   \`name\` text NOT NULL,
   \`email\` text NOT NULL,
   \`email_verified\` integer NOT NULL,
   \`image\` text,
   \`created_at\` integer NOT NULL,
-  \`updated_at\` integer NOT NULL,
-  CONSTRAINT \`user_singleton_id_check\` CHECK(\`id\` = 1)
+  \`updated_at\` integer NOT NULL
 );
 --> statement-breakpoint
-CREATE UNIQUE INDEX \`user_auth_user_id_idx\` ON \`user\` (\`auth_user_id\`);
---> statement-breakpoint
 CREATE TABLE \`session\` (
   \`id\` text PRIMARY KEY NOT NULL,
   \`token\` text NOT NULL,
@@ -68,39 +58,23 @@ CREATE TABLE \`account\` (
 CREATE UNIQUE INDEX \`account_provider_account_idx\` ON \`account\` (\`provider_id\`, \`account_id\`);
 --> statement-breakpoint
 CREATE TABLE \`user_profiles\` (
-  \`id\` integer PRIMARY KEY NOT NULL,
-  \`user_id\` text NOT NULL,
+  \`user_id\` text PRIMARY KEY NOT NULL,
   \`github_account_id\` text,
   \`github_login\` text,
   \`role_label\` text NOT NULL,
-  \`default_model\` text DEFAULT 'gpt-5.3-codex' NOT NULL,
   \`eligible_organization_ids_json\` text NOT NULL,
   \`starter_repo_status\` text NOT NULL,
   \`starter_repo_starred_at\` integer,
   \`starter_repo_skipped_at\` integer,
   \`created_at\` integer NOT NULL,
-  \`updated_at\` integer NOT NULL,
-  CONSTRAINT \`user_profiles_singleton_id_check\` CHECK(\`id\` = 1)
+  \`updated_at\` integer NOT NULL
 );
 --> statement-breakpoint
-CREATE UNIQUE INDEX \`user_profiles_user_id_idx\` ON \`user_profiles\` (\`user_id\`);
---> statement-breakpoint
 CREATE TABLE \`session_state\` (
   \`session_id\` text PRIMARY KEY NOT NULL,
   \`active_organization_id\` text,
   \`created_at\` integer NOT NULL,
   \`updated_at\` integer NOT NULL
-);`,
-    m0001: `CREATE TABLE \`user_task_state\` (
-  \`task_id\` text NOT NULL,
-  \`session_id\` text NOT NULL,
-  \`active_session_id\` text,
-  \`unread\` integer DEFAULT 0 NOT NULL,
-  \`draft_text\` text DEFAULT '' NOT NULL,
-  \`draft_attachments_json\` text DEFAULT '[]' NOT NULL,
-  \`draft_updated_at\` integer,
-  \`updated_at\` integer NOT NULL,
-  PRIMARY KEY(\`task_id\`, \`session_id\`)
 );`,
   } as const,
 };
diff --git a/foundry/packages/backend/src/actors/auth-user/db/schema.ts b/foundry/packages/backend/src/actors/auth-user/db/schema.ts
new file mode 100644
index 0000000..b87567a
--- /dev/null
+++ b/foundry/packages/backend/src/actors/auth-user/db/schema.ts
@@ -0,0 +1,70 @@
+import { integer, sqliteTable, text, uniqueIndex } from "drizzle-orm/sqlite-core";
+
+export const authUsers = sqliteTable("user", {
+  id: text("id").notNull().primaryKey(),
+  name: text("name").notNull(),
+  email: text("email").notNull(),
+  emailVerified: integer("email_verified").notNull(),
+  image: text("image"),
+  createdAt: integer("created_at").notNull(),
+  updatedAt: integer("updated_at").notNull(),
+});
+
+export const authSessions = sqliteTable(
+  "session",
+  {
+    id: text("id").notNull().primaryKey(),
+    token: text("token").notNull(),
+    userId: text("user_id").notNull(),
+    expiresAt: integer("expires_at").notNull(),
+    ipAddress: text("ip_address"),
+    userAgent: text("user_agent"),
+    createdAt: integer("created_at").notNull(),
+    updatedAt: integer("updated_at").notNull(),
+  },
+  (table) => ({
+    tokenIdx: uniqueIndex("session_token_idx").on(table.token),
+  }),
+);
+
+export const authAccounts = sqliteTable(
+  "account",
+  {
+    id: text("id").notNull().primaryKey(),
+    accountId: text("account_id").notNull(),
+    providerId: text("provider_id").notNull(),
+    userId: text("user_id").notNull(),
+    accessToken: text("access_token"),
+    refreshToken: text("refresh_token"),
+    idToken: text("id_token"),
+    accessTokenExpiresAt: integer("access_token_expires_at"),
+    refreshTokenExpiresAt: integer("refresh_token_expires_at"),
+    scope: text("scope"),
+    password: text("password"),
+    createdAt: integer("created_at").notNull(),
+    updatedAt: integer("updated_at").notNull(),
+  },
+  (table) => ({
+    providerAccountIdx: uniqueIndex("account_provider_account_idx").on(table.providerId, table.accountId),
+  }),
+);
+
+export const userProfiles = sqliteTable("user_profiles", {
+  userId: text("user_id").notNull().primaryKey(),
+  githubAccountId: text("github_account_id"),
+  githubLogin: text("github_login"),
+  roleLabel: text("role_label").notNull(),
+  eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(),
+  starterRepoStatus: text("starter_repo_status").notNull(),
+  starterRepoStarredAt: integer("starter_repo_starred_at"),
+  starterRepoSkippedAt: integer("starter_repo_skipped_at"),
+  createdAt: integer("created_at").notNull(),
+  updatedAt: integer("updated_at").notNull(),
+});
+
+export const sessionState = sqliteTable("session_state", {
+  sessionId: text("session_id").notNull().primaryKey(),
+  activeOrganizationId: text("active_organization_id"),
+  createdAt: integer("created_at").notNull(),
+  updatedAt: integer("updated_at").notNull(),
+});
diff --git a/foundry/packages/backend/src/actors/auth-user/index.ts b/foundry/packages/backend/src/actors/auth-user/index.ts
new file mode 100644
index 0000000..a77635a
--- /dev/null
+++ b/foundry/packages/backend/src/actors/auth-user/index.ts
@@ -0,0 +1,353 @@
+import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm";
+import { actor } from "rivetkit";
+import { authUserDb } from "./db/db.js";
+import { authAccounts, authSessions, authUsers, sessionState, userProfiles } from "./db/schema.js";
+
+const tables = {
+  user: authUsers,
+  session: authSessions,
+  account: authAccounts,
+  userProfiles,
+  sessionState,
+} as const;
+
+function tableFor(model: string) {
+  const table = tables[model as keyof typeof tables];
+  if (!table) {
+    throw new Error(`Unsupported auth user model: ${model}`);
+  }
+  return table as any;
+}
+
+function columnFor(table: any, field: string) {
+  const column = table[field];
+  if (!column) {
+    throw new Error(`Unsupported auth user field: ${field}`);
+  }
+  return column;
+}
+
+function normalizeValue(value: unknown): unknown {
+  if (value instanceof Date) {
+    return value.getTime();
+  }
+  if (Array.isArray(value)) {
+    return value.map((entry) => normalizeValue(entry));
+  }
+  return value;
+}
+
+function clauseToExpr(table: any, clause: any) {
+  const column = columnFor(table, clause.field);
+  const value = normalizeValue(clause.value);
+
+  switch (clause.operator) {
+    case "ne":
+      return value === null ? isNotNull(column) : ne(column, value as any);
+    case "lt":
+      return lt(column, value as any);
+    case "lte":
+      return lte(column, value as any);
+    case "gt":
+      return gt(column, value as any);
+    case "gte":
+      return gte(column, value as any);
+    case "in":
+      return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]);
+    case "not_in":
+      return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]);
+    case "contains":
+      return like(column, `%${String(value ?? "")}%`);
+    case "starts_with":
+      return like(column, `${String(value ?? "")}%`);
+    case "ends_with":
+      return like(column, `%${String(value ?? "")}`);
+    case "eq":
+    default:
+      return value === null ? isNull(column) : eq(column, value as any);
+  }
+}
+
+function buildWhere(table: any, where: any[] | undefined) {
+  if (!where || where.length === 0) {
+    return undefined;
+  }
+
+  let expr = clauseToExpr(table, where[0]);
+  for (const clause of where.slice(1)) {
+    const next = clauseToExpr(table, clause);
+    expr = clause.connector === "OR" ? or(expr, next) : and(expr, next);
+  }
+  return expr;
+}
+
+function applyJoinToRow(c: any, model: string, row: any, join: any) {
+  if (!row || !join) {
+    return row;
+  }
+
+  if (model === "session" && join.user) {
+    return c.db
+      .select()
+      .from(authUsers)
+      .where(eq(authUsers.id, row.userId))
+      .get()
+      .then((user: any) => ({ ...row, user: user ?? null }));
+  }
+
+  if (model === "account" && join.user) {
+    return c.db
+      .select()
+      .from(authUsers)
+      .where(eq(authUsers.id, row.userId))
+      .get()
+      .then((user: any) => ({ ...row, user: user ?? null }));
+  }
+
+  if (model === "user" && join.account) {
+    return c.db
+      .select()
+      .from(authAccounts)
+      .where(eq(authAccounts.userId, row.id))
+      .all()
+      .then((accounts: any[]) => ({ ...row, account: accounts }));
+  }
+
+  return Promise.resolve(row);
+}
+
+async function applyJoinToRows(c: any, model: string, rows: any[], join: any) {
+  if (!join || rows.length === 0) {
+    return rows;
+  }
+
+  if (model === "session" && join.user) {
+    const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))];
+    const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.id, userIds)).all() : [];
+    const userMap = new Map(users.map((user: any) => [user.id, user]));
+    return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null }));
+  }
+
+  if (model === "account" && join.user) {
+    const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))];
+    const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.id, userIds)).all() : [];
+    const userMap = new Map(users.map((user: any) => [user.id, user]));
+    return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null }));
+  }
+
+  if (model === "user" && join.account) {
+    const userIds = rows.map((row) => row.id);
+    const accounts = userIds.length > 0 ? await c.db.select().from(authAccounts).where(inArray(authAccounts.userId, userIds)).all() : [];
+    const accountsByUserId = new Map();
+    for (const account of accounts) {
+      const entries = accountsByUserId.get(account.userId) ?? [];
+      entries.push(account);
+      accountsByUserId.set(account.userId, entries);
+    }
+    return rows.map((row) => ({ ...row, account: accountsByUserId.get(row.id) ?? [] }));
+  }
+
+  return rows;
+}
+
+export const authUser = actor({
+  db: authUserDb,
+  options: {
+    name: "Auth User",
+    icon: "shield",
+    actionTimeout: 60_000,
+  },
+  createState: (_c, input: { userId: string }) => ({
+    userId: input.userId,
+  }),
+  actions: {
+    async createAuthRecord(c, input: { model: string; data: Record }) {
+      const table = tableFor(input.model);
+      await c.db
+        .insert(table)
+        .values(input.data as any)
+        .run();
+      return await c.db
+        .select()
+        .from(table)
+        .where(eq(columnFor(table, "id"), input.data.id as any))
+        .get();
+    },
+
+    async findOneAuthRecord(c, input: { model: string; where: any[]; join?: any }) {
+      const table = tableFor(input.model);
+      const predicate = buildWhere(table, input.where);
+      const row = predicate ? await c.db.select().from(table).where(predicate).get() : await c.db.select().from(table).get();
+      return await applyJoinToRow(c, input.model, row ?? null, input.join);
+    },
+
+    async findManyAuthRecords(c, input: { model: string; where?: any[]; limit?: number; offset?: number; sortBy?: any; join?: any }) {
+      const table = tableFor(input.model);
+      const predicate = buildWhere(table, input.where);
+      let query: any = c.db.select().from(table);
+      if (predicate) {
+        query = query.where(predicate);
+      }
+      if (input.sortBy?.field) {
+        const column = columnFor(table, input.sortBy.field);
+        query = query.orderBy(input.sortBy.direction === "asc" ? asc(column) : desc(column));
asc(column) : desc(column)); + } + if (typeof input.limit === "number") { + query = query.limit(input.limit); + } + if (typeof input.offset === "number") { + query = query.offset(input.offset); + } + const rows = await query.all(); + return await applyJoinToRows(c, input.model, rows, input.join); + }, + + async updateAuthRecord(c, input: { model: string; where: any[]; update: Record<string, unknown> }) { + const table = tableFor(input.model); + const predicate = buildWhere(table, input.where); + if (!predicate) { + throw new Error("updateAuthRecord requires a where clause"); + } + await c.db + .update(table) + .set(input.update as any) + .where(predicate) + .run(); + return await c.db.select().from(table).where(predicate).get(); + }, + + async updateManyAuthRecords(c, input: { model: string; where: any[]; update: Record<string, unknown> }) { + const table = tableFor(input.model); + const predicate = buildWhere(table, input.where); + if (!predicate) { + throw new Error("updateManyAuthRecords requires a where clause"); + } + await c.db + .update(table) + .set(input.update as any) + .where(predicate) + .run(); + const row = await c.db.select({ value: sqlCount() }).from(table).where(predicate).get(); + return row?.value ??
0; + }, + + async deleteAuthRecord(c, input: { model: string; where: any[] }) { + const table = tableFor(input.model); + const predicate = buildWhere(table, input.where); + if (!predicate) { + throw new Error("deleteAuthRecord requires a where clause"); + } + await c.db.delete(table).where(predicate).run(); + }, + + async deleteManyAuthRecords(c, input: { model: string; where: any[] }) { + const table = tableFor(input.model); + const predicate = buildWhere(table, input.where); + if (!predicate) { + throw new Error("deleteManyAuthRecords requires a where clause"); + } + const rows = await c.db.select().from(table).where(predicate).all(); + await c.db.delete(table).where(predicate).run(); + return rows.length; + }, + + async countAuthRecords(c, input: { model: string; where?: any[] }) { + const table = tableFor(input.model); + const predicate = buildWhere(table, input.where); + const row = predicate + ? await c.db.select({ value: sqlCount() }).from(table).where(predicate).get() + : await c.db.select({ value: sqlCount() }).from(table).get(); + return row?.value ?? 0; + }, + + async getAppAuthState(c, input: { sessionId: string }) { + const session = await c.db.select().from(authSessions).where(eq(authSessions.id, input.sessionId)).get(); + if (!session) { + return null; + } + const [user, profile, currentSessionState, accounts] = await Promise.all([ + c.db.select().from(authUsers).where(eq(authUsers.id, session.userId)).get(), + c.db.select().from(userProfiles).where(eq(userProfiles.userId, session.userId)).get(), + c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(), + c.db.select().from(authAccounts).where(eq(authAccounts.userId, session.userId)).all(), + ]); + return { + session, + user, + profile: profile ?? null, + sessionState: currentSessionState ?? 
null, + accounts, + }; + }, + + async upsertUserProfile( + c, + input: { + userId: string; + patch: { + githubAccountId?: string | null; + githubLogin?: string | null; + roleLabel?: string; + eligibleOrganizationIdsJson?: string; + starterRepoStatus?: string; + starterRepoStarredAt?: number | null; + starterRepoSkippedAt?: number | null; + }; + }, + ) { + const now = Date.now(); + await c.db + .insert(userProfiles) + .values({ + userId: input.userId, + githubAccountId: input.patch.githubAccountId ?? null, + githubLogin: input.patch.githubLogin ?? null, + roleLabel: input.patch.roleLabel ?? "GitHub user", + eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson ?? "[]", + starterRepoStatus: input.patch.starterRepoStatus ?? "pending", + starterRepoStarredAt: input.patch.starterRepoStarredAt ?? null, + starterRepoSkippedAt: input.patch.starterRepoSkippedAt ?? null, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: userProfiles.userId, + set: { + ...(input.patch.githubAccountId !== undefined ? { githubAccountId: input.patch.githubAccountId } : {}), + ...(input.patch.githubLogin !== undefined ? { githubLogin: input.patch.githubLogin } : {}), + ...(input.patch.roleLabel !== undefined ? { roleLabel: input.patch.roleLabel } : {}), + ...(input.patch.eligibleOrganizationIdsJson !== undefined ? { eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson } : {}), + ...(input.patch.starterRepoStatus !== undefined ? { starterRepoStatus: input.patch.starterRepoStatus } : {}), + ...(input.patch.starterRepoStarredAt !== undefined ? { starterRepoStarredAt: input.patch.starterRepoStarredAt } : {}), + ...(input.patch.starterRepoSkippedAt !== undefined ? 
{ starterRepoSkippedAt: input.patch.starterRepoSkippedAt } : {}), + updatedAt: now, + }, + }) + .run(); + + return await c.db.select().from(userProfiles).where(eq(userProfiles.userId, input.userId)).get(); + }, + + async upsertSessionState(c, input: { sessionId: string; activeOrganizationId: string | null }) { + const now = Date.now(); + await c.db + .insert(sessionState) + .values({ + sessionId: input.sessionId, + activeOrganizationId: input.activeOrganizationId, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: sessionState.sessionId, + set: { + activeOrganizationId: input.activeOrganizationId, + updatedAt: now, + }, + }) + .run(); + + return await c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(); + }, + }, +}); diff --git a/foundry/packages/backend/src/actors/context.ts b/foundry/packages/backend/src/actors/context.ts index 3554a96..1c03ce2 100644 --- a/foundry/packages/backend/src/actors/context.ts +++ b/foundry/packages/backend/src/actors/context.ts @@ -1,15 +1,24 @@ import type { AppConfig } from "@sandbox-agent/foundry-shared"; import type { BackendDriver } from "../driver.js"; import type { NotificationService } from "../notifications/index.js"; +import type { ProviderRegistry } from "../providers/index.js"; import type { AppShellServices } from "../services/app-shell-runtime.js"; let runtimeConfig: AppConfig | null = null; +let providerRegistry: ProviderRegistry | null = null; let notificationService: NotificationService | null = null; let runtimeDriver: BackendDriver | null = null; let appShellServices: AppShellServices | null = null; -export function initActorRuntimeContext(config: AppConfig, notifications?: NotificationService, driver?: BackendDriver, appShell?: AppShellServices): void { +export function initActorRuntimeContext( + config: AppConfig, + providers: ProviderRegistry, + notifications?: NotificationService, + driver?: BackendDriver, + appShell?: AppShellServices, +): void { 
runtimeConfig = config; + providerRegistry = providers; notificationService = notifications ?? null; runtimeDriver = driver ?? null; appShellServices = appShell ?? null; @@ -17,11 +26,12 @@ export function initActorRuntimeContext(config: AppConfig, notifications?: Notif export function getActorRuntimeContext(): { config: AppConfig; + providers: ProviderRegistry; notifications: NotificationService | null; driver: BackendDriver; appShell: AppShellServices; } { - if (!runtimeConfig) { + if (!runtimeConfig || !providerRegistry) { throw new Error("Actor runtime context not initialized"); } @@ -35,6 +45,7 @@ export function getActorRuntimeContext(): { return { config: runtimeConfig, + providers: providerRegistry, notifications: notificationService, driver: runtimeDriver, appShell: appShellServices, diff --git a/foundry/packages/backend/src/actors/events.ts b/foundry/packages/backend/src/actors/events.ts new file mode 100644 index 0000000..8872dfa --- /dev/null +++ b/foundry/packages/backend/src/actors/events.ts @@ -0,0 +1,112 @@ +import type { TaskStatus, ProviderId } from "@sandbox-agent/foundry-shared"; + +export interface TaskCreatedEvent { + workspaceId: string; + repoId: string; + taskId: string; + providerId: ProviderId; + branchName: string; + title: string; +} + +export interface TaskStatusEvent { + workspaceId: string; + repoId: string; + taskId: string; + status: TaskStatus; + message: string; +} + +export interface ProjectSnapshotEvent { + workspaceId: string; + repoId: string; + updatedAt: number; +} + +export interface AgentStartedEvent { + workspaceId: string; + repoId: string; + taskId: string; + sessionId: string; +} + +export interface AgentIdleEvent { + workspaceId: string; + repoId: string; + taskId: string; + sessionId: string; +} + +export interface AgentErrorEvent { + workspaceId: string; + repoId: string; + taskId: string; + message: string; +} + +export interface PrCreatedEvent { + workspaceId: string; + repoId: string; + taskId: string; + 
prNumber: number; + url: string; +} + +export interface PrClosedEvent { + workspaceId: string; + repoId: string; + taskId: string; + prNumber: number; + merged: boolean; +} + +export interface PrReviewEvent { + workspaceId: string; + repoId: string; + taskId: string; + prNumber: number; + reviewer: string; + status: string; +} + +export interface CiStatusChangedEvent { + workspaceId: string; + repoId: string; + taskId: string; + prNumber: number; + status: string; +} + +export type TaskStepName = "auto_commit" | "push" | "pr_submit"; +export type TaskStepStatus = "started" | "completed" | "skipped" | "failed"; + +export interface TaskStepEvent { + workspaceId: string; + repoId: string; + taskId: string; + step: TaskStepName; + status: TaskStepStatus; + message: string; +} + +export interface BranchSwitchedEvent { + workspaceId: string; + repoId: string; + taskId: string; + branchName: string; +} + +export interface SessionAttachedEvent { + workspaceId: string; + repoId: string; + taskId: string; + sessionId: string; +} + +export interface BranchSyncedEvent { + workspaceId: string; + repoId: string; + taskId: string; + branchName: string; + strategy: string; +} diff --git a/foundry/packages/backend/src/actors/github-data/db/migrations.ts b/foundry/packages/backend/src/actors/github-data/db/migrations.ts deleted file mode 100644 index 10e3804..0000000 --- a/foundry/packages/backend/src/actors/github-data/db/migrations.ts +++ /dev/null @@ -1,114 +0,0 @@ -const journal = { - entries: [ - { - idx: 0, - when: 1773446400000, - tag: "0000_github_data", - breakpoints: true, - }, - { - idx: 1, - when: 1773810002000, - tag: "0001_default_branch", - breakpoints: true, - }, - { - idx: 2, - when: 1773810300000, - tag: "0002_github_branches", - breakpoints: true, - }, - { - idx: 3, - when: 1773907200000, - tag: "0003_sync_progress", - breakpoints: true, - }, - { - idx: 4, - when: 1773993600000, - tag: "0004_drop_github_branches", - breakpoints: true, - }, - ], -} as const; - 
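As context for the deleted `migrations.ts` module below: it pairs a `journal` of ordered entries with an `m000N`-keyed map of SQL bodies, where multi-statement migrations are separated by `--> statement-breakpoint` markers. A minimal, self-contained sketch of that pairing (the helper names `splitStatements` and `pendingStatements` are hypothetical, not from the codebase):

```typescript
// Hypothetical reconstruction of how a journal of migration entries pairs
// with an m000N-keyed SQL map. The journal fixes ordering; tags are labels.
interface JournalEntry { idx: number; when: number; tag: string; breakpoints: boolean; }

const journal: { entries: JournalEntry[] } = {
  entries: [
    { idx: 0, when: 1773446400000, tag: "0000_github_data", breakpoints: true },
    { idx: 1, when: 1773810002000, tag: "0001_default_branch", breakpoints: true },
  ],
};

// SQL bodies keyed m<idx>, mirroring the deleted module's `migrations` map.
const migrations: Record<string, string> = {
  m0000: "CREATE TABLE a (id integer);\n--> statement-breakpoint\nCREATE TABLE b (id integer);\n",
  m0001: "ALTER TABLE a ADD col text;\n",
};

// Split one migration body on the breakpoint marker into executable statements.
function splitStatements(sql: string): string[] {
  return sql
    .split("--> statement-breakpoint")
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}

// Resolve every statement in journal order, skipping already-applied entries.
function pendingStatements(appliedIdx: number): string[] {
  return journal.entries
    .filter((entry) => entry.idx > appliedIdx)
    .sort((a, b) => a.idx - b.idx)
    .flatMap((entry) => splitStatements(migrations[`m${String(entry.idx).padStart(4, "0")}`] ?? ""));
}
```

A runner that has applied through `m0000` would pick up only the single `ALTER TABLE` statement from `m0001`; a fresh database gets all three statements in order.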
-export default { - journal, - migrations: { - m0000: `CREATE TABLE \`github_meta\` ( - \`id\` integer PRIMARY KEY NOT NULL, - \`connected_account\` text NOT NULL, - \`installation_status\` text NOT NULL, - \`sync_status\` text NOT NULL, - \`installation_id\` integer, - \`last_sync_label\` text NOT NULL, - \`last_sync_at\` integer, - \`updated_at\` integer NOT NULL, - CONSTRAINT \`github_meta_singleton_id_check\` CHECK(\`id\` = 1) -); ---> statement-breakpoint -CREATE TABLE \`github_repositories\` ( - \`repo_id\` text PRIMARY KEY NOT NULL, - \`full_name\` text NOT NULL, - \`clone_url\` text NOT NULL, - \`private\` integer NOT NULL, - \`updated_at\` integer NOT NULL -); ---> statement-breakpoint -CREATE TABLE \`github_members\` ( - \`member_id\` text PRIMARY KEY NOT NULL, - \`login\` text NOT NULL, - \`display_name\` text NOT NULL, - \`email\` text, - \`role\` text, - \`state\` text NOT NULL, - \`updated_at\` integer NOT NULL -); ---> statement-breakpoint -CREATE TABLE \`github_pull_requests\` ( - \`pr_id\` text PRIMARY KEY NOT NULL, - \`repo_id\` text NOT NULL, - \`repo_full_name\` text NOT NULL, - \`number\` integer NOT NULL, - \`title\` text NOT NULL, - \`body\` text, - \`state\` text NOT NULL, - \`url\` text NOT NULL, - \`head_ref_name\` text NOT NULL, - \`base_ref_name\` text NOT NULL, - \`author_login\` text, - \`is_draft\` integer NOT NULL, - \`updated_at\` integer NOT NULL -); -`, - m0001: `ALTER TABLE \`github_repositories\` ADD \`default_branch\` text NOT NULL DEFAULT 'main'; -`, - m0002: `CREATE TABLE \`github_branches\` ( - \`branch_id\` text PRIMARY KEY NOT NULL, - \`repo_id\` text NOT NULL, - \`branch_name\` text NOT NULL, - \`commit_sha\` text NOT NULL, - \`updated_at\` integer NOT NULL -); -`, - m0003: `ALTER TABLE \`github_meta\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_meta\` ADD \`sync_phase\` text; ---> statement-breakpoint -ALTER TABLE \`github_meta\` ADD \`processed_repository_count\` 
integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_meta\` ADD \`total_repository_count\` integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_repositories\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_members\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_pull_requests\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0; ---> statement-breakpoint -ALTER TABLE \`github_branches\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0; -`, - m0004: `DROP TABLE IF EXISTS \`github_branches\`; -`, - } as const, -}; diff --git a/foundry/packages/backend/src/actors/github-data/db/schema.ts b/foundry/packages/backend/src/actors/github-data/db/schema.ts deleted file mode 100644 index 94b4edc..0000000 --- a/foundry/packages/backend/src/actors/github-data/db/schema.ts +++ /dev/null @@ -1,59 +0,0 @@ -import { check, integer, sqliteTable, text } from "rivetkit/db/drizzle"; -import { sql } from "drizzle-orm"; - -export const githubMeta = sqliteTable( - "github_meta", - { - id: integer("id").primaryKey(), - connectedAccount: text("connected_account").notNull(), - installationStatus: text("installation_status").notNull(), - syncStatus: text("sync_status").notNull(), - installationId: integer("installation_id"), - lastSyncLabel: text("last_sync_label").notNull(), - lastSyncAt: integer("last_sync_at"), - syncGeneration: integer("sync_generation").notNull(), - syncPhase: text("sync_phase"), - processedRepositoryCount: integer("processed_repository_count").notNull(), - totalRepositoryCount: integer("total_repository_count").notNull(), - updatedAt: integer("updated_at").notNull(), - }, - (table) => [check("github_meta_singleton_id_check", sql`${table.id} = 1`)], -); - -export const githubRepositories = sqliteTable("github_repositories", { - repoId: text("repo_id").notNull().primaryKey(), - fullName: text("full_name").notNull(), - 
cloneUrl: text("clone_url").notNull(), - private: integer("private").notNull(), - defaultBranch: text("default_branch").notNull(), - syncGeneration: integer("sync_generation").notNull(), - updatedAt: integer("updated_at").notNull(), -}); - -export const githubMembers = sqliteTable("github_members", { - memberId: text("member_id").notNull().primaryKey(), - login: text("login").notNull(), - displayName: text("display_name").notNull(), - email: text("email"), - role: text("role"), - state: text("state").notNull(), - syncGeneration: integer("sync_generation").notNull(), - updatedAt: integer("updated_at").notNull(), -}); - -export const githubPullRequests = sqliteTable("github_pull_requests", { - prId: text("pr_id").notNull().primaryKey(), - repoId: text("repo_id").notNull(), - repoFullName: text("repo_full_name").notNull(), - number: integer("number").notNull(), - title: text("title").notNull(), - body: text("body"), - state: text("state").notNull(), - url: text("url").notNull(), - headRefName: text("head_ref_name").notNull(), - baseRefName: text("base_ref_name").notNull(), - authorLogin: text("author_login"), - isDraft: integer("is_draft").notNull(), - syncGeneration: integer("sync_generation").notNull(), - updatedAt: integer("updated_at").notNull(), -}); diff --git a/foundry/packages/backend/src/actors/github-data/index.ts b/foundry/packages/backend/src/actors/github-data/index.ts deleted file mode 100644 index d19732a..0000000 --- a/foundry/packages/backend/src/actors/github-data/index.ts +++ /dev/null @@ -1,1010 +0,0 @@ -// @ts-nocheck -import { eq, inArray } from "drizzle-orm"; -import { actor, queue } from "rivetkit"; -import { workflow, Loop } from "rivetkit/workflow"; -import type { FoundryOrganization } from "@sandbox-agent/foundry-shared"; -import { getActorRuntimeContext } from "../context.js"; -import { getOrCreateOrganization, getTask } from "../handles.js"; -import { logActorWarning, resolveErrorMessage } from "../logging.js"; -import { 
taskWorkflowQueueName } from "../task/workflow/queue.js"; -import { repoIdFromRemote } from "../../services/repo.js"; -import { resolveOrganizationGithubAuth } from "../../services/github-auth.js"; -import { organizationWorkflowQueueName } from "../organization/queues.js"; -import { githubDataDb } from "./db/db.js"; -import { githubMembers, githubMeta, githubPullRequests, githubRepositories } from "./db/schema.js"; - -const META_ROW_ID = 1; -const SYNC_REPOSITORY_BATCH_SIZE = 10; - -type GithubSyncPhase = "discovering_repositories" | "syncing_repositories" | "syncing_members" | "syncing_pull_requests"; - -interface GithubDataInput { - organizationId: string; -} - -interface GithubMemberRecord { - id: string; - login: string; - name: string; - email?: string | null; - role?: string | null; - state?: string | null; -} - -interface GithubRepositoryRecord { - fullName: string; - cloneUrl: string; - private: boolean; - defaultBranch: string; -} - -interface GithubPullRequestRecord { - repoId: string; - repoFullName: string; - number: number; - title: string; - body: string | null; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin: string | null; - isDraft: boolean; - updatedAt: number; -} - -interface FullSyncInput { - connectedAccount?: string | null; - installationStatus?: FoundryOrganization["github"]["installationStatus"]; - installationId?: number | null; - githubLogin?: string | null; - kind?: FoundryOrganization["kind"] | null; - accessToken?: string | null; - label?: string | null; -} - -interface ClearStateInput { - connectedAccount: string; - installationStatus: FoundryOrganization["github"]["installationStatus"]; - installationId: number | null; - label: string; -} - -// Queue names for github-data actor -export const GITHUB_DATA_QUEUE_NAMES = [ - "githubData.command.syncRepos", - "githubData.command.handlePullRequestWebhook", - "githubData.command.clearState", -] as const; - -type GithubDataQueueName = (typeof 
GITHUB_DATA_QUEUE_NAMES)[number]; - -export function githubDataWorkflowQueueName(name: GithubDataQueueName): GithubDataQueueName { - return name; -} - -interface PullRequestWebhookInput { - connectedAccount: string; - installationStatus: FoundryOrganization["github"]["installationStatus"]; - installationId: number | null; - repository: { - fullName: string; - cloneUrl: string; - private: boolean; - }; - pullRequest: { - number: number; - title: string; - body: string | null; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin: string | null; - isDraft: boolean; - merged?: boolean; - }; -} - -interface GithubMetaState { - connectedAccount: string; - installationStatus: FoundryOrganization["github"]["installationStatus"]; - syncStatus: FoundryOrganization["github"]["syncStatus"]; - installationId: number | null; - lastSyncLabel: string; - lastSyncAt: number | null; - syncGeneration: number; - syncPhase: GithubSyncPhase | null; - processedRepositoryCount: number; - totalRepositoryCount: number; -} - -function normalizePrStatus(input: { state: string; isDraft?: boolean; merged?: boolean }): "OPEN" | "DRAFT" | "CLOSED" | "MERGED" { - const state = input.state.trim().toUpperCase(); - if (input.merged || state === "MERGED") return "MERGED"; - if (state === "CLOSED") return "CLOSED"; - return input.isDraft ? "DRAFT" : "OPEN"; -} - -function pullRequestSummaryFromRow(row: any) { - return { - prId: row.prId, - repoId: row.repoId, - repoFullName: row.repoFullName, - number: row.number, - status: Boolean(row.isDraft) ? "draft" : "ready", - title: row.title, - state: row.state, - url: row.url, - headRefName: row.headRefName, - baseRefName: row.baseRefName, - authorLogin: row.authorLogin ?? 
null, - isDraft: Boolean(row.isDraft), - updatedAtMs: row.updatedAt, - }; -} - -function chunkItems<T>(items: T[], size: number): T[][] { - if (items.length === 0) { - return []; - } - const chunks: T[][] = []; - for (let index = 0; index < items.length; index += size) { - chunks.push(items.slice(index, index + size)); - } - return chunks; -} - -export async function readMeta(c: any): Promise<GithubMetaState> { - const row = await c.db.select().from(githubMeta).where(eq(githubMeta.id, META_ROW_ID)).get(); - return { - connectedAccount: row?.connectedAccount ?? "", - installationStatus: (row?.installationStatus ?? "install_required") as FoundryOrganization["github"]["installationStatus"], - syncStatus: (row?.syncStatus ?? "pending") as FoundryOrganization["github"]["syncStatus"], - installationId: row?.installationId ?? null, - lastSyncLabel: row?.lastSyncLabel ?? "Waiting for first import", - lastSyncAt: row?.lastSyncAt ?? null, - syncGeneration: row?.syncGeneration ?? 0, - syncPhase: (row?.syncPhase ?? null) as GithubSyncPhase | null, - processedRepositoryCount: row?.processedRepositoryCount ?? 0, - totalRepositoryCount: row?.totalRepositoryCount ??
0, - }; -} - -async function writeMeta(c: any, patch: Partial<GithubMetaState>) { - const current = await readMeta(c); - const next = { - ...current, - ...patch, - }; - await c.db - .insert(githubMeta) - .values({ - id: META_ROW_ID, - connectedAccount: next.connectedAccount, - installationStatus: next.installationStatus, - syncStatus: next.syncStatus, - installationId: next.installationId, - lastSyncLabel: next.lastSyncLabel, - lastSyncAt: next.lastSyncAt, - syncGeneration: next.syncGeneration, - syncPhase: next.syncPhase, - processedRepositoryCount: next.processedRepositoryCount, - totalRepositoryCount: next.totalRepositoryCount, - updatedAt: Date.now(), - }) - .onConflictDoUpdate({ - target: githubMeta.id, - set: { - connectedAccount: next.connectedAccount, - installationStatus: next.installationStatus, - syncStatus: next.syncStatus, - installationId: next.installationId, - lastSyncLabel: next.lastSyncLabel, - lastSyncAt: next.lastSyncAt, - syncGeneration: next.syncGeneration, - syncPhase: next.syncPhase, - processedRepositoryCount: next.processedRepositoryCount, - totalRepositoryCount: next.totalRepositoryCount, - updatedAt: Date.now(), - }, - }) - .run(); - return next; -} - -async function publishSyncProgress(c: any, patch: Partial<GithubMetaState>): Promise<GithubMetaState> { - const meta = await writeMeta(c, patch); - const organization = await getOrCreateOrganization(c, c.state.organizationId); - await organization.send( - organizationWorkflowQueueName("organization.command.github.sync_progress.apply"), - { - connectedAccount: meta.connectedAccount, - installationStatus: meta.installationStatus, - installationId: meta.installationId, - syncStatus: meta.syncStatus, - lastSyncLabel: meta.lastSyncLabel, - lastSyncAt: meta.lastSyncAt, - syncGeneration: meta.syncGeneration, - syncPhase: meta.syncPhase, - processedRepositoryCount: meta.processedRepositoryCount, - totalRepositoryCount: meta.totalRepositoryCount, - }, - { wait: false }, - ); - return meta; -} - -async function getOrganizationContext(c: any,
overrides?: FullSyncInput) { - // Try to read the org profile for fallback values, but don't require it. - // Webhook-triggered syncs can arrive before the user signs in and creates the - // org profile row. The webhook callers already pass the necessary overrides - // (connectedAccount, installationId, githubLogin, kind), so we can proceed - // without the profile as long as overrides cover the required fields. - const organizationHandle = await getOrCreateOrganization(c, c.state.organizationId); - const organizationState = await organizationHandle.getOrganizationShellStateIfInitialized({}); - - // If the org profile doesn't exist and overrides don't provide enough context, fail. - if (!organizationState && !overrides?.connectedAccount) { - throw new Error(`Organization ${c.state.organizationId} is not initialized and no override context was provided`); - } - - const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId); - return { - kind: overrides?.kind ?? organizationState?.snapshot.kind, - githubLogin: overrides?.githubLogin ?? organizationState?.githubLogin, - connectedAccount: overrides?.connectedAccount ?? organizationState?.snapshot.github.connectedAccount ?? organizationState?.githubLogin, - installationId: overrides?.installationId ?? organizationState?.githubInstallationId ?? null, - installationStatus: - overrides?.installationStatus ?? - organizationState?.snapshot.github.installationStatus ?? - (organizationState?.snapshot.kind === "personal" ? "connected" : "reconnect_required"), - accessToken: overrides?.accessToken ?? auth?.githubToken ?? null, - }; -} - -async function upsertRepositories(c: any, repositories: GithubRepositoryRecord[], updatedAt: number, syncGeneration: number) { - for (const repository of repositories) { - await c.db - .insert(githubRepositories) - .values({ - repoId: repoIdFromRemote(repository.cloneUrl), - fullName: repository.fullName, - cloneUrl: repository.cloneUrl, - private: repository.private ? 
1 : 0, - defaultBranch: repository.defaultBranch, - syncGeneration, - updatedAt, - }) - .onConflictDoUpdate({ - target: githubRepositories.repoId, - set: { - fullName: repository.fullName, - cloneUrl: repository.cloneUrl, - private: repository.private ? 1 : 0, - defaultBranch: repository.defaultBranch, - syncGeneration, - updatedAt, - }, - }) - .run(); - } -} - -async function sweepRepositories(c: any, syncGeneration: number) { - const rows = await c.db.select({ repoId: githubRepositories.repoId, syncGeneration: githubRepositories.syncGeneration }).from(githubRepositories).all(); - for (const row of rows) { - if (row.syncGeneration === syncGeneration) { - continue; - } - await c.db.delete(githubRepositories).where(eq(githubRepositories.repoId, row.repoId)).run(); - } -} - -async function upsertMembers(c: any, members: GithubMemberRecord[], updatedAt: number, syncGeneration: number) { - for (const member of members) { - await c.db - .insert(githubMembers) - .values({ - memberId: member.id, - login: member.login, - displayName: member.name || member.login, - email: member.email ?? null, - role: member.role ?? null, - state: member.state ?? "active", - syncGeneration, - updatedAt, - }) - .onConflictDoUpdate({ - target: githubMembers.memberId, - set: { - login: member.login, - displayName: member.name || member.login, - email: member.email ?? null, - role: member.role ?? null, - state: member.state ?? 
"active", - syncGeneration, - updatedAt, - }, - }) - .run(); - } -} - -async function sweepMembers(c: any, syncGeneration: number) { - const rows = await c.db.select({ memberId: githubMembers.memberId, syncGeneration: githubMembers.syncGeneration }).from(githubMembers).all(); - for (const row of rows) { - if (row.syncGeneration === syncGeneration) { - continue; - } - await c.db.delete(githubMembers).where(eq(githubMembers.memberId, row.memberId)).run(); - } -} - -async function upsertPullRequests(c: any, pullRequests: GithubPullRequestRecord[], syncGeneration: number) { - for (const pullRequest of pullRequests) { - await c.db - .insert(githubPullRequests) - .values({ - prId: `${pullRequest.repoId}#${pullRequest.number}`, - repoId: pullRequest.repoId, - repoFullName: pullRequest.repoFullName, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? null, - state: pullRequest.state, - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: pullRequest.isDraft ? 1 : 0, - syncGeneration, - updatedAt: pullRequest.updatedAt, - }) - .onConflictDoUpdate({ - target: githubPullRequests.prId, - set: { - repoId: pullRequest.repoId, - repoFullName: pullRequest.repoFullName, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? null, - state: pullRequest.state, - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: pullRequest.isDraft ? 
1 : 0, - syncGeneration, - updatedAt: pullRequest.updatedAt, - }, - }) - .run(); - } -} - -async function sweepPullRequests(c: any, syncGeneration: number) { - const rows = await c.db.select({ prId: githubPullRequests.prId, syncGeneration: githubPullRequests.syncGeneration }).from(githubPullRequests).all(); - for (const row of rows) { - if (row.syncGeneration === syncGeneration) { - continue; - } - await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, row.prId)).run(); - } -} - -async function refreshTaskSummaryForBranch(c: any, repoId: string, branchName: string, pullRequest: ReturnType<typeof pullRequestSummaryFromRow> | null) { - const repositoryRecord = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, repoId)).get(); - if (!repositoryRecord) { - return; - } - const organization = await getOrCreateOrganization(c, c.state.organizationId); - void organization - .send( - organizationWorkflowQueueName("organization.command.refreshTaskSummaryForBranch"), - { repoId, branchName, pullRequest, repoName: repositoryRecord.fullName ??
undefined }, - { wait: false }, - ) - .catch(() => {}); -} - -async function emitPullRequestChangeEvents(c: any, beforeRows: any[], afterRows: any[]) { - const beforeById = new Map(beforeRows.map((row) => [row.prId, row])); - const afterById = new Map(afterRows.map((row) => [row.prId, row])); - - for (const [prId, row] of afterById) { - const previous = beforeById.get(prId); - const changed = - !previous || - previous.title !== row.title || - previous.state !== row.state || - previous.url !== row.url || - previous.headRefName !== row.headRefName || - previous.baseRefName !== row.baseRefName || - previous.authorLogin !== row.authorLogin || - previous.isDraft !== row.isDraft || - previous.updatedAt !== row.updatedAt; - if (!changed) { - continue; - } - await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName, pullRequestSummaryFromRow(row)); - } - - for (const [prId, row] of beforeById) { - if (afterById.has(prId)) { - continue; - } - await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName, null); - } -} - -async function autoArchiveTaskForClosedPullRequest(c: any, row: any) { - const repositoryRecord = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, row.repoId)).get(); - if (!repositoryRecord) { - return; - } - const organization = await getOrCreateOrganization(c, c.state.organizationId); - const match = await organization.findTaskForBranch({ - repoId: row.repoId, - branchName: row.headRefName, - }); - if (!match?.taskId) { - return; - } - try { - const task = getTask(c, c.state.organizationId, row.repoId, match.taskId); - void task.send(taskWorkflowQueueName("task.command.archive"), { reason: `PR ${String(row.state).toLowerCase()}` }, { wait: false }).catch(() => {}); - } catch { - // Best-effort only. Task summary refresh will still clear the PR state. 
- } -} - -async function resolveRepositories(c: any, context: Awaited<ReturnType<typeof getOrganizationContext>>): Promise<GithubRepositoryRecord[]> { - const { appShell } = getActorRuntimeContext(); - if (context.kind === "personal") { - if (!context.accessToken) { - return []; - } - return await appShell.github.listUserRepositories(context.accessToken); - } - - if (context.installationId != null) { - try { - return await appShell.github.listInstallationRepositories(context.installationId); - } catch (error) { - if (!context.accessToken) { - throw error; - } - } - } - - if (!context.accessToken) { - return []; - } - - return (await appShell.github.listUserRepositories(context.accessToken)).filter((repository) => repository.fullName.startsWith(`${context.githubLogin}/`)); -} - -async function resolveMembers(c: any, context: Awaited<ReturnType<typeof getOrganizationContext>>): Promise<GithubMemberRecord[]> { - const { appShell } = getActorRuntimeContext(); - if (context.kind === "personal") { - return []; - } - if (context.installationId != null) { - try { - return await appShell.github.listInstallationMembers(context.installationId, context.githubLogin); - } catch (error) { - if (!context.accessToken) { - throw error; - } - } - } - if (!context.accessToken) { - return []; - } - return await appShell.github.listOrganizationMembers(context.accessToken, context.githubLogin); -} - -async function listPullRequestsForRepositories( - context: Awaited<ReturnType<typeof getOrganizationContext>>, - repositories: GithubRepositoryRecord[], -): Promise<GithubPullRequestRecord[]> { - const { appShell } = getActorRuntimeContext(); - if (repositories.length === 0) { - return []; - } - - let pullRequests: Array<{ - repoFullName: string; - cloneUrl: string; - number: number; - title: string; - body?: string | null; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin?: string | null; - isDraft?: boolean; - merged?: boolean; - }> = []; - - if (context.installationId != null) { - try { - pullRequests = await appShell.github.listInstallationPullRequestsForRepositories(context.installationId, repositories); - } catch (error) { - if
(!context.accessToken) { - throw error; - } - } - } - - if (pullRequests.length === 0 && context.accessToken) { - pullRequests = await appShell.github.listPullRequestsForUserRepositories(context.accessToken, repositories); - } - - return pullRequests.map((pullRequest) => ({ - repoId: repoIdFromRemote(pullRequest.cloneUrl), - repoFullName: pullRequest.repoFullName, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? null, - state: normalizePrStatus(pullRequest), - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: Boolean(pullRequest.isDraft), - updatedAt: Date.now(), - })); -} - -async function readAllPullRequestRows(c: any) { - return await c.db.select().from(githubPullRequests).all(); -} - -/** Config returned by fullSyncSetup, passed to subsequent sync phases. */ -export interface FullSyncConfig { - syncGeneration: number; - startedAt: number; - totalRepositoryCount: number; - connectedAccount: string; - installationStatus: string; - installationId: number | null; - beforePrRows: any[]; -} - -async function readRepositoriesFromDb(c: any): Promise { - const rows = await c.db.select().from(githubRepositories).all(); - return rows.map((r: any) => ({ - fullName: r.fullName, - cloneUrl: r.cloneUrl, - private: Boolean(r.private), - defaultBranch: r.defaultBranch, - })); -} - -/** - * Phase 1: Discover repositories and persist them. - * Returns the config needed by all subsequent phases, or null if nothing to do. 
- */
-export async function fullSyncSetup(c: any, input: FullSyncInput = {}): Promise<FullSyncConfig> {
-	const startedAt = Date.now();
-	const beforePrRows = await readAllPullRequestRows(c);
-	const currentMeta = await readMeta(c);
-	const context = await getOrganizationContext(c, input);
-	const syncGeneration = currentMeta.syncGeneration + 1;
-
-	await publishSyncProgress(c, {
-		connectedAccount: context.connectedAccount,
-		installationStatus: context.installationStatus,
-		installationId: context.installationId,
-		syncStatus: "syncing",
-		lastSyncLabel: input.label?.trim() || "Syncing GitHub data...",
-		syncGeneration,
-		syncPhase: "discovering_repositories",
-		processedRepositoryCount: 0,
-		totalRepositoryCount: 0,
-	});
-
-	const repositories = await resolveRepositories(c, context);
-	const totalRepositoryCount = repositories.length;
-
-	await publishSyncProgress(c, {
-		connectedAccount: context.connectedAccount,
-		installationStatus: context.installationStatus,
-		installationId: context.installationId,
-		syncStatus: "syncing",
-		lastSyncLabel: totalRepositoryCount > 0 ? `Importing ${totalRepositoryCount} repositories...` : "No repositories available",
-		syncGeneration,
-		syncPhase: "syncing_repositories",
-		processedRepositoryCount: totalRepositoryCount,
-		totalRepositoryCount,
-	});
-
-	await upsertRepositories(c, repositories, startedAt, syncGeneration);
-
-	return {
-		syncGeneration,
-		startedAt,
-		totalRepositoryCount,
-		connectedAccount: context.connectedAccount,
-		installationStatus: context.installationStatus,
-		installationId: context.installationId,
-		beforePrRows,
-	};
-}
-
-/**
- * Phase 2: Resolve, upsert, and sweep members.
- */
-export async function fullSyncMembers(c: any, config: FullSyncConfig): Promise<void> {
-	await publishSyncProgress(c, {
-		connectedAccount: config.connectedAccount,
-		installationStatus: config.installationStatus,
-		installationId: config.installationId,
-		syncStatus: "syncing",
-		lastSyncLabel: "Syncing GitHub members...",
-		syncGeneration: config.syncGeneration,
-		syncPhase: "syncing_members",
-		processedRepositoryCount: config.totalRepositoryCount,
-		totalRepositoryCount: config.totalRepositoryCount,
-	});
-
-	const context = await getOrganizationContext(c, {
-		connectedAccount: config.connectedAccount,
-		installationStatus: config.installationStatus as any,
-		installationId: config.installationId,
-	});
-	const members = await resolveMembers(c, context);
-	await upsertMembers(c, members, config.startedAt, config.syncGeneration);
-	await sweepMembers(c, config.syncGeneration);
-}
-
-/**
- * Phase 3 (per-batch): Fetch and upsert pull requests for one batch of repos.
- * Returns true when all batches have been processed.
- */
-export async function fullSyncPullRequestBatch(c: any, config: FullSyncConfig, batchIndex: number): Promise<boolean> {
-	const repos = await readRepositoriesFromDb(c);
-	const batches = chunkItems(repos, SYNC_REPOSITORY_BATCH_SIZE);
-	if (batchIndex >= batches.length) return true;
-
-	const batch = batches[batchIndex]!;
-	const context = await getOrganizationContext(c, {
-		connectedAccount: config.connectedAccount,
-		installationStatus: config.installationStatus as any,
-		installationId: config.installationId,
-	});
-	const batchPRs = await listPullRequestsForRepositories(context, batch);
-	await upsertPullRequests(c, batchPRs, config.syncGeneration);
-
-	const processedCount = Math.min((batchIndex + 1) * SYNC_REPOSITORY_BATCH_SIZE, repos.length);
-	await publishSyncProgress(c, {
-		connectedAccount: config.connectedAccount,
-		installationStatus: config.installationStatus,
-		installationId: config.installationId,
-		syncStatus: "syncing",
-		lastSyncLabel: `Synced pull requests for ${processedCount} of ${repos.length} repositories`,
-		syncGeneration: config.syncGeneration,
-		syncPhase: "syncing_pull_requests",
-		processedRepositoryCount: processedCount,
-		totalRepositoryCount: repos.length,
-	});
-
-	return false;
-}
-
-/**
- * Phase 4: Sweep stale data, publish final state, emit PR change events.
- */
-export async function fullSyncFinalize(c: any, config: FullSyncConfig): Promise<void> {
-	await sweepPullRequests(c, config.syncGeneration);
-	await sweepRepositories(c, config.syncGeneration);
-
-	await publishSyncProgress(c, {
-		connectedAccount: config.connectedAccount,
-		installationStatus: config.installationStatus,
-		installationId: config.installationId,
-		syncStatus: "synced",
-		lastSyncLabel: config.totalRepositoryCount > 0 ? `Synced ${config.totalRepositoryCount} repositories` : "No repositories available",
-		lastSyncAt: config.startedAt,
-		syncGeneration: config.syncGeneration,
-		syncPhase: null,
-		processedRepositoryCount: config.totalRepositoryCount,
-		totalRepositoryCount: config.totalRepositoryCount,
-	});
-
-	const afterRows = await readAllPullRequestRows(c);
-	await emitPullRequestChangeEvents(c, config.beforePrRows, afterRows);
-}
-
-/**
- * Error handler: publish error sync state when a full sync fails.
- */
-/**
- * Single-shot full sync: runs all phases (setup, branches, members, PRs, finalize)
- * using native JS loops. This must NOT use workflow primitives (step/loop/sleep)
- * because it runs inside a workflow step. See workflow.ts for context on why
- * sub-loops cause HistoryDivergedError.
- */
-export async function runFullSync(c: any, input: FullSyncInput = {}): Promise<void> {
-	const config = await fullSyncSetup(c, input);
-
-	// Members
-	await fullSyncMembers(c, config);
-
-	// Pull requests — native loop over batches
-	for (let i = 0; ; i++) {
-		const done = await fullSyncPullRequestBatch(c, config, i);
-		if (done) break;
-	}
-
-	// Finalize
-	await fullSyncFinalize(c, config);
-}
-
-export async function fullSyncError(c: any, error: unknown): Promise<void> {
-	const currentMeta = await readMeta(c);
-	const message = error instanceof Error ? error.message : "GitHub import failed";
-	await publishSyncProgress(c, {
-		connectedAccount: currentMeta.connectedAccount,
-		installationStatus: currentMeta.installationStatus,
-		installationId: currentMeta.installationId,
-		syncStatus: "error",
-		lastSyncLabel: message,
-		syncGeneration: currentMeta.syncGeneration,
-		syncPhase: null,
-		processedRepositoryCount: 0,
-		totalRepositoryCount: 0,
-	});
-}
-
-// ---------------------------------------------------------------------------
-// Workflow command loop
-// ---------------------------------------------------------------------------
-
-type GithubDataWorkflowHandler = (loopCtx: any, body: any) => Promise<unknown>;
-
-const GITHUB_DATA_COMMAND_HANDLERS: Record<GithubDataQueueName, GithubDataWorkflowHandler> = {
-	"githubData.command.syncRepos": async (c, body) => {
-		try {
-			await runFullSync(c, body);
-			return { ok: true };
-		} catch (error) {
-			try {
-				await fullSyncError(c, error);
-			} catch {
-				/* best effort */
-			}
-			throw error;
-		}
-	},
-	"githubData.command.handlePullRequestWebhook": async (c, body) => {
-		await handlePullRequestWebhookMutation(c, body);
-		return { ok: true };
-	},
-	"githubData.command.clearState": async (c, body) => {
-		await clearStateMutation(c, body);
-		return { ok: true };
-	},
-};
-
-async function runGithubDataWorkflow(ctx: any): Promise<void> {
-	await ctx.loop("github-data-command-loop", async (loopCtx: any) => {
-		const msg = await loopCtx.queue.next("next-github-data-command", {
-			names: [...GITHUB_DATA_QUEUE_NAMES],
-			completable: true,
-		});
-
-		if (!msg) {
-			return Loop.continue(undefined);
-		}
-
-		const handler = GITHUB_DATA_COMMAND_HANDLERS[msg.name as GithubDataQueueName];
-		if (!handler) {
-			logActorWarning("github-data", "unknown github-data command", { command: msg.name });
-			await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
-			return Loop.continue(undefined);
-		}
-
-		try {
-			// Wrap in a step so c.state and c.db are accessible inside mutation functions.
-			const result = await loopCtx.step({
-				name: msg.name,
-				timeout: 10 * 60_000,
-				run: async () => handler(loopCtx, msg.body),
-			});
-			await msg.complete(result);
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("github-data", "github-data workflow command failed", {
-				command: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch(() => {});
-		}
-
-		return Loop.continue(undefined);
-	});
-}
-
-export const githubData = actor({
-	db: githubDataDb,
-	queues: Object.fromEntries(GITHUB_DATA_QUEUE_NAMES.map((name) => [name, queue()])),
-	options: {
-		name: "GitHub Data",
-		icon: "github",
-		actionTimeout: 10 * 60_000,
-	},
-	createState: (_c, input: GithubDataInput) => ({
-		organizationId: input.organizationId,
-	}),
-	actions: {
-		async getSummary(c) {
-			const repositories = await c.db.select().from(githubRepositories).all();
-			const members = await c.db.select().from(githubMembers).all();
-			const pullRequests = await c.db.select().from(githubPullRequests).all();
-			return {
-				...(await readMeta(c)),
-				repositoryCount: repositories.length,
-				memberCount: members.length,
-				pullRequestCount: pullRequests.length,
-			};
-		},
-
-		async listRepositories(c) {
-			const rows = await c.db.select().from(githubRepositories).all();
-			return rows.map((row) => ({
-				repoId: row.repoId,
-				fullName: row.fullName,
-				cloneUrl: row.cloneUrl,
-				private: Boolean(row.private),
-				defaultBranch: row.defaultBranch,
-			}));
-		},
-
-		async getRepository(c, input: { repoId: string }) {
-			const row = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, input.repoId)).get();
-			if (!row) {
-				return null;
-			}
-			return {
-				repoId: row.repoId,
-				fullName: row.fullName,
-				cloneUrl: row.cloneUrl,
-				private: Boolean(row.private),
-				defaultBranch: row.defaultBranch,
-			};
-		},
-
-		async listOpenPullRequests(c) {
-			const rows = await c.db
-				.select()
-				.from(githubPullRequests)
-				.where(inArray(githubPullRequests.state, ["OPEN", "DRAFT"]))
-				.all();
-			return rows.map((row) => pullRequestSummaryFromRow(row));
-		},
-	},
-	run: workflow(runGithubDataWorkflow),
-});
-
-export async function clearStateMutation(c: any, input: ClearStateInput) {
-	const beforeRows = await readAllPullRequestRows(c);
-	const currentMeta = await readMeta(c);
-	await c.db.delete(githubPullRequests).run();
-	await c.db.delete(githubRepositories).run();
-	await c.db.delete(githubMembers).run();
-	await writeMeta(c, {
-		connectedAccount: input.connectedAccount,
-		installationStatus: input.installationStatus,
-		installationId: input.installationId,
-		syncStatus: "pending",
-		lastSyncLabel: input.label,
-		lastSyncAt: null,
-		syncGeneration: currentMeta.syncGeneration,
-		syncPhase: null,
-		processedRepositoryCount: 0,
-		totalRepositoryCount: 0,
-	});
-
-	await emitPullRequestChangeEvents(c, beforeRows, []);
-}
-
-export async function handlePullRequestWebhookMutation(c: any, input: PullRequestWebhookInput) {
-	const beforeRows = await readAllPullRequestRows(c);
-	const repoId = repoIdFromRemote(input.repository.cloneUrl);
-	const currentRepository = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, repoId)).get();
-	const updatedAt = Date.now();
-	const currentMeta = await readMeta(c);
-	const state = normalizePrStatus(input.pullRequest);
-	const prId = `${repoId}#${input.pullRequest.number}`;
-
-	await c.db
-		.insert(githubRepositories)
-		.values({
-			repoId,
-			fullName: input.repository.fullName,
-			cloneUrl: input.repository.cloneUrl,
-			private: input.repository.private ? 1 : 0,
-			defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main",
-			syncGeneration: currentMeta.syncGeneration,
-			updatedAt,
-		})
-		.onConflictDoUpdate({
-			target: githubRepositories.repoId,
-			set: {
-				fullName: input.repository.fullName,
-				cloneUrl: input.repository.cloneUrl,
-				private: input.repository.private ? 1 : 0,
-				defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main",
-				syncGeneration: currentMeta.syncGeneration,
-				updatedAt,
-			},
-		})
-		.run();
-
-	if (state === "CLOSED" || state === "MERGED") {
-		await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, prId)).run();
-	} else {
-		await c.db
-			.insert(githubPullRequests)
-			.values({
-				prId,
-				repoId,
-				repoFullName: input.repository.fullName,
-				number: input.pullRequest.number,
-				title: input.pullRequest.title,
-				body: input.pullRequest.body ?? null,
-				state,
-				url: input.pullRequest.url,
-				headRefName: input.pullRequest.headRefName,
-				baseRefName: input.pullRequest.baseRefName,
-				authorLogin: input.pullRequest.authorLogin ?? null,
-				isDraft: input.pullRequest.isDraft ? 1 : 0,
-				syncGeneration: currentMeta.syncGeneration,
-				updatedAt,
-			})
-			.onConflictDoUpdate({
-				target: githubPullRequests.prId,
-				set: {
-					title: input.pullRequest.title,
-					body: input.pullRequest.body ?? null,
-					state,
-					url: input.pullRequest.url,
-					headRefName: input.pullRequest.headRefName,
-					baseRefName: input.pullRequest.baseRefName,
-					authorLogin: input.pullRequest.authorLogin ?? null,
-					isDraft: input.pullRequest.isDraft ? 1 : 0,
-					syncGeneration: currentMeta.syncGeneration,
-					updatedAt,
-				},
-			})
-			.run();
-	}
-
-	await publishSyncProgress(c, {
-		connectedAccount: input.connectedAccount,
-		installationStatus: input.installationStatus,
-		installationId: input.installationId,
-		syncStatus: "synced",
-		lastSyncLabel: "GitHub webhook received",
-		lastSyncAt: updatedAt,
-		syncPhase: null,
-		processedRepositoryCount: 0,
-		totalRepositoryCount: 0,
-	});
-
-	const afterRows = await readAllPullRequestRows(c);
-	await emitPullRequestChangeEvents(c, beforeRows, afterRows);
-	if (state === "CLOSED" || state === "MERGED") {
-		const previous = beforeRows.find((row) => row.prId === prId);
-		if (previous) {
-			await autoArchiveTaskForClosedPullRequest(c, {
-				...previous,
-				state,
-			});
-		}
-	}
-}
diff --git a/foundry/packages/backend/src/actors/github-data/workflow.ts b/foundry/packages/backend/src/actors/github-data/workflow.ts
deleted file mode 100644
index 11ece75..0000000
--- a/foundry/packages/backend/src/actors/github-data/workflow.ts
+++ /dev/null
@@ -1,73 +0,0 @@
-// @ts-nocheck
-import { logActorWarning, resolveErrorMessage } from "../logging.js";
-
-// Dynamic imports to break circular dependency: index.ts imports workflow.ts,
-// and workflow.ts needs functions from index.ts.
-async function getIndexModule() {
-	return await import("./index.js");
-}
-
-export const GITHUB_DATA_QUEUE_NAMES = [
-	"githubData.command.syncRepos",
-	"githubData.command.handlePullRequestWebhook",
-	"githubData.command.clearState",
-] as const;
-
-export type GithubDataQueueName = (typeof GITHUB_DATA_QUEUE_NAMES)[number];
-
-export function githubDataWorkflowQueueName(name: GithubDataQueueName): GithubDataQueueName {
-	return name;
-}
-
-/**
- * Plain run handler (no workflow engine). Drains the queue using `c.queue.iter()`
- * with completable messages. This avoids the RivetKit bug where actors created
- * from another actor's workflow context never start their `run: workflow(...)`.
- */
-export async function runGithubDataCommandLoop(c: any): Promise<void> {
-	for await (const msg of c.queue.iter({ names: [...GITHUB_DATA_QUEUE_NAMES], completable: true })) {
-		try {
-			if (msg.name === "githubData.command.syncRepos") {
-				try {
-					const { runFullSync } = await getIndexModule();
-					await runFullSync(c, msg.body);
-					await msg.complete({ ok: true });
-				} catch (error) {
-					const { fullSyncError } = await getIndexModule();
-					try {
-						await fullSyncError(c, error);
-					} catch {
-						/* best effort */
-					}
-					const message = error instanceof Error ? error.message : String(error);
-					await msg.complete({ error: message }).catch(() => {});
-				}
-				continue;
-			}
-
-			if (msg.name === "githubData.command.handlePullRequestWebhook") {
-				const { handlePullRequestWebhookMutation } = await getIndexModule();
-				await handlePullRequestWebhookMutation(c, msg.body);
-				await msg.complete({ ok: true });
-				continue;
-			}
-
-			if (msg.name === "githubData.command.clearState") {
-				const { clearStateMutation } = await getIndexModule();
-				await clearStateMutation(c, msg.body);
-				await msg.complete({ ok: true });
-				continue;
-			}
-
-			logActorWarning("githubData", "unknown queue message", { queueName: msg.name });
-			await msg.complete({ error: `Unknown command: ${msg.name}` });
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("githubData", "github-data command failed", {
-				queueName: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch(() => {});
-		}
-	}
-}
diff --git a/foundry/packages/backend/src/actors/handles.ts b/foundry/packages/backend/src/actors/handles.ts
index 5aa5715..02de614 100644
--- a/foundry/packages/backend/src/actors/handles.ts
+++ b/foundry/packages/backend/src/actors/handles.ts
@@ -1,85 +1,151 @@
-import { auditLogKey, githubDataKey, organizationKey, taskKey, taskSandboxKey, userKey } from "./keys.js";
+import {
+	authUserKey,
+	taskKey,
+	taskStatusSyncKey,
+	historyKey,
+	projectBranchSyncKey,
+	projectKey,
+	projectPrSyncKey,
+	sandboxInstanceKey,
+	workspaceKey,
+} from "./keys.js";
+import type { ProviderId } from "@sandbox-agent/foundry-shared";
 
 export function actorClient(c: any) {
 	return c.client();
 }
 
-export async function getOrCreateOrganization(c: any, organizationId: string) {
-	return await actorClient(c).organization.getOrCreate(organizationKey(organizationId), {
-		createWithInput: organizationId,
+export async function getOrCreateWorkspace(c: any, workspaceId: string) {
+	return await actorClient(c).workspace.getOrCreate(workspaceKey(workspaceId), {
+		createWithInput: workspaceId,
 	});
 }
 
-export async function getOrCreateUser(c: any, userId: string) {
-	return await actorClient(c).user.getOrCreate(userKey(userId), {
+export async function getOrCreateAuthUser(c: any, userId: string) {
+	return await actorClient(c).authUser.getOrCreate(authUserKey(userId), {
 		createWithInput: { userId },
 	});
 }
 
-export function getUser(c: any, userId: string) {
-	return actorClient(c).user.get(userKey(userId));
+export function getAuthUser(c: any, userId: string) {
+	return actorClient(c).authUser.get(authUserKey(userId));
 }
 
-export function getTask(c: any, organizationId: string, repoId: string, taskId: string) {
-	return actorClient(c).task.get(taskKey(organizationId, repoId, taskId));
-}
-
-export async function getOrCreateTask(c: any, organizationId: string, repoId: string, taskId: string, createWithInput: Record<string, unknown>) {
-	return await actorClient(c).task.getOrCreate(taskKey(organizationId, repoId, taskId), {
-		createWithInput,
-	});
-}
-
-export async function getOrCreateAuditLog(c: any, organizationId: string) {
-	return await actorClient(c).auditLog.getOrCreate(auditLogKey(organizationId), {
+export async function getOrCreateProject(c: any, workspaceId: string, repoId: string, remoteUrl: string) {
+	return await actorClient(c).project.getOrCreate(projectKey(workspaceId, repoId), {
 		createWithInput: {
-			organizationId,
+			workspaceId,
+			repoId,
+			remoteUrl,
 		},
 	});
 }
 
-export async function getOrCreateGithubData(c: any, organizationId: string) {
-	return await actorClient(c).githubData.getOrCreate(githubDataKey(organizationId), {
-		createWithInput: {
-			organizationId,
-		},
-	});
+export function getProject(c: any, workspaceId: string, repoId: string) {
+	return actorClient(c).project.get(projectKey(workspaceId, repoId));
 }
 
-export function getGithubData(c: any, organizationId: string) {
-	return actorClient(c).githubData.get(githubDataKey(organizationId));
+export function getTask(c: any, workspaceId: string, repoId: string, taskId: string) {
+	return actorClient(c).task.get(taskKey(workspaceId, repoId, taskId));
 }
 
-export function getTaskSandbox(c: any, organizationId: string, sandboxId: string) {
-	return actorClient(c).taskSandbox.get(taskSandboxKey(organizationId, sandboxId));
-}
-
-export async function getOrCreateTaskSandbox(c: any, organizationId: string, sandboxId: string, createWithInput?: Record<string, unknown>) {
-	return await actorClient(c).taskSandbox.getOrCreate(taskSandboxKey(organizationId, sandboxId), {
+export async function getOrCreateTask(c: any, workspaceId: string, repoId: string, taskId: string, createWithInput: Record<string, unknown>) {
+	return await actorClient(c).task.getOrCreate(taskKey(workspaceId, repoId, taskId), {
 		createWithInput,
 	});
 }
 
-export function selfAuditLog(c: any) {
-	return actorClient(c).auditLog.getForId(c.actorId);
+export async function getOrCreateHistory(c: any, workspaceId: string, repoId: string) {
+	return await actorClient(c).history.getOrCreate(historyKey(workspaceId, repoId), {
+		createWithInput: {
+			workspaceId,
+			repoId,
+		},
+	});
+}
+
+export async function getOrCreateProjectPrSync(c: any, workspaceId: string, repoId: string, repoPath: string, intervalMs: number) {
+	return await actorClient(c).projectPrSync.getOrCreate(projectPrSyncKey(workspaceId, repoId), {
+		createWithInput: {
+			workspaceId,
+			repoId,
+			repoPath,
+			intervalMs,
+		},
+	});
+}
+
+export async function getOrCreateProjectBranchSync(c: any, workspaceId: string, repoId: string, repoPath: string, intervalMs: number) {
+	return await actorClient(c).projectBranchSync.getOrCreate(projectBranchSyncKey(workspaceId, repoId), {
+		createWithInput: {
+			workspaceId,
+			repoId,
+			repoPath,
+			intervalMs,
+		},
+	});
+}
+
+export function getSandboxInstance(c: any, workspaceId: string, providerId: ProviderId, sandboxId: string) {
+	return actorClient(c).sandboxInstance.get(sandboxInstanceKey(workspaceId, providerId, sandboxId));
+}
+
+export async function getOrCreateSandboxInstance(
+	c: any,
+	workspaceId: string,
+	providerId: ProviderId,
+	sandboxId: string,
+	createWithInput: Record<string, unknown>,
+) {
+	return await actorClient(c).sandboxInstance.getOrCreate(sandboxInstanceKey(workspaceId, providerId, sandboxId), { createWithInput });
+}
+
+export async function getOrCreateTaskStatusSync(
+	c: any,
+	workspaceId: string,
+	repoId: string,
+	taskId: string,
+	sandboxId: string,
+	sessionId: string,
+	createWithInput: Record<string, unknown>,
+) {
+	return await actorClient(c).taskStatusSync.getOrCreate(taskStatusSyncKey(workspaceId, repoId, taskId, sandboxId, sessionId), {
+		createWithInput,
+	});
+}
+
+export function selfProjectPrSync(c: any) {
+	return actorClient(c).projectPrSync.getForId(c.actorId);
+}
+
+export function selfProjectBranchSync(c: any) {
+	return actorClient(c).projectBranchSync.getForId(c.actorId);
+}
+
+export function selfTaskStatusSync(c: any) {
+	return actorClient(c).taskStatusSync.getForId(c.actorId);
+}
+
+export function selfHistory(c: any) {
+	return actorClient(c).history.getForId(c.actorId);
 }
 
 export function selfTask(c: any) {
 	return actorClient(c).task.getForId(c.actorId);
 }
 
-export function selfOrganization(c: any) {
-	return actorClient(c).organization.getForId(c.actorId);
+export function selfWorkspace(c: any) {
+	return actorClient(c).workspace.getForId(c.actorId);
 }
 
-export function selfUser(c: any) {
-	return actorClient(c).user.getForId(c.actorId);
+export function selfProject(c: any) {
+	return actorClient(c).project.getForId(c.actorId);
 }
 
-export function selfGithubData(c: any) {
-	return actorClient(c).githubData.getForId(c.actorId);
+export function selfSandboxInstance(c: any) {
+	return actorClient(c).sandboxInstance.getForId(c.actorId);
 }
 
-export function selfTaskSandbox(c: any) {
-	return actorClient(c).taskSandbox.getForId(c.actorId);
+export function selfAuthUser(c: any) {
+	return actorClient(c).authUser.getForId(c.actorId);
 }
diff --git a/foundry/packages/backend/src/actors/user/db/db.ts b/foundry/packages/backend/src/actors/history/db/db.ts
similarity index 70%
rename from foundry/packages/backend/src/actors/user/db/db.ts
rename to foundry/packages/backend/src/actors/history/db/db.ts
index a864893..ef76e36 100644
--- a/foundry/packages/backend/src/actors/user/db/db.ts
+++ b/foundry/packages/backend/src/actors/history/db/db.ts
@@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle";
 import * as schema from "./schema.js";
 import migrations from "./migrations.js";
 
-export const userDb = db({ schema, migrations });
+export const historyDb = db({ schema, migrations });
diff --git a/foundry/packages/backend/src/actors/history/db/drizzle.config.ts b/foundry/packages/backend/src/actors/history/db/drizzle.config.ts
new file mode 100644
index 0000000..3b1d8bd
--- /dev/null
+++ b/foundry/packages/backend/src/actors/history/db/drizzle.config.ts
@@ -0,0 +1,6 @@
+import { defineConfig } from "rivetkit/db/drizzle";
+
+export default defineConfig({
+	out: "./src/actors/history/db/drizzle",
+	schema: "./src/actors/history/db/schema.ts",
+});
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/0000_fluffy_kid_colt.sql b/foundry/packages/backend/src/actors/history/db/drizzle/0000_fluffy_kid_colt.sql
similarity index 100%
rename from foundry/packages/backend/src/actors/audit-log/db/drizzle/0000_fluffy_kid_colt.sql
rename to foundry/packages/backend/src/actors/history/db/drizzle/0000_fluffy_kid_colt.sql
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/history/db/drizzle/meta/0000_snapshot.json
similarity index 100%
rename from foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0000_snapshot.json
rename to foundry/packages/backend/src/actors/history/db/drizzle/meta/0000_snapshot.json
diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json
similarity index 59%
rename from foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json
rename to foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json
index 0393be2..93cf8ce 100644
--- a/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json
+++ b/foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json
@@ -8,13 +8,6 @@
       "when": 1773376223815,
       "tag": "0000_fluffy_kid_colt",
       "breakpoints": true
-    },
-    {
-      "idx": 1,
-      "version": "6",
-      "when": 1773376223816,
-      "tag": "0001_add_repo_id",
-      "breakpoints": true
     }
   ]
 }
diff --git a/foundry/packages/backend/src/actors/audit-log/db/migrations.ts b/foundry/packages/backend/src/actors/history/db/migrations.ts
similarity index 78%
rename from foundry/packages/backend/src/actors/audit-log/db/migrations.ts
rename to foundry/packages/backend/src/actors/history/db/migrations.ts
index 5bf9b5a..766c225 100644
--- a/foundry/packages/backend/src/actors/audit-log/db/migrations.ts
+++ b/foundry/packages/backend/src/actors/history/db/migrations.ts
@@ -10,12 +10,6 @@ const journal = {
 			tag: "0000_fluffy_kid_colt",
 			breakpoints: true,
 		},
-		{
-			idx: 1,
-			when: 1773376223816,
-			tag: "0001_add_repo_id",
-			breakpoints: true,
-		},
 	],
 } as const;
@@ -30,8 +24,6 @@ export default {
 \`payload_json\` text NOT NULL,
 \`created_at\` integer NOT NULL
 );
-`,
-	m0001: `ALTER TABLE \`events\` ADD COLUMN \`repo_id\` text;
 `,
 	} as const,
 };
diff --git a/foundry/packages/backend/src/actors/audit-log/db/schema.ts b/foundry/packages/backend/src/actors/history/db/schema.ts
similarity index 77%
rename from foundry/packages/backend/src/actors/audit-log/db/schema.ts
rename to foundry/packages/backend/src/actors/history/db/schema.ts
index d275dd4..80eb7f4 100644
--- a/foundry/packages/backend/src/actors/audit-log/db/schema.ts
+++ b/foundry/packages/backend/src/actors/history/db/schema.ts
@@ -2,11 +2,10 @@ import { integer, sqliteTable, text } from "rivetkit/db/drizzle";
 
 export const events = sqliteTable("events", {
 	id: integer("id").primaryKey({ autoIncrement: true }),
-	repoId: text("repo_id"),
 	taskId: text("task_id"),
 	branchName: text("branch_name"),
 	kind: text("kind").notNull(),
-	// Structured by the audit-log event kind definitions in application code.
+	// Structured by the history event kind definitions in application code.
 	payloadJson: text("payload_json").notNull(),
 	createdAt: integer("created_at").notNull(),
 });
diff --git a/foundry/packages/backend/src/actors/history/index.ts b/foundry/packages/backend/src/actors/history/index.ts
new file mode 100644
index 0000000..d2caa12
--- /dev/null
+++ b/foundry/packages/backend/src/actors/history/index.ts
@@ -0,0 +1,115 @@
+// @ts-nocheck
+import { and, desc, eq } from "drizzle-orm";
+import { actor, queue } from "rivetkit";
+import { Loop, workflow } from "rivetkit/workflow";
+import type { HistoryEvent } from "@sandbox-agent/foundry-shared";
+import { selfHistory } from "../handles.js";
+import { historyDb } from "./db/db.js";
+import { events } from "./db/schema.js";
+
+export interface HistoryInput {
+	workspaceId: string;
+	repoId: string;
+}
+
+export interface AppendHistoryCommand {
+	kind: string;
+	taskId?: string;
+	branchName?: string;
+	payload: Record<string, unknown>;
+}
+
+export interface ListHistoryParams {
+	branch?: string;
+	taskId?: string;
+	limit?: number;
+}
+
+const HISTORY_QUEUE_NAMES = ["history.command.append"] as const;
+
+async function appendHistoryRow(loopCtx: any, body: AppendHistoryCommand): Promise<void> {
+	const now = Date.now();
+	await loopCtx.db
+		.insert(events)
+		.values({
+			taskId: body.taskId ?? null,
+			branchName: body.branchName ?? null,
+			kind: body.kind,
+			payloadJson: JSON.stringify(body.payload),
+			createdAt: now,
+		})
+		.run();
+}
+
+async function runHistoryWorkflow(ctx: any): Promise<void> {
+	await ctx.loop("history-command-loop", async (loopCtx: any) => {
+		const msg = await loopCtx.queue.next("next-history-command", {
+			names: [...HISTORY_QUEUE_NAMES],
+			completable: true,
+		});
+		if (!msg) {
+			return Loop.continue(undefined);
+		}
+
+		if (msg.name === "history.command.append") {
+			await loopCtx.step("append-history-row", async () => appendHistoryRow(loopCtx, msg.body as AppendHistoryCommand));
+			await msg.complete({ ok: true });
+		}
+
+		return Loop.continue(undefined);
+	});
+}
+
+export const history = actor({
+	db: historyDb,
+	queues: {
+		"history.command.append": queue(),
+	},
+	options: {
+		name: "History",
+		icon: "database",
+	},
+	createState: (_c, input: HistoryInput) => ({
+		workspaceId: input.workspaceId,
+		repoId: input.repoId,
+	}),
+	actions: {
+		async append(c, command: AppendHistoryCommand): Promise<void> {
+			const self = selfHistory(c);
+			await self.send("history.command.append", command, { wait: true, timeout: 15_000 });
+		},
+
+		async list(c, params?: ListHistoryParams): Promise<HistoryEvent[]> {
+			const whereParts = [];
+			if (params?.taskId) {
+				whereParts.push(eq(events.taskId, params.taskId));
+			}
+			if (params?.branch) {
+				whereParts.push(eq(events.branchName, params.branch));
+			}
+
+			const base = c.db
+				.select({
+					id: events.id,
+					taskId: events.taskId,
+					branchName: events.branchName,
+					kind: events.kind,
+					payloadJson: events.payloadJson,
+					createdAt: events.createdAt,
+				})
+				.from(events);
+
+			const rows = await (whereParts.length > 0 ? base.where(and(...whereParts)) : base)
+				.orderBy(desc(events.createdAt))
+				.limit(params?.limit ?? 100)
+				.all();
+
+			return rows.map((row) => ({
+				...row,
+				workspaceId: c.state.workspaceId,
+				repoId: c.state.repoId,
+			}));
+		},
+	},
+	run: workflow(runHistoryWorkflow),
+});
diff --git a/foundry/packages/backend/src/actors/index.ts b/foundry/packages/backend/src/actors/index.ts
index 74ede4a..245b6a4 100644
--- a/foundry/packages/backend/src/actors/index.ts
+++ b/foundry/packages/backend/src/actors/index.ts
@@ -1,38 +1,49 @@
-import { user } from "./user/index.js";
+import { authUser } from "./auth-user/index.js";
 import { setup } from "rivetkit";
-import { githubData } from "./github-data/index.js";
+import { taskStatusSync } from "./task-status-sync/index.js";
 import { task } from "./task/index.js";
-import { auditLog } from "./audit-log/index.js";
-import { taskSandbox } from "./sandbox/index.js";
-import { organization } from "./organization/index.js";
+import { history } from "./history/index.js";
+import { projectBranchSync } from "./project-branch-sync/index.js";
+import { projectPrSync } from "./project-pr-sync/index.js";
+import { project } from "./project/index.js";
+import { sandboxInstance } from "./sandbox-instance/index.js";
+import { workspace } from "./workspace/index.js";
 import { logger } from "../logging.js";
-import { resolveRunnerVersion } from "../config/runner-version.js";
 
-const runnerVersion = resolveRunnerVersion();
+const RUNNER_VERSION = Math.floor(Date.now() / 1000);
 
 export const registry = setup({
 	serverless: {
 		basePath: "/v1/rivet",
 	},
-	runner: { version: runnerVersion },
+	runner: {
+		version: RUNNER_VERSION,
+	},
 	logging: {
 		baseLogger: logger,
 	},
 	use: {
-		user,
-		organization,
+		authUser,
+		workspace,
+		project,
 		task,
-		taskSandbox,
-		auditLog,
-		githubData,
+		sandboxInstance,
+		history,
+		projectPrSync,
+		projectBranchSync,
+		taskStatusSync,
 	},
 });
 
 export * from "./context.js";
-export * from "./audit-log/index.js";
-export * from "./user/index.js";
-export * from "./github-data/index.js";
+export * from "./events.js";
+export
* from "./auth-user/index.js"; +export * from "./task-status-sync/index.js"; export * from "./task/index.js"; +export * from "./history/index.js"; export * from "./keys.js"; -export * from "./sandbox/index.js"; -export * from "./organization/index.js"; +export * from "./project-branch-sync/index.js"; +export * from "./project-pr-sync/index.js"; +export * from "./project/index.js"; +export * from "./sandbox-instance/index.js"; +export * from "./workspace/index.js"; diff --git a/foundry/packages/backend/src/actors/keys.ts b/foundry/packages/backend/src/actors/keys.ts index 03bd014..bec675f 100644 --- a/foundry/packages/backend/src/actors/keys.ts +++ b/foundry/packages/backend/src/actors/keys.ts @@ -1,26 +1,38 @@ export type ActorKey = string[]; -export function organizationKey(organizationId: string): ActorKey { - return ["org", organizationId]; +export function workspaceKey(workspaceId: string): ActorKey { + return ["ws", workspaceId]; } -export function userKey(userId: string): ActorKey { - return ["org", "app", "user", userId]; +export function authUserKey(userId: string): ActorKey { + return ["ws", "app", "user", userId]; } -export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey { - return ["org", organizationId, "task", repoId, taskId]; +export function projectKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId]; } -export function taskSandboxKey(organizationId: string, sandboxId: string): ActorKey { - return ["org", organizationId, "sandbox", sandboxId]; +export function taskKey(workspaceId: string, repoId: string, taskId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "task", taskId]; } -/** One audit log per org (not per repo) — see audit-log/index.ts for rationale. 
*/ -export function auditLogKey(organizationId: string): ActorKey { - return ["org", organizationId, "audit-log"]; +export function sandboxInstanceKey(workspaceId: string, providerId: string, sandboxId: string): ActorKey { + return ["ws", workspaceId, "provider", providerId, "sandbox", sandboxId]; } -export function githubDataKey(organizationId: string): ActorKey { - return ["org", organizationId, "github-data"]; +export function historyKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "history"]; +} + +export function projectPrSyncKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "pr-sync"]; +} + +export function projectBranchSyncKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "branch-sync"]; +} + +export function taskStatusSyncKey(workspaceId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey { + // Include sandbox + session so multiple sandboxes/sessions can be tracked per task. 
+ return ["ws", workspaceId, "project", repoId, "task", taskId, "status-sync", sandboxId, sessionId]; } diff --git a/foundry/packages/backend/src/actors/logging.ts b/foundry/packages/backend/src/actors/logging.ts index a61685f..6a4616a 100644 --- a/foundry/packages/backend/src/actors/logging.ts +++ b/foundry/packages/backend/src/actors/logging.ts @@ -2,11 +2,7 @@ import { logger } from "../logging.js"; export function resolveErrorMessage(error: unknown): string { if (error instanceof Error) { - let msg = error.message; - if (error.cause) { - msg += ` [cause: ${resolveErrorMessage(error.cause)}]`; - } - return msg; + return error.message; } return String(error); } @@ -22,16 +18,6 @@ export function resolveErrorStack(error: unknown): string | undefined { return undefined; } -export function logActorInfo(scope: string, message: string, context?: Record): void { - logger.info( - { - scope, - ...(context ?? {}), - }, - message, - ); -} - export function logActorWarning(scope: string, message: string, context?: Record): void { logger.warn( { diff --git a/foundry/packages/backend/src/actors/organization/actions.ts b/foundry/packages/backend/src/actors/organization/actions.ts deleted file mode 100644 index 2298cd9..0000000 --- a/foundry/packages/backend/src/actors/organization/actions.ts +++ /dev/null @@ -1,253 +0,0 @@ -// @ts-nocheck -import { desc, eq } from "drizzle-orm"; -import type { - RepoRecord, - WorkspaceRepositorySummary, - WorkspaceTaskSummary, - OrganizationEvent, - OrganizationGithubSummary, - OrganizationSummarySnapshot, - OrganizationUseInput, -} from "@sandbox-agent/foundry-shared"; -import { logActorWarning, resolveErrorMessage } from "../logging.js"; -import { getOrCreateGithubData } from "../handles.js"; -import { organizationProfile, taskSummaries } from "./db/schema.js"; -import { organizationAppActions } from "./actions/app.js"; -import { organizationBetterAuthActions } from "./actions/better-auth.js"; -import { organizationOnboardingActions } from 
"./actions/onboarding.js"; -import { organizationGithubActions } from "./actions/github.js"; -import { organizationShellActions } from "./actions/organization.js"; -import { organizationTaskActions } from "./actions/tasks.js"; -import { updateOrganizationShellProfileMutation } from "./app-shell.js"; - -interface OrganizationState { - organizationId: string; -} - -const ORGANIZATION_PROFILE_ROW_ID = 1; - -function assertOrganization(c: { state: OrganizationState }, organizationId: string): void { - if (organizationId !== c.state.organizationId) { - throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} command=${organizationId}`); - } -} - -function repoLabelFromRemote(remoteUrl: string): string { - try { - const url = new URL(remoteUrl.startsWith("http") ? remoteUrl : `https://${remoteUrl}`); - const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); - if (parts.length >= 2) { - return `${parts[0]}/${(parts[1] ?? "").replace(/\.git$/, "")}`; - } - } catch { - // ignore - } - - return remoteUrl; -} - -function buildGithubSummary(profile: any, importedRepoCount: number): OrganizationGithubSummary { - return { - connectedAccount: profile?.githubConnectedAccount ?? "", - installationStatus: profile?.githubInstallationStatus ?? "install_required", - syncStatus: profile?.githubSyncStatus ?? "pending", - importedRepoCount, - lastSyncLabel: profile?.githubLastSyncLabel ?? "Waiting for first import", - lastSyncAt: profile?.githubLastSyncAt ?? null, - lastWebhookAt: profile?.githubLastWebhookAt ?? null, - lastWebhookEvent: profile?.githubLastWebhookEvent ?? "", - syncGeneration: profile?.githubSyncGeneration ?? 0, - syncPhase: profile?.githubSyncPhase ?? null, - processedRepositoryCount: profile?.githubProcessedRepositoryCount ?? 0, - totalRepositoryCount: profile?.githubTotalRepositoryCount ?? 0, - }; -} - -/** - * Reads the organization sidebar snapshot from local tables only — no fan-out - * to child actors. 
Task summaries are organization-owned and updated via push - * from task actors. - */ -async function getOrganizationSummarySnapshot(c: any): Promise { - const profile = await c.db.select().from(organizationProfile).where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)).get(); - - // Fetch repos + open PRs from github-data actor (single actor, not fan-out) - let repoRows: Array<{ repoId: string; fullName: string; cloneUrl: string; private: boolean; defaultBranch: string }> = []; - let openPullRequests: any[] = []; - try { - const githubData = await getOrCreateGithubData(c, c.state.organizationId); - [repoRows, openPullRequests] = await Promise.all([githubData.listRepositories({}), githubData.listOpenPullRequests({})]); - } catch { - // github-data actor may not exist yet - } - - const summaryRows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); - const summaries = summaryRows.map((row) => ({ - id: row.taskId, - repoId: row.repoId, - title: row.title, - status: row.status, - repoName: row.repoName, - updatedAtMs: row.updatedAtMs, - branch: row.branch ?? null, - pullRequest: row.pullRequestJson - ? (() => { - try { - return JSON.parse(row.pullRequestJson); - } catch { - return null; - } - })() - : null, - sessionsSummary: row.sessionsSummaryJson - ? 
(() => { - try { - return JSON.parse(row.sessionsSummaryJson); - } catch { - return []; - } - })() - : [], - })); - - return { - organizationId: c.state.organizationId, - github: buildGithubSummary(profile, repoRows.length), - repos: repoRows - .map((repo) => { - const repoTasks = summaries.filter((t) => t.repoId === repo.repoId); - const latestTaskMs = repoTasks.reduce((latest, t) => Math.max(latest, t.updatedAtMs), 0); - return { - id: repo.repoId, - label: repoLabelFromRemote(repo.cloneUrl), - taskCount: repoTasks.length, - latestActivityMs: latestTaskMs || Date.now(), - }; - }) - .sort((a, b) => b.latestActivityMs - a.latestActivityMs), - taskSummaries: summaries, - openPullRequests, - }; -} - -export async function refreshOrganizationSnapshotMutation(c: any): Promise { - c.broadcast("organizationUpdated", { - type: "organizationUpdated", - snapshot: await getOrganizationSummarySnapshot(c), - } satisfies OrganizationEvent); -} - -export const organizationActions = { - ...organizationBetterAuthActions, - ...organizationGithubActions, - ...organizationOnboardingActions, - ...organizationShellActions, - ...organizationAppActions, - ...organizationTaskActions, - async useOrganization(c: any, input: OrganizationUseInput): Promise<{ organizationId: string }> { - assertOrganization(c, input.organizationId); - return { organizationId: c.state.organizationId }; - }, - - async listRepos(c: any, input: OrganizationUseInput): Promise { - assertOrganization(c, input.organizationId); - try { - const githubData = await getOrCreateGithubData(c, c.state.organizationId); - const rows = await githubData.listRepositories({}); - return rows.map((row: any) => ({ - organizationId: c.state.organizationId, - repoId: row.repoId, - remoteUrl: row.cloneUrl, - createdAt: row.updatedAt ?? Date.now(), - updatedAt: row.updatedAt ?? 
Date.now(), - })); - } catch { - return []; - } - }, - - async getOrganizationSummary(c: any, input: OrganizationUseInput): Promise { - assertOrganization(c, input.organizationId); - return await getOrganizationSummarySnapshot(c); - }, - - // updateShellProfile stays as a direct action — called with await from HTTP handler where the user can retry - async updateShellProfile(c: any, input: { displayName?: string; slug?: string; primaryDomain?: string }): Promise { - await updateOrganizationShellProfileMutation(c, input); - }, -}; - -export async function applyGithubSyncProgressMutation( - c: any, - input: { - connectedAccount: string; - installationStatus: string; - installationId: number | null; - syncStatus: string; - lastSyncLabel: string; - lastSyncAt: number | null; - syncGeneration: number; - syncPhase: string | null; - processedRepositoryCount: number; - totalRepositoryCount: number; - }, -): Promise { - const profile = await c.db - .select({ id: organizationProfile.id }) - .from(organizationProfile) - .where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)) - .get(); - if (!profile) { - return; - } - - await c.db - .update(organizationProfile) - .set({ - githubConnectedAccount: input.connectedAccount, - githubInstallationStatus: input.installationStatus, - githubSyncStatus: input.syncStatus, - githubInstallationId: input.installationId, - githubLastSyncLabel: input.lastSyncLabel, - githubLastSyncAt: input.lastSyncAt, - githubSyncGeneration: input.syncGeneration, - githubSyncPhase: input.syncPhase, - githubProcessedRepositoryCount: input.processedRepositoryCount, - githubTotalRepositoryCount: input.totalRepositoryCount, - updatedAt: Date.now(), - }) - .where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)) - .run(); - - await refreshOrganizationSnapshotMutation(c); -} - -export async function recordGithubWebhookReceiptMutation( - c: any, - input: { - organizationId: string; - event: string; - action?: string | null; - receivedAt?: number; - }, 
-): Promise { - assertOrganization(c, input.organizationId); - - const profile = await c.db - .select({ id: organizationProfile.id }) - .from(organizationProfile) - .where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)) - .get(); - if (!profile) { - return; - } - - await c.db - .update(organizationProfile) - .set({ - githubLastWebhookAt: input.receivedAt ?? Date.now(), - githubLastWebhookEvent: input.action ? `${input.event}.${input.action}` : input.event, - }) - .where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)) - .run(); -} diff --git a/foundry/packages/backend/src/actors/organization/actions/app.ts b/foundry/packages/backend/src/actors/organization/actions/app.ts deleted file mode 100644 index d3cc329..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/app.ts +++ /dev/null @@ -1 +0,0 @@ -export { organizationAppActions } from "../app-shell.js"; diff --git a/foundry/packages/backend/src/actors/organization/actions/better-auth.ts b/foundry/packages/backend/src/actors/organization/actions/better-auth.ts deleted file mode 100644 index 060ceed..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/better-auth.ts +++ /dev/null @@ -1,360 +0,0 @@ -import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm"; -import { authAccountIndex, authEmailIndex, authSessionIndex, authVerification } from "../db/schema.js"; -import { APP_SHELL_ORGANIZATION_ID } from "../constants.js"; - -function assertAppOrganization(c: any): void { - if (c.state.organizationId !== APP_SHELL_ORGANIZATION_ID) { - throw new Error(`App shell action requires organization ${APP_SHELL_ORGANIZATION_ID}, got ${c.state.organizationId}`); - } -} - -function organizationAuthColumn(table: any, field: string): any { - const column = table[field]; - if (!column) { - throw new Error(`Unknown auth table field: ${field}`); - } - return column; -} - -function 
normalizeAuthValue(value: unknown): unknown { - if (value instanceof Date) { - return value.getTime(); - } - if (Array.isArray(value)) { - return value.map((entry) => normalizeAuthValue(entry)); - } - return value; -} - -function organizationAuthClause(table: any, clause: { field: string; value: unknown; operator?: string }): any { - const column = organizationAuthColumn(table, clause.field); - const value = normalizeAuthValue(clause.value); - switch (clause.operator) { - case "ne": - return value === null ? isNotNull(column) : ne(column, value as any); - case "lt": - return lt(column, value as any); - case "lte": - return lte(column, value as any); - case "gt": - return gt(column, value as any); - case "gte": - return gte(column, value as any); - case "in": - return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "not_in": - return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "contains": - return like(column, `%${String(value ?? "")}%`); - case "starts_with": - return like(column, `${String(value ?? "")}%`); - case "ends_with": - return like(column, `%${String(value ?? "")}`); - case "eq": - default: - return value === null ? isNull(column) : eq(column, value as any); - } -} - -function organizationBetterAuthWhere(table: any, clauses: any[] | undefined): any { - if (!clauses || clauses.length === 0) { - return undefined; - } - let expr = organizationAuthClause(table, clauses[0]); - for (const clause of clauses.slice(1)) { - const next = organizationAuthClause(table, clause); - expr = clause.connector === "OR" ? 
or(expr, next) : and(expr, next); - } - return expr; -} - -export async function betterAuthUpsertSessionIndexMutation(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { - assertAppOrganization(c); - - const now = Date.now(); - await c.db - .insert(authSessionIndex) - .values({ - sessionId: input.sessionId, - sessionToken: input.sessionToken, - userId: input.userId, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: authSessionIndex.sessionId, - set: { - sessionToken: input.sessionToken, - userId: input.userId, - updatedAt: now, - }, - }) - .run(); - return await c.db.select().from(authSessionIndex).where(eq(authSessionIndex.sessionId, input.sessionId)).get(); -} - -export async function betterAuthDeleteSessionIndexMutation(c: any, input: { sessionId?: string; sessionToken?: string }) { - assertAppOrganization(c); - - const clauses = [ - ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), - ...(input.sessionToken ? 
[{ field: "sessionToken", value: input.sessionToken }] : []), - ]; - if (clauses.length === 0) { - return; - } - const predicate = organizationBetterAuthWhere(authSessionIndex, clauses); - await c.db.delete(authSessionIndex).where(predicate!).run(); -} - -export async function betterAuthUpsertEmailIndexMutation(c: any, input: { email: string; userId: string }) { - assertAppOrganization(c); - - const now = Date.now(); - await c.db - .insert(authEmailIndex) - .values({ - email: input.email, - userId: input.userId, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: authEmailIndex.email, - set: { - userId: input.userId, - updatedAt: now, - }, - }) - .run(); - return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); -} - -export async function betterAuthDeleteEmailIndexMutation(c: any, input: { email: string }) { - assertAppOrganization(c); - await c.db.delete(authEmailIndex).where(eq(authEmailIndex.email, input.email)).run(); -} - -export async function betterAuthUpsertAccountIndexMutation(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) { - assertAppOrganization(c); - - const now = Date.now(); - await c.db - .insert(authAccountIndex) - .values({ - id: input.id, - providerId: input.providerId, - accountId: input.accountId, - userId: input.userId, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: authAccountIndex.id, - set: { - providerId: input.providerId, - accountId: input.accountId, - userId: input.userId, - updatedAt: now, - }, - }) - .run(); - return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); -} - -export async function betterAuthDeleteAccountIndexMutation(c: any, input: { id?: string; providerId?: string; accountId?: string }) { - assertAppOrganization(c); - - if (input.id) { - await c.db.delete(authAccountIndex).where(eq(authAccountIndex.id, input.id)).run(); - return; - } - if (input.providerId && input.accountId) { - 
await c.db - .delete(authAccountIndex) - .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) - .run(); - } -} - -export async function betterAuthCreateVerificationMutation(c: any, input: { data: Record }) { - assertAppOrganization(c); - - await c.db - .insert(authVerification) - .values(input.data as any) - .run(); - return await c.db - .select() - .from(authVerification) - .where(eq(authVerification.id, input.data.id as string)) - .get(); -} - -export async function betterAuthUpdateVerificationMutation(c: any, input: { where: any[]; update: Record }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - if (!predicate) { - return null; - } - await c.db - .update(authVerification) - .set(input.update as any) - .where(predicate) - .run(); - return await c.db.select().from(authVerification).where(predicate).get(); -} - -export async function betterAuthUpdateManyVerificationMutation(c: any, input: { where: any[]; update: Record }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - if (!predicate) { - return 0; - } - await c.db - .update(authVerification) - .set(input.update as any) - .where(predicate) - .run(); - const row = await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get(); - return row?.value ?? 
0; -} - -export async function betterAuthDeleteVerificationMutation(c: any, input: { where: any[] }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - if (!predicate) { - return; - } - await c.db.delete(authVerification).where(predicate).run(); -} - -export async function betterAuthDeleteManyVerificationMutation(c: any, input: { where: any[] }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - if (!predicate) { - return 0; - } - const rows = await c.db.select().from(authVerification).where(predicate).all(); - await c.db.delete(authVerification).where(predicate).run(); - return rows.length; -} - -// Exception to the CLAUDE.md queue-for-mutations rule: Better Auth adapter operations -// use direct actions even for mutations. Better Auth runs during OAuth callbacks on the -// HTTP request path, not through the normal organization lifecycle. Routing through the -// queue adds multiple sequential round-trips (each with actor wake-up + step overhead) -// that cause 30-second OAuth callbacks and proxy retry storms. These mutations are simple -// SQLite upserts/deletes with no cross-actor coordination or broadcast side effects. 
-export const organizationBetterAuthActions = { - // --- Mutation actions (called by the Better Auth adapter in better-auth.ts) --- - async betterAuthUpsertSessionIndex(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { - return await betterAuthUpsertSessionIndexMutation(c, input); - }, - async betterAuthDeleteSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { - await betterAuthDeleteSessionIndexMutation(c, input); - }, - async betterAuthUpsertEmailIndex(c: any, input: { email: string; userId: string }) { - return await betterAuthUpsertEmailIndexMutation(c, input); - }, - async betterAuthDeleteEmailIndex(c: any, input: { email: string }) { - await betterAuthDeleteEmailIndexMutation(c, input); - }, - async betterAuthUpsertAccountIndex(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) { - return await betterAuthUpsertAccountIndexMutation(c, input); - }, - async betterAuthDeleteAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { - await betterAuthDeleteAccountIndexMutation(c, input); - }, - async betterAuthCreateVerification(c: any, input: { data: Record }) { - return await betterAuthCreateVerificationMutation(c, input); - }, - async betterAuthUpdateVerification(c: any, input: { where: any[]; update: Record }) { - return await betterAuthUpdateVerificationMutation(c, input); - }, - async betterAuthUpdateManyVerification(c: any, input: { where: any[]; update: Record }) { - return await betterAuthUpdateManyVerificationMutation(c, input); - }, - async betterAuthDeleteVerification(c: any, input: { where: any[] }) { - await betterAuthDeleteVerificationMutation(c, input); - }, - async betterAuthDeleteManyVerification(c: any, input: { where: any[] }) { - return await betterAuthDeleteManyVerificationMutation(c, input); - }, - - // --- Read actions --- - async betterAuthFindSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { - 
assertAppOrganization(c); - - const clauses = [ - ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), - ...(input.sessionToken ? [{ field: "sessionToken", value: input.sessionToken }] : []), - ]; - if (clauses.length === 0) { - return null; - } - const predicate = organizationBetterAuthWhere(authSessionIndex, clauses); - return await c.db.select().from(authSessionIndex).where(predicate!).get(); - }, - - async betterAuthFindEmailIndex(c: any, input: { email: string }) { - assertAppOrganization(c); - return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); - }, - - async betterAuthFindAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { - assertAppOrganization(c); - - if (input.id) { - return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); - } - if (!input.providerId || !input.accountId) { - return null; - } - return await c.db - .select() - .from(authAccountIndex) - .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) - .get(); - }, - - async betterAuthFindOneVerification(c: any, input: { where: any[] }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - return predicate ? await c.db.select().from(authVerification).where(predicate).get() : null; - }, - - async betterAuthFindManyVerification(c: any, input: { where?: any[]; limit?: number; sortBy?: any; offset?: number }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - let query = c.db.select().from(authVerification); - if (predicate) { - query = query.where(predicate); - } - if (input.sortBy?.field) { - const column = organizationAuthColumn(authVerification, input.sortBy.field); - query = query.orderBy(input.sortBy.direction === "asc" ? 
asc(column) : desc(column)); - } - if (typeof input.limit === "number") { - query = query.limit(input.limit); - } - if (typeof input.offset === "number") { - query = query.offset(input.offset); - } - return await query.all(); - }, - - async betterAuthCountVerification(c: any, input: { where?: any[] }) { - assertAppOrganization(c); - - const predicate = organizationBetterAuthWhere(authVerification, input.where); - const row = predicate - ? await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get() - : await c.db.select({ value: sqlCount() }).from(authVerification).get(); - return row?.value ?? 0; - }, -}; diff --git a/foundry/packages/backend/src/actors/organization/actions/github.ts b/foundry/packages/backend/src/actors/organization/actions/github.ts deleted file mode 100644 index 43818c0..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/github.ts +++ /dev/null @@ -1,80 +0,0 @@ -import { desc } from "drizzle-orm"; -import type { FoundryAppSnapshot } from "@sandbox-agent/foundry-shared"; -import { getOrCreateGithubData, getOrCreateOrganization } from "../../handles.js"; -import { githubDataWorkflowQueueName } from "../../github-data/index.js"; -import { authSessionIndex } from "../db/schema.js"; -import { assertAppOrganization, buildAppSnapshot, requireEligibleOrganization, requireSignedInSession } from "../app-shell.js"; -import { getBetterAuthService } from "../../../services/better-auth.js"; -import { refreshOrganizationSnapshotMutation } from "../actions.js"; -import { organizationWorkflowQueueName } from "../queues.js"; - -export const organizationGithubActions = { - async resolveAppGithubToken( - c: any, - input: { organizationId: string; requireRepoScope?: boolean }, - ): Promise<{ accessToken: string; scopes: string[] } | null> { - assertAppOrganization(c); - const auth = getBetterAuthService(); - const rows = await c.db.select().from(authSessionIndex).orderBy(desc(authSessionIndex.updatedAt)).all(); - - for 
(const row of rows) { - const authState = await auth.getAuthState(row.sessionId); - if (authState?.sessionState?.activeOrganizationId !== input.organizationId) { - continue; - } - - const token = await auth.getAccessTokenForSession(row.sessionId); - if (!token?.accessToken) { - continue; - } - - const scopes = token.scopes; - if (input.requireRepoScope !== false && scopes.length > 0 && !scopes.some((scope) => scope === "repo" || scope.startsWith("repo:"))) { - continue; - } - - return { - accessToken: token.accessToken, - scopes, - }; - } - - return null; - }, - - async triggerAppRepoImport(c: any, input: { sessionId: string; organizationId: string }): Promise { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - - const githubData = await getOrCreateGithubData(c, input.organizationId); - const summary = await githubData.getSummary({}); - if (summary.syncStatus === "syncing") { - return await buildAppSnapshot(c, input.sessionId); - } - - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - await organizationHandle.send( - organizationWorkflowQueueName("organization.command.shell.sync_started.mark"), - { label: "Importing repository catalog..." }, - { wait: false }, - ); - await organizationHandle.send(organizationWorkflowQueueName("organization.command.snapshot.broadcast"), {}, { wait: false }); - - void githubData - .send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Importing repository catalog..." }, { wait: false }) - .catch(() => {}); - - return await buildAppSnapshot(c, input.sessionId); - }, - - async adminReloadGithubOrganization(c: any): Promise { - const githubData = await getOrCreateGithubData(c, c.state.organizationId); - await githubData.send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Reloading GitHub organization..." 
}, { wait: false }); - }, - - async adminReloadGithubRepository(c: any, _input: { repoId: string }): Promise { - const githubData = await getOrCreateGithubData(c, c.state.organizationId); - await githubData.send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Reloading repository..." }, { wait: false }); - }, -}; diff --git a/foundry/packages/backend/src/actors/organization/actions/onboarding.ts b/foundry/packages/backend/src/actors/organization/actions/onboarding.ts deleted file mode 100644 index 22153f4..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/onboarding.ts +++ /dev/null @@ -1,82 +0,0 @@ -import { randomUUID } from "node:crypto"; -import type { FoundryAppSnapshot, StarSandboxAgentRepoInput, StarSandboxAgentRepoResult } from "@sandbox-agent/foundry-shared"; -import { getOrCreateGithubData, getOrCreateOrganization } from "../../handles.js"; -import { - assertAppOrganization, - buildAppSnapshot, - getOrganizationState, - requireEligibleOrganization, - requireSignedInSession, -} from "../app-shell.js"; -import { getBetterAuthService } from "../../../services/better-auth.js"; -import { getActorRuntimeContext } from "../../context.js"; -import { resolveOrganizationGithubAuth } from "../../../services/github-auth.js"; - -const SANDBOX_AGENT_REPO = "rivet-dev/sandbox-agent"; - -export const organizationOnboardingActions = { - async skipAppStarterRepo(c: any, input: { sessionId: string }): Promise { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - await getBetterAuthService().upsertUserProfile(session.authUserId, { - starterRepoStatus: "skipped", - starterRepoSkippedAt: Date.now(), - starterRepoStarredAt: null, - }); - return await buildAppSnapshot(c, input.sessionId); - }, - - async starAppStarterRepo(c: any, input: { sessionId: string; organizationId: string }): Promise { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - 
requireEligibleOrganization(session, input.organizationId); - const organization = await getOrCreateOrganization(c, input.organizationId); - await organization.starSandboxAgentRepo({ - organizationId: input.organizationId, - }); - await getBetterAuthService().upsertUserProfile(session.authUserId, { - starterRepoStatus: "starred", - starterRepoStarredAt: Date.now(), - starterRepoSkippedAt: null, - }); - return await buildAppSnapshot(c, input.sessionId); - }, - - async selectAppOrganization(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - await getBetterAuthService().setActiveOrganization(input.sessionId, input.organizationId); - await getOrCreateGithubData(c, input.organizationId); - return await buildAppSnapshot(c, input.sessionId); - }, - - async beginAppGithubInstall(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - const { appShell } = getActorRuntimeContext(); - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - const organizationState = await getOrganizationState(organizationHandle); - if (organizationState.snapshot.kind !== "organization") { - return { - url: `${appShell.appUrl}/organizations/${input.organizationId}`, - }; - } - return { - url: await appShell.github.buildInstallationUrl(organizationState.githubLogin, randomUUID()), - }; - }, - - async starSandboxAgentRepo(c: any, input: StarSandboxAgentRepoInput): Promise<StarSandboxAgentRepoResult> { - const { driver } = getActorRuntimeContext(); - const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId); - await driver.github.starRepository(SANDBOX_AGENT_REPO, { - githubToken: auth?.githubToken ??
null, - }); - return { - repo: SANDBOX_AGENT_REPO, - starredAt: Date.now(), - }; - }, -}; diff --git a/foundry/packages/backend/src/actors/organization/actions/organization.ts b/foundry/packages/backend/src/actors/organization/actions/organization.ts deleted file mode 100644 index 9e1cbd6..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/organization.ts +++ /dev/null @@ -1,53 +0,0 @@ -import type { FoundryAppSnapshot, UpdateFoundryOrganizationProfileInput, WorkspaceModelId } from "@sandbox-agent/foundry-shared"; -import { getBetterAuthService } from "../../../services/better-auth.js"; -import { getOrCreateOrganization } from "../../handles.js"; -import { - assertAppOrganization, - assertOrganizationShell, - buildAppSnapshot, - buildOrganizationState, - buildOrganizationStateIfInitialized, - requireEligibleOrganization, - requireSignedInSession, -} from "../app-shell.js"; - -export const organizationShellActions = { - async getAppSnapshot(c: any, input: { sessionId: string }): Promise<FoundryAppSnapshot> { - return await buildAppSnapshot(c, input.sessionId); - }, - - async setAppDefaultModel(c: any, input: { sessionId: string; defaultModel: WorkspaceModelId }): Promise<FoundryAppSnapshot> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - await getBetterAuthService().upsertUserProfile(session.authUserId, { - defaultModel: input.defaultModel, - }); - return await buildAppSnapshot(c, input.sessionId); - }, - - async updateAppOrganizationProfile( - c: any, - input: { sessionId: string; organizationId: string } & UpdateFoundryOrganizationProfileInput, - ): Promise<FoundryAppSnapshot> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - const organization = await getOrCreateOrganization(c, input.organizationId); - await organization.updateShellProfile({ - displayName: input.displayName, - slug: input.slug, - primaryDomain: input.primaryDomain, - }); -
return await buildAppSnapshot(c, input.sessionId); - }, - - async getOrganizationShellState(c: any): Promise { - assertOrganizationShell(c); - return await buildOrganizationState(c); - }, - - async getOrganizationShellStateIfInitialized(c: any): Promise { - assertOrganizationShell(c); - return await buildOrganizationStateIfInitialized(c); - }, -}; diff --git a/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts b/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts deleted file mode 100644 index 3affccd..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts +++ /dev/null @@ -1,478 +0,0 @@ -// @ts-nocheck -import { randomUUID } from "node:crypto"; -import { and, desc, eq, isNotNull, ne } from "drizzle-orm"; -import type { - RepoOverview, - SandboxProviderId, - TaskRecord, - TaskSummary, - WorkspacePullRequestSummary, - WorkspaceSessionSummary, - WorkspaceTaskSummary, -} from "@sandbox-agent/foundry-shared"; -import { getActorRuntimeContext } from "../../context.js"; -import { getGithubData, getOrCreateAuditLog, getOrCreateTask, getTask } from "../../handles.js"; -// task actions called directly (no queue) -import { deriveFallbackTitle, resolveCreateFlowDecision } from "../../../services/create-flow.js"; -// actions return directly (no queue response unwrapping) -import { isActorNotFoundError, logActorWarning, resolveErrorMessage } from "../../logging.js"; -import { defaultSandboxProviderId } from "../../../sandbox-config.js"; -import { taskWorkflowQueueName } from "../../task/workflow/queue.js"; -import { expectQueueResponse } from "../../../services/queue.js"; -import { taskIndex, taskSummaries } from "../db/schema.js"; -import { refreshOrganizationSnapshotMutation } from "../actions.js"; - -interface CreateTaskCommand { - repoId: string; - task: string; - sandboxProviderId: SandboxProviderId; - explicitTitle: string | null; - explicitBranchName: string | null; - onBranch: string | 
null; -} - -interface RegisterTaskBranchCommand { - repoId: string; - taskId: string; - branchName: string; -} - -function isStaleTaskReferenceError(error: unknown): boolean { - const message = resolveErrorMessage(error); - return isActorNotFoundError(error) || message.startsWith("Task not found:"); -} - -function parseJsonValue<T>(value: string | null | undefined, fallback: T): T { - if (!value) { - return fallback; - } - - try { - return JSON.parse(value) as T; - } catch { - return fallback; - } -} - -function taskSummaryRowFromSummary(taskSummary: WorkspaceTaskSummary) { - return { - taskId: taskSummary.id, - repoId: taskSummary.repoId, - title: taskSummary.title, - status: taskSummary.status, - repoName: taskSummary.repoName, - updatedAtMs: taskSummary.updatedAtMs, - branch: taskSummary.branch, - pullRequestJson: JSON.stringify(taskSummary.pullRequest), - sessionsSummaryJson: JSON.stringify(taskSummary.sessionsSummary), - primaryUserLogin: taskSummary.primaryUserLogin ?? null, - primaryUserAvatarUrl: taskSummary.primaryUserAvatarUrl ?? null, - }; -} - -export function taskSummaryFromRow(repoId: string, row: any): WorkspaceTaskSummary { - return { - id: row.taskId, - repoId, - title: row.title, - status: row.status, - repoName: row.repoName, - updatedAtMs: row.updatedAtMs, - branch: row.branch ?? null, - pullRequest: parseJsonValue(row.pullRequestJson, null), - sessionsSummary: parseJsonValue(row.sessionsSummaryJson, []), - primaryUserLogin: row.primaryUserLogin ?? null, - primaryUserAvatarUrl: row.primaryUserAvatarUrl ??
null, - }; -} - -export async function upsertTaskSummary(c: any, taskSummary: WorkspaceTaskSummary): Promise<void> { - await c.db - .insert(taskSummaries) - .values(taskSummaryRowFromSummary(taskSummary)) - .onConflictDoUpdate({ - target: taskSummaries.taskId, - set: taskSummaryRowFromSummary(taskSummary), - }) - .run(); -} - -async function deleteStaleTaskIndexRow(c: any, taskId: string): Promise<void> { - try { - await c.db.delete(taskIndex).where(eq(taskIndex.taskId, taskId)).run(); - } catch { - // Best effort cleanup only. - } -} - -async function listKnownTaskBranches(c: any, repoId: string): Promise<string[]> { - const rows = await c.db - .select({ branchName: taskIndex.branchName }) - .from(taskIndex) - .where(and(eq(taskIndex.repoId, repoId), isNotNull(taskIndex.branchName))) - .all(); - return rows.map((row) => row.branchName).filter((value): value is string => typeof value === "string" && value.trim().length > 0); -} - -async function resolveGitHubRepository(c: any, repoId: string) { - const githubData = getGithubData(c, c.state.organizationId); - return await githubData.getRepository({ repoId }).catch(() => null); -} - -async function resolveRepositoryRemoteUrl(c: any, repoId: string): Promise<string> { - const repository = await resolveGitHubRepository(c, repoId); - const remoteUrl = repository?.cloneUrl?.trim(); - if (!remoteUrl) { - throw new Error(`Missing remote URL for repo ${repoId}`); - } - return remoteUrl; -} - -/** - * The ONLY backend code path that creates a task actor via getOrCreateTask. - * Called when a user explicitly creates a new task (not during sync/webhooks). - * - * All other code must use getTask (handles.ts) which calls .get() and will - * error if the actor doesn't exist. Virtual tasks created during PR sync - * are materialized lazily by the client's getOrCreate in backend-client.ts. - * - * NEVER call this from a sync loop or webhook handler.
- */ -export async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskRecord> { - const organizationId = c.state.organizationId; - const repoId = cmd.repoId; - await resolveRepositoryRemoteUrl(c, repoId); - const onBranch = cmd.onBranch?.trim() || null; - const taskId = randomUUID(); - let initialBranchName: string | null = null; - let initialTitle: string | null = null; - - if (onBranch) { - initialBranchName = onBranch; - initialTitle = deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? undefined); - - await registerTaskBranchMutation(c, { - repoId, - taskId, - branchName: onBranch, - }); - } else { - const reservedBranches = await listKnownTaskBranches(c, repoId); - const resolved = resolveCreateFlowDecision({ - task: cmd.task, - explicitTitle: cmd.explicitTitle ?? undefined, - explicitBranchName: cmd.explicitBranchName ?? undefined, - localBranches: [], - taskBranches: reservedBranches, - }); - - initialBranchName = resolved.branchName; - initialTitle = resolved.title; - - const now = Date.now(); - await c.db - .insert(taskIndex) - .values({ - taskId, - repoId, - branchName: resolved.branchName, - createdAt: now, - updatedAt: now, - }) - .onConflictDoNothing() - .run(); - } - - let taskHandle: Awaited<ReturnType<typeof getOrCreateTask>>; - try { - taskHandle = await getOrCreateTask(c, organizationId, repoId, taskId, { - organizationId, - repoId, - taskId, - }); - } catch (error) { - if (initialBranchName) { - await deleteStaleTaskIndexRow(c, taskId); - } - throw error; - } - - const created = expectQueueResponse( - await taskHandle.send( - taskWorkflowQueueName("task.command.initialize"), - { - sandboxProviderId: cmd.sandboxProviderId, - branchName: initialBranchName, - title: initialTitle, - task: cmd.task, - }, - { wait: true, timeout: 10_000 }, - ), - ); - - try { - await upsertTaskSummary(c, await taskHandle.getTaskSummary({})); - await refreshOrganizationSnapshotMutation(c); - } catch (error) { - logActorWarning("organization", "failed seeding task summary after task creation", {
- organizationId, - repoId, - taskId, - error: resolveErrorMessage(error), - }); - } - - const auditLog = await getOrCreateAuditLog(c, organizationId); - void auditLog.append({ - kind: "task.created", - repoId, - taskId, - payload: { - repoId, - sandboxProviderId: cmd.sandboxProviderId, - }, - }); - - try { - const taskSummary = await taskHandle.getTaskSummary({}); - await upsertTaskSummary(c, taskSummary); - } catch (error) { - logActorWarning("organization", "failed seeding organization task projection", { - organizationId, - repoId, - taskId, - error: resolveErrorMessage(error), - }); - } - - return created; -} - -export async function registerTaskBranchMutation(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string }> { - const branchName = cmd.branchName.trim(); - if (!branchName) { - throw new Error("branchName is required"); - } - - const existingOwner = await c.db - .select({ taskId: taskIndex.taskId }) - .from(taskIndex) - .where(and(eq(taskIndex.branchName, branchName), eq(taskIndex.repoId, cmd.repoId), ne(taskIndex.taskId, cmd.taskId))) - .get(); - - if (existingOwner) { - let ownerMissing = false; - try { - await getTask(c, c.state.organizationId, cmd.repoId, existingOwner.taskId).get(); - } catch (error) { - if (isStaleTaskReferenceError(error)) { - ownerMissing = true; - await deleteStaleTaskIndexRow(c, existingOwner.taskId); - } else { - throw error; - } - } - if (!ownerMissing) { - throw new Error(`branch is already assigned to a different task: ${branchName}`); - } - } - - const now = Date.now(); - await c.db - .insert(taskIndex) - .values({ - taskId: cmd.taskId, - repoId: cmd.repoId, - branchName, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: taskIndex.taskId, - set: { - branchName, - updatedAt: now, - }, - }) - .run(); - - return { branchName }; -} - -export async function applyTaskSummaryUpdateMutation(c: any, input: { taskSummary: WorkspaceTaskSummary }): Promise { - await upsertTaskSummary(c, 
input.taskSummary); - await refreshOrganizationSnapshotMutation(c); -} - -export async function removeTaskSummaryMutation(c: any, input: { taskId: string }): Promise { - await c.db.delete(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).run(); - await refreshOrganizationSnapshotMutation(c); -} - -/** - * Called for every changed PR during sync and on webhook PR events. - * Runs in a bulk loop — MUST NOT create task actors or make cross-actor calls - * to task actors. Only writes to the org's local taskIndex/taskSummaries tables. - * Task actors are created lazily when the user views the task. - */ -export async function refreshTaskSummaryForBranchMutation( - c: any, - input: { repoId: string; branchName: string; pullRequest?: WorkspacePullRequestSummary | null; repoName?: string }, -): Promise { - const pullRequest = input.pullRequest ?? null; - let rows = await c.db - .select({ taskId: taskSummaries.taskId }) - .from(taskSummaries) - .where(and(eq(taskSummaries.branch, input.branchName), eq(taskSummaries.repoId, input.repoId))) - .all(); - - if (rows.length === 0 && pullRequest) { - // Create a virtual task entry in the org's local tables only. - // No task actor is spawned — it will be created lazily when the user - // clicks on the task in the sidebar (the "materialize" path). - const taskId = randomUUID(); - const now = Date.now(); - const title = pullRequest.title?.trim() || input.branchName; - const repoName = input.repoName ?? `${c.state.organizationId}/${input.repoId}`; - - await c.db - .insert(taskIndex) - .values({ taskId, repoId: input.repoId, branchName: input.branchName, createdAt: now, updatedAt: now }) - .onConflictDoNothing() - .run(); - - await c.db - .insert(taskSummaries) - .values({ - taskId, - repoId: input.repoId, - title, - status: "init_complete", - repoName, - updatedAtMs: pullRequest.updatedAtMs ?? 
now, - branch: input.branchName, - pullRequestJson: JSON.stringify(pullRequest), - sessionsSummaryJson: "[]", - }) - .onConflictDoNothing() - .run(); - - rows = [{ taskId }]; - } else { - // Update PR data on existing task summaries locally. - // If a real task actor exists, also notify it. - for (const row of rows) { - // Update the local summary with the new PR data - await c.db - .update(taskSummaries) - .set({ - pullRequestJson: pullRequest ? JSON.stringify(pullRequest) : null, - updatedAtMs: pullRequest?.updatedAtMs ?? Date.now(), - }) - .where(eq(taskSummaries.taskId, row.taskId)) - .run(); - - // Best-effort notify the task actor if it exists (fire-and-forget) - try { - const task = getTask(c, c.state.organizationId, input.repoId, row.taskId); - void task.syncPullRequest({ pullRequest }).catch(() => {}); - } catch { - // Task actor doesn't exist yet — that's fine, it's virtual - } - } - } - - await refreshOrganizationSnapshotMutation(c); -} - -export async function listTaskSummariesForRepo(c: any, repoId: string, includeArchived = false): Promise { - const rows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoId)).orderBy(desc(taskSummaries.updatedAtMs)).all(); - return rows - .map((row) => ({ - organizationId: c.state.organizationId, - repoId, - taskId: row.taskId, - branchName: row.branch ?? null, - title: row.title, - status: row.status, - updatedAt: row.updatedAtMs, - pullRequest: parseJsonValue(row.pullRequestJson, null), - })) - .filter((row) => includeArchived || row.status !== "archived"); -} - -export async function listAllTaskSummaries(c: any, includeArchived = false): Promise { - const rows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); - return rows - .map((row) => ({ - organizationId: c.state.organizationId, - repoId: row.repoId, - taskId: row.taskId, - branchName: row.branch ?? 
null, - title: row.title, - status: row.status, - updatedAt: row.updatedAtMs, - pullRequest: parseJsonValue(row.pullRequestJson, null), - })) - .filter((row) => includeArchived || row.status !== "archived"); -} - -export async function listWorkspaceTaskSummaries(c: any): Promise { - const rows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); - return rows.map((row) => taskSummaryFromRow(row.repoId, row)); -} - -export async function getRepoOverviewFromOrg(c: any, repoId: string): Promise { - const now = Date.now(); - const repository = await resolveGitHubRepository(c, repoId); - const remoteUrl = await resolveRepositoryRemoteUrl(c, repoId); - const taskRows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoId)).all(); - - const branches = taskRows - .filter((row: any) => row.branch) - .map((row: any) => { - const pr = parseJsonValue(row.pullRequestJson, null); - return { - branchName: row.branch!, - commitSha: "", - taskId: row.taskId, - taskTitle: row.title ?? null, - taskStatus: row.status ?? null, - pullRequest: pr, - ciStatus: null, - updatedAt: Math.max(row.updatedAtMs ?? 0, pr?.updatedAtMs ?? 0, now), - }; - }) - .sort((a: any, b: any) => b.updatedAt - a.updatedAt); - - return { - organizationId: c.state.organizationId, - repoId, - remoteUrl, - baseRef: repository?.defaultBranch ?? null, - fetchedAt: now, - branches, - }; -} - -export async function getRepositoryMetadataFromOrg( - c: any, - repoId: string, -): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> { - const repository = await resolveGitHubRepository(c, repoId); - const remoteUrl = await resolveRepositoryRemoteUrl(c, repoId); - return { - defaultBranch: repository?.defaultBranch ?? null, - fullName: repository?.fullName ?? 
null, - remoteUrl, - }; -} - -export async function findTaskForBranch(c: any, repoId: string, branchName: string): Promise<{ taskId: string | null }> { - const row = await c.db - .select({ taskId: taskSummaries.taskId }) - .from(taskSummaries) - .where(and(eq(taskSummaries.branch, branchName), eq(taskSummaries.repoId, repoId))) - .get(); - return { taskId: row?.taskId ?? null }; -} diff --git a/foundry/packages/backend/src/actors/organization/actions/tasks.ts b/foundry/packages/backend/src/actors/organization/actions/tasks.ts deleted file mode 100644 index 80bb2f9..0000000 --- a/foundry/packages/backend/src/actors/organization/actions/tasks.ts +++ /dev/null @@ -1,377 +0,0 @@ -// @ts-nocheck -import { desc, eq } from "drizzle-orm"; -import type { - AuditLogEvent, - CreateTaskInput, - HistoryQueryInput, - ListTasksInput, - RepoOverview, - SwitchResult, - TaskRecord, - TaskSummary, - TaskWorkspaceChangeModelInput, - TaskWorkspaceChangeOwnerInput, - TaskWorkspaceCreateTaskInput, - TaskWorkspaceDiffInput, - TaskWorkspaceRenameInput, - TaskWorkspaceRenameSessionInput, - TaskWorkspaceSelectInput, - TaskWorkspaceSetSessionUnreadInput, - TaskWorkspaceSendMessageInput, - TaskWorkspaceSessionInput, - TaskWorkspaceUpdateDraftInput, -} from "@sandbox-agent/foundry-shared"; -import { getActorRuntimeContext } from "../../context.js"; -import { getOrCreateAuditLog, getOrCreateTask, getTask as getTaskHandle } from "../../handles.js"; -import { defaultSandboxProviderId } from "../../../sandbox-config.js"; -import { logActorWarning, resolveErrorMessage } from "../../logging.js"; -import { taskWorkflowQueueName } from "../../task/workflow/queue.js"; -import { expectQueueResponse } from "../../../services/queue.js"; -import { taskIndex, taskSummaries } from "../db/schema.js"; -import { - createTaskMutation, - getRepoOverviewFromOrg, - getRepositoryMetadataFromOrg, - findTaskForBranch, - listTaskSummariesForRepo, - listAllTaskSummaries, -} from "./task-mutations.js"; - -function 
assertOrganization(c: { state: { organizationId: string } }, organizationId: string): void { - if (organizationId !== c.state.organizationId) { - throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} command=${organizationId}`); - } -} - -/** - * Look up the repoId for a task from the local task index. - * Used when callers (e.g. sandbox actor) only have taskId but need repoId - * to construct the task actor key. - */ -async function resolveTaskRepoId(c: any, taskId: string): Promise<string> { - const row = await c.db.select({ repoId: taskIndex.repoId }).from(taskIndex).where(eq(taskIndex.taskId, taskId)).get(); - if (!row) { - throw new Error(`Task ${taskId} not found in task index`); - } - return row.repoId; -} - -/** - * Get or lazily create a task actor for a user-initiated action. - * Uses getOrCreate because the user may be interacting with a virtual task - * (PR-driven) that has no actor yet. The task actor self-initializes in - * getCurrentRecord() from the org's getTaskIndexEntry data. - * - * This is safe because requireWorkspaceTask is only called from user-initiated - * actions (createSession, sendMessage, etc.), never from sync loops. - * See CLAUDE.md "Lazy Task Actor Creation". - */ -async function requireWorkspaceTask(c: any, repoId: string, taskId: string) { - return getOrCreateTask(c, c.state.organizationId, repoId, taskId, { - organizationId: c.state.organizationId, - repoId, - taskId, - }); -} - -interface GetTaskInput { - organizationId: string; - repoId: string; - taskId: string; -} - -interface TaskProxyActionInput extends GetTaskInput { - reason?: string; -} - -interface RepoOverviewInput { - organizationId: string; - repoId: string; -} - -export { createTaskMutation }; - -export const organizationTaskActions = { - async createTask(c: any, input: CreateTaskInput): Promise<TaskRecord> { - assertOrganization(c, input.organizationId); - const { config } = getActorRuntimeContext(); - const sandboxProviderId = input.sandboxProviderId ??
defaultSandboxProviderId(config); - - // Self-call: call the mutation directly since we're inside the org actor - return await createTaskMutation(c, { - repoId: input.repoId, - task: input.task, - sandboxProviderId, - explicitTitle: input.explicitTitle ?? null, - explicitBranchName: input.explicitBranchName ?? null, - onBranch: input.onBranch ?? null, - }); - }, - - async materializeTask(c: any, input: { organizationId: string; repoId: string; virtualTaskId: string }): Promise { - assertOrganization(c, input.organizationId); - const { config } = getActorRuntimeContext(); - // Self-call: call the mutation directly - return await createTaskMutation(c, { - repoId: input.repoId, - task: input.virtualTaskId, - sandboxProviderId: defaultSandboxProviderId(config), - explicitTitle: null, - explicitBranchName: null, - onBranch: null, - }); - }, - - async createWorkspaceTask(c: any, input: TaskWorkspaceCreateTaskInput): Promise<{ taskId: string; sessionId?: string }> { - const created = await organizationTaskActions.createTask(c, { - organizationId: c.state.organizationId, - repoId: input.repoId, - task: input.task, - ...(input.title ? { explicitTitle: input.title } : {}), - ...(input.onBranch ? { onBranch: input.onBranch } : input.branch ? 
{ explicitBranchName: input.branch } : {}), - }); - - const task = await requireWorkspaceTask(c, input.repoId, created.taskId); - void task - .send( - taskWorkflowQueueName("task.command.workspace.create_session_and_send"), - { - model: input.model, - text: input.task, - authSessionId: input.authSessionId, - }, - { wait: false }, - ) - .catch(() => {}); - - return { taskId: created.taskId }; - }, - - async markWorkspaceUnread(c: any, input: TaskWorkspaceSelectInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.markUnread({ authSessionId: input.authSessionId }); - }, - - async renameWorkspaceTask(c: any, input: TaskWorkspaceRenameInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.renameTask({ value: input.value }); - }, - - async createWorkspaceSession(c: any, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - return expectQueueResponse( - await task.send( - taskWorkflowQueueName("task.command.workspace.create_session"), - { - ...(input.model ? { model: input.model } : {}), - ...(input.authSessionId ? 
{ authSessionId: input.authSessionId } : {}), - }, - { wait: true, timeout: 10_000 }, - ), - ); - }, - - async renameWorkspaceSession(c: any, input: TaskWorkspaceRenameSessionInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.renameSession({ sessionId: input.sessionId, title: input.title }); - }, - - async selectWorkspaceSession(c: any, input: TaskWorkspaceSessionInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.selectSession({ sessionId: input.sessionId, authSessionId: input.authSessionId }); - }, - - async setWorkspaceSessionUnread(c: any, input: TaskWorkspaceSetSessionUnreadInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.setSessionUnread({ sessionId: input.sessionId, unread: input.unread, authSessionId: input.authSessionId }); - }, - - async updateWorkspaceDraft(c: any, input: TaskWorkspaceUpdateDraftInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task - .updateDraft({ - sessionId: input.sessionId, - text: input.text, - attachments: input.attachments, - authSessionId: input.authSessionId, - }) - .catch(() => {}); - }, - - async changeWorkspaceModel(c: any, input: TaskWorkspaceChangeModelInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.changeModel({ sessionId: input.sessionId, model: input.model, authSessionId: input.authSessionId }); - }, - - async sendWorkspaceMessage(c: any, input: TaskWorkspaceSendMessageInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task - .send( - taskWorkflowQueueName("task.command.workspace.send_message"), - { - sessionId: input.sessionId, - text: input.text, - attachments: input.attachments, - authSessionId: input.authSessionId, - }, - { wait: false }, - ) - .catch(() => {}); - }, - - async stopWorkspaceSession(c: any, 
input: TaskWorkspaceSessionInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task - .send(taskWorkflowQueueName("task.command.workspace.stop_session"), { sessionId: input.sessionId, authSessionId: input.authSessionId }, { wait: false }) - .catch(() => {}); - }, - - async closeWorkspaceSession(c: any, input: TaskWorkspaceSessionInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task - .send(taskWorkflowQueueName("task.command.workspace.close_session"), { sessionId: input.sessionId, authSessionId: input.authSessionId }, { wait: false }) - .catch(() => {}); - }, - - async publishWorkspacePr(c: any, input: TaskWorkspaceSelectInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task.send(taskWorkflowQueueName("task.command.workspace.publish_pr"), {}, { wait: false }).catch(() => {}); - }, - - async changeWorkspaceTaskOwner(c: any, input: TaskWorkspaceChangeOwnerInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - await task.send( - taskWorkflowQueueName("task.command.workspace.change_owner"), - { - primaryUserId: input.targetUserId, - primaryGithubLogin: input.targetUserName, - primaryGithubEmail: input.targetUserEmail, - primaryGithubAvatarUrl: null, - }, - { wait: false }, - ); - }, - - async revertWorkspaceFile(c: any, input: TaskWorkspaceDiffInput): Promise { - const task = await requireWorkspaceTask(c, input.repoId, input.taskId); - void task.send(taskWorkflowQueueName("task.command.workspace.revert_file"), input, { wait: false }).catch(() => {}); - }, - - async getRepoOverview(c: any, input: RepoOverviewInput): Promise { - assertOrganization(c, input.organizationId); - - return await getRepoOverviewFromOrg(c, input.repoId); - }, - - async listTasks(c: any, input: ListTasksInput): Promise { - assertOrganization(c, input.organizationId); - if (input.repoId) { - return await 
listTaskSummariesForRepo(c, input.repoId, true);
-    }
-    return await listAllTaskSummaries(c, true);
-  },
-
-  async switchTask(c: any, input: { repoId: string; taskId: string }): Promise {
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    const record = await h.get();
-    const switched = expectQueueResponse<{ switchTarget: string | null }>(
-      await h.send(taskWorkflowQueueName("task.command.switch"), {}, { wait: true, timeout: 10_000 }),
-    );
-    return {
-      organizationId: c.state.organizationId,
-      taskId: input.taskId,
-      sandboxProviderId: record.sandboxProviderId,
-      switchTarget: switched.switchTarget,
-    };
-  },
-
-  async auditLog(c: any, input: HistoryQueryInput): Promise {
-    assertOrganization(c, input.organizationId);
-    const auditLog = await getOrCreateAuditLog(c, c.state.organizationId);
-    return await auditLog.list({
-      repoId: input.repoId,
-      branch: input.branch,
-      taskId: input.taskId,
-      limit: input.limit ?? 20,
-    });
-  },
-
-  async getTask(c: any, input: GetTaskInput): Promise {
-    assertOrganization(c, input.organizationId);
-    // Resolve repoId from local task index if not provided (e.g. sandbox actor only has taskId)
-    const repoId = input.repoId || (await resolveTaskRepoId(c, input.taskId));
-    // Use getOrCreate — the task may be virtual (PR-driven, no actor yet).
-    // The task actor self-initializes in getCurrentRecord().
-    const handle = await getOrCreateTask(c, c.state.organizationId, repoId, input.taskId, {
-      organizationId: c.state.organizationId,
-      repoId,
-      taskId: input.taskId,
-    });
-    return await handle.get();
-  },
-
-  async attachTask(c: any, input: TaskProxyActionInput): Promise<{ target: string; sessionId: string | null }> {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    return expectQueueResponse(await h.send(taskWorkflowQueueName("task.command.attach"), { reason: input.reason }, { wait: true, timeout: 10_000 }));
-  },
-
-  async pushTask(c: any, input: TaskProxyActionInput): Promise {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    void h.send(taskWorkflowQueueName("task.command.push"), { reason: input.reason }, { wait: false }).catch(() => {});
-  },
-
-  async syncTask(c: any, input: TaskProxyActionInput): Promise {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    void h.send(taskWorkflowQueueName("task.command.sync"), { reason: input.reason }, { wait: false }).catch(() => {});
-  },
-
-  async mergeTask(c: any, input: TaskProxyActionInput): Promise {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    void h.send(taskWorkflowQueueName("task.command.merge"), { reason: input.reason }, { wait: false }).catch(() => {});
-  },
-
-  async archiveTask(c: any, input: TaskProxyActionInput): Promise {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    void h.send(taskWorkflowQueueName("task.command.archive"), { reason: input.reason }, { wait: false }).catch(() => {});
-  },
-
-  async killTask(c: any, input: TaskProxyActionInput): Promise {
-    assertOrganization(c, input.organizationId);
-
-    const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId);
-    void h.send(taskWorkflowQueueName("task.command.kill"), { reason: input.reason }, { wait: false }).catch(() => {});
-  },
-
-  async getRepositoryMetadata(c: any, input: { repoId: string }): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> {
-    return await getRepositoryMetadataFromOrg(c, input.repoId);
-  },
-
-  async findTaskForBranch(c: any, input: { repoId: string; branchName: string }): Promise<{ taskId: string | null }> {
-    return await findTaskForBranch(c, input.repoId, input.branchName);
-  },
-
-  /**
-   * Lightweight read of task index + summary data. Used by the task actor
-   * to self-initialize when lazily materialized from a virtual task.
-   * Does NOT trigger materialization — no circular dependency.
-   */
-  async getTaskIndexEntry(c: any, input: { taskId: string }): Promise<{ branchName: string | null; title: string | null } | null> {
-    const idx = await c.db.select({ branchName: taskIndex.branchName }).from(taskIndex).where(eq(taskIndex.taskId, input.taskId)).get();
-    const summary = await c.db.select({ title: taskSummaries.title }).from(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).get();
-    if (!idx && !summary) return null;
-    return {
-      branchName: idx?.branchName ?? null,
-      title: summary?.title ?? null,
-    };
-  },
-};
diff --git a/foundry/packages/backend/src/actors/organization/app-shell.ts b/foundry/packages/backend/src/actors/organization/app-shell.ts
deleted file mode 100644
index ed1005a..0000000
--- a/foundry/packages/backend/src/actors/organization/app-shell.ts
+++ /dev/null
@@ -1,1455 +0,0 @@
-import { desc, eq } from "drizzle-orm";
-import { randomUUID } from "node:crypto";
-import type {
-  FoundryAppSnapshot,
-  FoundryBillingPlanId,
-  FoundryBillingState,
-  FoundryOrganization,
-  FoundryOrganizationMember,
-  FoundryUser,
-  UpdateFoundryOrganizationProfileInput,
-  WorkspaceModelId,
-} from "@sandbox-agent/foundry-shared";
-import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared";
-import { getActorRuntimeContext } from "../context.js";
-import { getOrCreateGithubData, getOrCreateOrganization, selfOrganization } from "../handles.js";
-import { GitHubAppError } from "../../services/app-github.js";
-import { getBetterAuthService } from "../../services/better-auth.js";
-import { repoLabelFromRemote } from "../../services/repo.js";
-import { logger } from "../../logging.js";
-import { githubDataWorkflowQueueName } from "../github-data/index.js";
-import { organizationWorkflowQueueName } from "./queues.js";
-import { invoices, organizationMembers, organizationProfile, seatAssignments, stripeLookup } from "./db/schema.js";
-import { APP_SHELL_ORGANIZATION_ID } from "./constants.js";
-
-const githubWebhookLogger = logger.child({
-  scope: "github-webhook",
-});
-
-const PROFILE_ROW_ID = 1;
-
-function roundDurationMs(start: number): number {
-  return Math.round((performance.now() - start) * 100) / 100;
-}
-
-export function assertAppOrganization(c: any): void {
-  if (c.state.organizationId !== APP_SHELL_ORGANIZATION_ID) {
-    throw new Error(`App shell action requires organization ${APP_SHELL_ORGANIZATION_ID}, got ${c.state.organizationId}`);
-  }
-}
-
-export function assertOrganizationShell(c: any): void {
-  if (c.state.organizationId === APP_SHELL_ORGANIZATION_ID) {
-    throw new Error("Organization action cannot run on the reserved app organization");
-  }
-}
-
-function slugify(value: string): string {
-  return value
-    .trim()
-    .toLowerCase()
-    .replace(/[^a-z0-9]+/g, "-")
-    .replace(/^-+|-+$/g, "");
-}
-
-function personalOrganizationId(login: string): string {
-  return `personal-${slugify(login)}`;
-}
-
-function organizationOrganizationId(kind: FoundryOrganization["kind"], login: string): string {
-  return kind === "personal" ? personalOrganizationId(login) : slugify(login);
-}
-
-function parseEligibleOrganizationIds(value: string): string[] {
-  try {
-    const parsed = JSON.parse(value);
-    if (!Array.isArray(parsed)) {
-      return [];
-    }
-    return parsed.filter((entry): entry is string => typeof entry === "string" && entry.length > 0);
-  } catch {
-    return [];
-  }
-}
-
-function encodeEligibleOrganizationIds(value: string[]): string {
-  return JSON.stringify([...new Set(value)]);
-}
-
-function errorMessage(error: unknown): string {
-  return error instanceof Error ? error.message : String(error);
-}
-
-function seatsIncludedForPlan(planId: FoundryBillingPlanId): number {
-  switch (planId) {
-    case "free":
-      return 1;
-    case "team":
-      return 5;
-  }
-}
-
-function stripeStatusToBillingStatus(stripeStatus: string, cancelAtPeriodEnd: boolean): FoundryBillingState["status"] {
-  if (cancelAtPeriodEnd) {
-    return "scheduled_cancel";
-  }
-  if (stripeStatus === "trialing") {
-    return "trialing";
-  }
-  if (stripeStatus === "past_due" || stripeStatus === "unpaid" || stripeStatus === "incomplete") {
-    return "past_due";
-  }
-  return "active";
-}
-
-function formatUnixDate(value: number): string {
-  return new Date(value * 1000).toISOString().slice(0, 10);
-}
-
-function legacyRepoImportStatusToGithubSyncStatus(value: string | null | undefined): FoundryOrganization["github"]["syncStatus"] {
-  switch (value) {
-    case "ready":
-      return "synced";
-    case "importing":
-      return "syncing";
-    default:
-      return "pending";
-  }
-}
-
-function stringFromMetadata(metadata: unknown, key: string): string | null {
-  if (!metadata || typeof metadata !== "object") {
-    return null;
-  }
-  const value = (metadata as Record)[key];
-  return typeof value === "string" && value.length > 0 ? value : null;
-}
-
-function stripeWebhookSubscription(event: any) {
-  const object = event.data.object as Record;
-  const items = (object.items as { data?: Array> } | undefined)?.data ?? [];
-  const price = items[0]?.price as Record | undefined;
-  return {
-    id: typeof object.id === "string" ? object.id : "",
-    customerId: typeof object.customer === "string" ? object.customer : "",
-    priceId: typeof price?.id === "string" ? price.id : null,
-    status: typeof object.status === "string" ? object.status : "active",
-    cancelAtPeriodEnd: object.cancel_at_period_end === true,
-    currentPeriodEnd: typeof object.current_period_end === "number" ? object.current_period_end : null,
-    trialEnd: typeof object.trial_end === "number" ?
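The deleted `slugify` and organization-id helpers in this file are pure string functions. A standalone sketch of them (the logic mirrors the removed code; the `kind` union is narrowed here because `FoundryOrganization["kind"]` is not available outside the repo):

```typescript
// Collapse a GitHub login or org name into a URL-safe slug, then derive the
// organization id from it, as in the deleted app-shell helpers.
function slugify(value: string): string {
  return value
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")   // non-alphanumeric runs become single dashes
    .replace(/^-+|-+$/g, "");      // strip leading/trailing dashes
}

function personalOrganizationId(login: string): string {
  return `personal-${slugify(login)}`;
}

function organizationOrganizationId(kind: "personal" | "organization", login: string): string {
  return kind === "personal" ? personalOrganizationId(login) : slugify(login);
}
```

So a personal account maps to a `personal-` prefixed slug, while an organization account maps to the bare slug of its login.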
object.trial_end : null, - defaultPaymentMethodLabel: "Payment method on file", - }; -} - -// sendOrganizationCommand removed — org actions called directly - -export async function getOrganizationState(organization: any) { - return await organization.getOrganizationShellState({}); -} - -async function getOrganizationStateIfInitialized(organization: any) { - return await organization.getOrganizationShellStateIfInitialized({}); -} - -async function listSnapshotOrganizations(c: any, sessionId: string, organizationIds: string[]) { - const results = await Promise.all( - organizationIds.map(async (organizationId) => { - const organizationStartedAt = performance.now(); - try { - const organization = await getOrCreateOrganization(c, organizationId); - const organizationState = await getOrganizationStateIfInitialized(organization); - if (!organizationState) { - logger.warn( - { - sessionId, - actorOrganizationId: c.state.organizationId, - organizationId, - durationMs: roundDurationMs(organizationStartedAt), - }, - "build_app_snapshot_organization_uninitialized", - ); - return { organizationId, snapshot: null, status: "uninitialized" as const }; - } - logger.info( - { - sessionId, - actorOrganizationId: c.state.organizationId, - organizationId, - durationMs: roundDurationMs(organizationStartedAt), - }, - "build_app_snapshot_organization_completed", - ); - return { organizationId, snapshot: organizationState.snapshot, status: "ok" as const }; - } catch (error) { - const message = errorMessage(error); - if (!message.includes("Actor not found")) { - logger.error( - { - sessionId, - actorOrganizationId: c.state.organizationId, - organizationId, - durationMs: roundDurationMs(organizationStartedAt), - errorMessage: message, - errorStack: error instanceof Error ? 
error.stack : undefined, - }, - "build_app_snapshot_organization_failed", - ); - throw error; - } - logger.info( - { - sessionId, - actorOrganizationId: c.state.organizationId, - organizationId, - durationMs: roundDurationMs(organizationStartedAt), - }, - "build_app_snapshot_organization_missing", - ); - return { organizationId, snapshot: null, status: "missing" as const }; - } - }), - ); - - return { - organizations: results.map((result) => result.snapshot).filter((organization): organization is FoundryOrganization => organization !== null), - uninitializedOrganizationIds: results.filter((result) => result.status === "uninitialized").map((result) => result.organizationId), - }; -} - -export async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepair = true): Promise { - assertAppOrganization(c); - const startedAt = performance.now(); - const auth = getBetterAuthService(); - let authState = await auth.getAuthState(sessionId); - // Inline fallback: if the user is signed in but has no eligible organizations yet - // (e.g. first load after OAuth callback), sync GitHub orgs before building the snapshot. - if (authState?.user && parseEligibleOrganizationIds(authState.profile?.eligibleOrganizationIdsJson ?? "[]").length === 0) { - const token = await auth.getAccessTokenForSession(sessionId); - if (token?.accessToken) { - logger.info({ sessionId }, "build_app_snapshot_sync_orgs"); - await syncGithubOrganizations(c, { sessionId, accessToken: token.accessToken }); - authState = await auth.getAuthState(sessionId); - } else { - logger.warn({ sessionId }, "build_app_snapshot_no_access_token"); - } - } - - const session = authState?.session ?? null; - const user = authState?.user ?? null; - const profile = authState?.profile ?? null; - const currentSessionState = authState?.sessionState ?? null; - const githubAccount = authState?.accounts?.find((account: any) => account.providerId === "github") ?? 
null; - const eligibleOrganizationIds = parseEligibleOrganizationIds(profile?.eligibleOrganizationIdsJson ?? "[]"); - - logger.info( - { - sessionId, - organizationId: c.state.organizationId, - eligibleOrganizationCount: eligibleOrganizationIds.length, - eligibleOrganizationIds, - }, - "build_app_snapshot_started", - ); - - let { organizations, uninitializedOrganizationIds } = await listSnapshotOrganizations(c, sessionId, eligibleOrganizationIds); - - if (allowOrganizationRepair && uninitializedOrganizationIds.length > 0) { - const token = await auth.getAccessTokenForSession(sessionId); - if (token?.accessToken) { - logger.info( - { - sessionId, - organizationId: c.state.organizationId, - organizationIds: uninitializedOrganizationIds, - }, - "build_app_snapshot_repairing_organizations", - ); - await syncGithubOrganizationsInternal(c, { sessionId, accessToken: token.accessToken }, { broadcast: false }); - return await buildAppSnapshot(c, sessionId, false); - } - logger.warn( - { - sessionId, - organizationId: c.state.organizationId, - organizationIds: uninitializedOrganizationIds, - }, - "build_app_snapshot_repair_skipped_no_access_token", - ); - } - - const currentUser: FoundryUser | null = user - ? { - id: profile?.githubAccountId ?? githubAccount?.accountId ?? user.id, - name: user.name, - email: user.email, - githubLogin: profile?.githubLogin ?? "", - roleLabel: profile?.roleLabel ?? "GitHub user", - eligibleOrganizationIds, - defaultModel: profile?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID, - } - : null; - - const activeOrganizationId = - currentUser && - currentSessionState?.activeOrganizationId && - organizations.some((organization) => organization.id === currentSessionState.activeOrganizationId) - ? currentSessionState.activeOrganizationId - : currentUser && organizations.length === 1 - ? (organizations[0]?.id ?? null) - : null; - - const snapshot: FoundryAppSnapshot = { - auth: { - status: currentUser ? 
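The `activeOrganizationId` selection above can be isolated into a small pure function. This is a simplified sketch: the signed-in (`currentUser`) guard is dropped and organizations are reduced to their ids; the function name is hypothetical:

```typescript
// Keep the session's active organization only if it is still among the
// user's organizations; otherwise fall back to the sole organization, else none.
function resolveActiveOrganizationId(
  sessionActiveId: string | null,
  organizationIds: string[],
): string | null {
  if (sessionActiveId && organizationIds.includes(sessionActiveId)) {
    return sessionActiveId;
  }
  return organizationIds.length === 1 ? (organizationIds[0] ?? null) : null;
}
```

The effect is that a stale active-organization pointer (for example after losing access to an org) silently degrades to "no active organization" rather than an invalid id.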
"signed_in" : "signed_out", - currentUserId: currentUser?.id ?? null, - }, - activeOrganizationId, - onboarding: { - starterRepo: { - repoFullName: "rivet-dev/sandbox-agent", - repoUrl: "https://github.com/rivet-dev/sandbox-agent", - status: profile?.starterRepoStatus ?? "pending", - starredAt: profile?.starterRepoStarredAt ?? null, - skippedAt: profile?.starterRepoSkippedAt ?? null, - }, - }, - users: currentUser ? [currentUser] : [], - organizations, - }; - - logger.info( - { - sessionId, - organizationId: c.state.organizationId, - eligibleOrganizationCount: eligibleOrganizationIds.length, - organizationCount: organizations.length, - durationMs: roundDurationMs(startedAt), - }, - "build_app_snapshot_completed", - ); - - return snapshot; -} - -export async function requireSignedInSession(c: any, sessionId: string) { - const auth = getBetterAuthService(); - const authState = await auth.getAuthState(sessionId); - const user = authState?.user ?? null; - const profile = authState?.profile ?? null; - const githubAccount = authState?.accounts?.find((account: any) => account.providerId === "github") ?? null; - if (!authState?.session || !user?.email) { - throw new Error("User must be signed in"); - } - const token = await auth.getAccessTokenForSession(sessionId); - return { - ...authState.session, - authUserId: user.id, - currentUserId: profile?.githubAccountId ?? githubAccount?.accountId ?? user.id, - currentUserName: user.name, - currentUserEmail: user.email, - currentUserGithubLogin: profile?.githubLogin ?? "", - currentUserRoleLabel: profile?.roleLabel ?? "GitHub user", - eligibleOrganizationIdsJson: profile?.eligibleOrganizationIdsJson ?? "[]", - githubAccessToken: token?.accessToken ?? null, - githubScope: (token?.scopes ?? []).join(","), - starterRepoStatus: profile?.starterRepoStatus ?? "pending", - starterRepoStarredAt: profile?.starterRepoStarredAt ?? null, - starterRepoSkippedAt: profile?.starterRepoSkippedAt ?? 
null, - }; -} - -export function requireEligibleOrganization(session: any, organizationId: string): void { - const eligibleOrganizationIds = parseEligibleOrganizationIds(session.eligibleOrganizationIdsJson); - if (!eligibleOrganizationIds.includes(organizationId)) { - throw new Error(`Organization ${organizationId} is not available in this app session`); - } -} - -async function upsertStripeLookupEntries(c: any, organizationId: string, customerId: string | null, subscriptionId: string | null): Promise { - assertAppOrganization(c); - const now = Date.now(); - for (const lookupKey of [customerId ? `customer:${customerId}` : null, subscriptionId ? `subscription:${subscriptionId}` : null]) { - if (!lookupKey) { - continue; - } - await c.db - .insert(stripeLookup) - .values({ - lookupKey, - organizationId, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: stripeLookup.lookupKey, - set: { - organizationId, - updatedAt: now, - }, - }) - .run(); - } -} - -async function findOrganizationIdForStripeEvent(c: any, customerId: string | null, subscriptionId: string | null): Promise { - assertAppOrganization(c); - const customerLookup = customerId - ? await c.db - .select({ organizationId: stripeLookup.organizationId }) - .from(stripeLookup) - .where(eq(stripeLookup.lookupKey, `customer:${customerId}`)) - .get() - : null; - if (customerLookup?.organizationId) { - return customerLookup.organizationId; - } - - const subscriptionLookup = subscriptionId - ? await c.db - .select({ organizationId: stripeLookup.organizationId }) - .from(stripeLookup) - .where(eq(stripeLookup.lookupKey, `subscription:${subscriptionId}`)) - .get() - : null; - return subscriptionLookup?.organizationId ?? 
null; -} - -async function safeListOrganizations(accessToken: string): Promise { - const { appShell } = getActorRuntimeContext(); - try { - return await appShell.github.listOrganizations(accessToken); - } catch (error) { - if (error instanceof GitHubAppError && error.status === 403) { - return []; - } - throw error; - } -} - -async function safeListInstallations(accessToken: string): Promise { - const { appShell } = getActorRuntimeContext(); - try { - return await appShell.github.listInstallations(accessToken); - } catch (error) { - if (error instanceof GitHubAppError && (error.status === 403 || error.status === 404)) { - return []; - } - throw error; - } -} - -/** - * Slow path: list GitHub orgs + installations, sync each org organization, - * and update the session's eligible organization list. Called from the - * workflow queue so it runs in the background after the callback has - * already returned a redirect to the browser. - */ -export async function syncGithubOrganizations(c: any, input: { sessionId: string; accessToken: string }): Promise { - await syncGithubOrganizationsInternal(c, input, { broadcast: true }); -} - -async function syncGithubOrganizationsInternal(c: any, input: { sessionId: string; accessToken: string }, options: { broadcast: boolean }): Promise { - assertAppOrganization(c); - const auth = getBetterAuthService(); - const { appShell } = getActorRuntimeContext(); - const { sessionId, accessToken } = input; - const authState = await auth.getAuthState(sessionId); - if (!authState?.user) { - throw new Error("User must be signed in"); - } - const viewer = await appShell.github.getViewer(accessToken); - const organizations = await safeListOrganizations(accessToken); - const installations = await safeListInstallations(accessToken); - const authUserId = authState.user.id; - const githubUserId = String(viewer.id); - - const linkedOrganizationIds: string[] = []; - const accounts = [ - { - githubAccountId: viewer.id, - githubLogin: viewer.login, - 
githubAccountType: "User", - kind: "personal" as const, - displayName: viewer.name || viewer.login, - }, - ...organizations.map((organization) => ({ - githubAccountId: organization.id, - githubLogin: organization.login, - githubAccountType: "Organization", - kind: "organization" as const, - displayName: organization.name || organization.login, - })), - ]; - - for (const account of accounts) { - const organizationId = organizationOrganizationId(account.kind, account.githubLogin); - const installation = installations.find((candidate) => candidate.accountLogin === account.githubLogin) ?? null; - const organization = await getOrCreateOrganization(c, organizationId); - await organization.send( - organizationWorkflowQueueName("organization.command.github.organization_shell.sync_from_github"), - { - userId: githubUserId, - userName: viewer.name || viewer.login, - userEmail: viewer.email ?? `${viewer.login}@users.noreply.github.com`, - githubUserLogin: viewer.login, - githubAccountId: account.githubAccountId, - githubLogin: account.githubLogin, - githubAccountType: account.githubAccountType, - kind: account.kind, - displayName: account.displayName, - installationId: installation?.id ?? null, - appConfigured: appShell.github.isAppConfigured(), - }, - { wait: true, timeout: 10_000 }, - ); - linkedOrganizationIds.push(organizationId); - } - - const activeOrganizationId = - authState.sessionState?.activeOrganizationId && linkedOrganizationIds.includes(authState.sessionState.activeOrganizationId) - ? authState.sessionState.activeOrganizationId - : linkedOrganizationIds.length === 1 - ? (linkedOrganizationIds[0] ?? 
null) - : null; - - await auth.setActiveOrganization(sessionId, activeOrganizationId); - await auth.upsertUserProfile(authUserId, { - githubAccountId: String(viewer.id), - githubLogin: viewer.login, - roleLabel: "GitHub user", - eligibleOrganizationIdsJson: encodeEligibleOrganizationIds(linkedOrganizationIds), - }); - if (!options.broadcast) { - return; - } - c.broadcast("appUpdated", { - type: "appUpdated", - snapshot: await buildAppSnapshot(c, sessionId), - }); -} - -async function readOrganizationProfileRow(c: any) { - assertOrganizationShell(c); - return await c.db.select().from(organizationProfile).where(eq(organizationProfile.id, PROFILE_ROW_ID)).get(); -} - -async function requireOrganizationProfileRow(c: any) { - const row = await readOrganizationProfileRow(c); - if (!row) { - throw new Error(`Organization profile is not initialized for organization ${c.state.organizationId}`); - } - return row; -} - -async function listOrganizationMembers(c: any): Promise { - assertOrganizationShell(c); - const rows = await c.db.select().from(organizationMembers).orderBy(organizationMembers.role, organizationMembers.name).all(); - return rows.map((row) => ({ - id: row.id, - name: row.name, - email: row.email, - role: row.role, - state: row.state, - })); -} - -async function listOrganizationSeatAssignments(c: any): Promise { - assertOrganizationShell(c); - const rows = await c.db.select({ email: seatAssignments.email }).from(seatAssignments).orderBy(seatAssignments.email).all(); - return rows.map((row) => row.email); -} - -async function listOrganizationInvoices(c: any): Promise { - assertOrganizationShell(c); - const rows = await c.db.select().from(invoices).orderBy(desc(invoices.issuedAt), desc(invoices.createdAt)).all(); - return rows.map((row) => ({ - id: row.id, - label: row.label, - issuedAt: row.issuedAt, - amountUsd: row.amountUsd, - status: row.status, - })); -} - -async function listOrganizationRepoCatalog(c: any): Promise { - assertOrganizationShell(c); - try { - 
const githubData = await getOrCreateGithubData(c, c.state.organizationId); - const rows = await githubData.listRepositories({}); - return rows.map((row: any) => repoLabelFromRemote(row.cloneUrl)).sort((a: string, b: string) => a.localeCompare(b)); - } catch { - return []; - } -} - -export async function buildOrganizationState(c: any) { - const startedAt = performance.now(); - const row = await requireOrganizationProfileRow(c); - return await buildOrganizationStateFromRow(c, row, startedAt); -} - -export async function buildOrganizationStateIfInitialized(c: any) { - const startedAt = performance.now(); - const row = await readOrganizationProfileRow(c); - if (!row) { - return null; - } - return await buildOrganizationStateFromRow(c, row, startedAt); -} - -async function buildOrganizationStateFromRow(c: any, row: any, startedAt: number) { - const repoCatalog = await listOrganizationRepoCatalog(c); - const members = await listOrganizationMembers(c); - const seatAssignmentEmails = await listOrganizationSeatAssignments(c); - const invoiceRows = await listOrganizationInvoices(c); - - const state = { - id: c.state.organizationId, - organizationId: c.state.organizationId, - kind: row.kind, - githubLogin: row.githubLogin, - githubInstallationId: row.githubInstallationId ?? null, - stripeCustomerId: row.stripeCustomerId ?? null, - stripeSubscriptionId: row.stripeSubscriptionId ?? null, - stripePriceId: row.stripePriceId ?? null, - billingPlanId: row.billingPlanId, - snapshot: { - id: c.state.organizationId, - organizationId: c.state.organizationId, - kind: row.kind, - settings: { - displayName: row.displayName, - slug: row.slug, - primaryDomain: row.primaryDomain, - seatAccrualMode: "first_prompt", - autoImportRepos: row.autoImportRepos === 1, - }, - github: { - connectedAccount: row.githubConnectedAccount, - installationStatus: row.githubInstallationStatus, - syncStatus: row.githubSyncStatus ?? 
legacyRepoImportStatusToGithubSyncStatus(row.repoImportStatus), - importedRepoCount: repoCatalog.length, - lastSyncLabel: row.githubLastSyncLabel, - lastSyncAt: row.githubLastSyncAt ?? null, - lastWebhookAt: row.githubLastWebhookAt ?? null, - lastWebhookEvent: row.githubLastWebhookEvent ?? "", - syncGeneration: row.githubSyncGeneration ?? 0, - syncPhase: row.githubSyncPhase ?? null, - processedRepositoryCount: row.githubProcessedRepositoryCount ?? 0, - totalRepositoryCount: row.githubTotalRepositoryCount ?? 0, - }, - billing: { - planId: row.billingPlanId, - status: row.billingStatus, - seatsIncluded: row.billingSeatsIncluded, - trialEndsAt: row.billingTrialEndsAt, - renewalAt: row.billingRenewalAt, - stripeCustomerId: row.stripeCustomerId ?? "", - paymentMethodLabel: row.billingPaymentMethodLabel, - invoices: invoiceRows, - }, - members, - seatAssignments: seatAssignmentEmails, - repoCatalog, - }, - }; - - logger.info( - { - organizationId: c.state.organizationId, - githubLogin: row.githubLogin, - repoCount: repoCatalog.length, - memberCount: members.length, - seatAssignmentCount: seatAssignmentEmails.length, - invoiceCount: invoiceRows.length, - durationMs: roundDurationMs(startedAt), - }, - "build_organization_state_completed", - ); - - return state; -} - -async function applySubscriptionState( - organization: any, - subscription: { - id: string; - customerId: string; - priceId: string | null; - status: string; - cancelAtPeriodEnd: boolean; - currentPeriodEnd: number | null; - trialEnd: number | null; - defaultPaymentMethodLabel: string; - }, - fallbackPlanId: FoundryBillingPlanId, -): Promise { - await organization.send( - organizationWorkflowQueueName("organization.command.billing.stripe_subscription.apply"), - { subscription, fallbackPlanId }, - { wait: true, timeout: 10_000 }, - ); -} - -export const organizationAppActions = { - async createAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; planId: FoundryBillingPlanId }): 
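The `billing.status` assembled in this snapshot ultimately comes from the `stripeStatusToBillingStatus` helper defined earlier in the deleted file. Its mapping, reproduced standalone (the `BillingStatus` union is an assumption here; the real union lives on `FoundryBillingState["status"]`):

```typescript
type BillingStatus = "active" | "trialing" | "past_due" | "scheduled_cancel";

// Same precedence as the deleted helper: a pending cancellation wins, then
// trialing, then the delinquent Stripe states; everything else reads as active.
function stripeStatusToBillingStatus(stripeStatus: string, cancelAtPeriodEnd: boolean): BillingStatus {
  if (cancelAtPeriodEnd) return "scheduled_cancel";
  if (stripeStatus === "trialing") return "trialing";
  if (stripeStatus === "past_due" || stripeStatus === "unpaid" || stripeStatus === "incomplete") return "past_due";
  return "active";
}
```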
Promise<{ url: string }> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - const { appShell } = getActorRuntimeContext(); - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - const organizationState = await getOrganizationState(organizationHandle); - - if (input.planId === "free") { - await organizationHandle.send( - organizationWorkflowQueueName("organization.command.billing.free_plan.apply"), - { clearSubscription: false }, - { wait: true, timeout: 10_000 }, - ); - return { - url: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, - }; - } - - if (!appShell.stripe.isConfigured()) { - throw new Error("Stripe is not configured"); - } - - let customerId = organizationState.stripeCustomerId; - if (!customerId) { - customerId = ( - await appShell.stripe.createCustomer({ - organizationId: input.organizationId, - displayName: organizationState.snapshot.settings.displayName, - email: session.currentUserEmail, - }) - ).id; - await organizationHandle.send( - organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"), - { customerId }, - { wait: true, timeout: 10_000 }, - ); - await upsertStripeLookupEntries(c, input.organizationId, customerId, null); - } - - return { - url: await appShell.stripe - .createCheckoutSession({ - organizationId: input.organizationId, - customerId, - customerEmail: session.currentUserEmail, - planId: input.planId, - successUrl: `${appShell.apiUrl}/v1/billing/checkout/complete?organizationId=${encodeURIComponent( - input.organizationId, - )}&session_id={CHECKOUT_SESSION_ID}`, - cancelUrl: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, - }) - .then((checkout) => checkout.url), - }; - }, - - async finalizeAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; checkoutSessionId: string }): Promise<{ redirectTo: string }> 
{ - assertAppOrganization(c); - const { appShell } = getActorRuntimeContext(); - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - const organizationState = await getOrganizationState(organizationHandle); - const completion = await appShell.stripe.retrieveCheckoutCompletion(input.checkoutSessionId); - - if (completion.customerId) { - await organizationHandle.send( - organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"), - { customerId: completion.customerId }, - { wait: true, timeout: 10_000 }, - ); - } - await upsertStripeLookupEntries(c, input.organizationId, completion.customerId, completion.subscriptionId); - - if (completion.subscriptionId) { - const subscription = await appShell.stripe.retrieveSubscription(completion.subscriptionId); - await applySubscriptionState(organizationHandle, subscription, completion.planId ?? organizationState.billingPlanId); - } - - if (completion.paymentMethodLabel) { - await organizationHandle.send( - organizationWorkflowQueueName("organization.command.billing.payment_method.set"), - { label: completion.paymentMethodLabel }, - { wait: true, timeout: 10_000 }, - ); - } - - return { - redirectTo: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, - }; - }, - - async createAppBillingPortalSession(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - const { appShell } = getActorRuntimeContext(); - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - const organizationState = await getOrganizationState(organizationHandle); - if (!organizationState.stripeCustomerId) { - throw new Error("Stripe customer is not available for this organization"); - } - const portal = await appShell.stripe.createPortalSession({ - customerId: 
organizationState.stripeCustomerId, - returnUrl: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, - }); - return { url: portal.url }; - }, - - async cancelAppScheduledRenewal(c: any, input: { sessionId: string; organizationId: string }): Promise { - assertAppOrganization(c); - const session = await requireSignedInSession(c, input.sessionId); - requireEligibleOrganization(session, input.organizationId); - const { appShell } = getActorRuntimeContext(); - const organizationHandle = await getOrCreateOrganization(c, input.organizationId); - const organizationState = await getOrganizationState(organizationHandle); - - if (organizationState.stripeSubscriptionId && appShell.stripe.isConfigured()) { - const subscription = await appShell.stripe.updateSubscriptionCancellation(organizationState.stripeSubscriptionId, true); - await applySubscriptionState(organizationHandle, subscription, organizationState.billingPlanId); - await upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? 
organizationState.stripeCustomerId, subscription.id);
-    } else {
-      await organizationHandle.send(
-        organizationWorkflowQueueName("organization.command.billing.status.set"),
-        { status: "scheduled_cancel" },
-        { wait: true, timeout: 10_000 },
-      );
-    }
-
-    return await buildAppSnapshot(c, input.sessionId);
-  },
-
-  async resumeAppSubscription(c: any, input: { sessionId: string; organizationId: string }): Promise {
-    assertAppOrganization(c);
-    const session = await requireSignedInSession(c, input.sessionId);
-    requireEligibleOrganization(session, input.organizationId);
-    const { appShell } = getActorRuntimeContext();
-    const organizationHandle = await getOrCreateOrganization(c, input.organizationId);
-    const organizationState = await getOrganizationState(organizationHandle);
-
-    if (organizationState.stripeSubscriptionId && appShell.stripe.isConfigured()) {
-      const subscription = await appShell.stripe.updateSubscriptionCancellation(organizationState.stripeSubscriptionId, false);
-      await applySubscriptionState(organizationHandle, subscription, organizationState.billingPlanId);
-      await upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? organizationState.stripeCustomerId, subscription.id);
-    } else {
-      await organizationHandle.send(
-        organizationWorkflowQueueName("organization.command.billing.status.set"),
-        { status: "active" },
-        { wait: true, timeout: 10_000 },
-      );
-    }
-
-    return await buildAppSnapshot(c, input.sessionId);
-  },
-
-  async recordAppSeatUsage(c: any, input: { sessionId: string; organizationId: string }): Promise {
-    assertAppOrganization(c);
-    const session = await requireSignedInSession(c, input.sessionId);
-    requireEligibleOrganization(session, input.organizationId);
-    const organization = await getOrCreateOrganization(c, input.organizationId);
-    await organization.send(
-      organizationWorkflowQueueName("organization.command.billing.seat_usage.record"),
-      { email: session.currentUserEmail },
-      { wait: true, timeout: 10_000 },
-    );
-    return await buildAppSnapshot(c, input.sessionId);
-  },
-
-  async handleAppStripeWebhook(c: any, input: { payload: string; signatureHeader: string | null }): Promise<{ ok: true }> {
-    assertAppOrganization(c);
-    const { appShell } = getActorRuntimeContext();
-    const event = appShell.stripe.verifyWebhookEvent(input.payload, input.signatureHeader);
-
-    if (event.type === "checkout.session.completed") {
-      const object = event.data.object as Record;
-      const organizationId =
-        stringFromMetadata(object.metadata, "organizationId") ??
-        (await findOrganizationIdForStripeEvent(
-          c,
-          typeof object.customer === "string" ? object.customer : null,
-          typeof object.subscription === "string" ? object.subscription : null,
-        ));
-      if (organizationId) {
-        const organization = await getOrCreateOrganization(c, organizationId);
-        if (typeof object.customer === "string") {
-          await organization.send(
-            organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"),
-            { customerId: object.customer },
-            { wait: true, timeout: 10_000 },
-          );
-        }
-        await upsertStripeLookupEntries(
-          c,
-          organizationId,
-          typeof object.customer === "string" ? object.customer : null,
-          typeof object.subscription === "string" ? object.subscription : null,
-        );
-      }
-      return { ok: true };
-    }
-
-    if (event.type === "customer.subscription.updated" || event.type === "customer.subscription.created") {
-      const subscription = stripeWebhookSubscription(event);
-      const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id);
-      if (organizationId) {
-        const organizationHandle = await getOrCreateOrganization(c, organizationId);
-        const organizationState = await getOrganizationState(organizationHandle);
-        await applySubscriptionState(
-          organizationHandle,
-          subscription,
-          appShell.stripe.planIdForPriceId(subscription.priceId ?? "") ?? organizationState.billingPlanId,
-        );
-        await upsertStripeLookupEntries(c, organizationId, subscription.customerId, subscription.id);
-      }
-      return { ok: true };
-    }
-
-    if (event.type === "customer.subscription.deleted") {
-      const subscription = stripeWebhookSubscription(event);
-      const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id);
-      if (organizationId) {
-        const organization = await getOrCreateOrganization(c, organizationId);
-        await organization.send(
-          organizationWorkflowQueueName("organization.command.billing.free_plan.apply"),
-          { clearSubscription: true },
-          { wait: true, timeout: 10_000 },
-        );
-      }
-      return { ok: true };
-    }
-
-    if (event.type === "invoice.paid" || event.type === "invoice.payment_failed") {
-      const invoice = event.data.object as Record;
-      const organizationId = await findOrganizationIdForStripeEvent(c, typeof invoice.customer === "string" ? invoice.customer : null, null);
-      if (organizationId) {
-        const organization = await getOrCreateOrganization(c, organizationId);
-        const rawAmount = typeof invoice.amount_paid === "number" ? invoice.amount_paid : invoice.amount_due;
-        const amountUsd = Math.round((typeof rawAmount === "number" ? rawAmount : 0) / 100);
-        await organization.send(
-          organizationWorkflowQueueName("organization.command.billing.invoice.upsert"),
-          {
-            id: String(invoice.id),
-            label: typeof invoice.number === "string" ? `Invoice ${invoice.number}` : "Stripe invoice",
-            issuedAt: formatUnixDate(typeof invoice.created === "number" ? invoice.created : Math.floor(Date.now() / 1000)),
-            amountUsd: Number.isFinite(amountUsd) ? amountUsd : 0,
-            status: event.type === "invoice.paid" ? "paid" : "open",
-          },
-          { wait: true, timeout: 10_000 },
-        );
-      }
-    }
-
-    return { ok: true };
-  },
-
-  async handleAppGithubWebhook(c: any, input: { payload: string; signatureHeader: string | null; eventHeader: string | null }): Promise<{ ok: true }> {
-    assertAppOrganization(c);
-    const { appShell } = getActorRuntimeContext();
-    const { event, body } = appShell.github.verifyWebhookEvent(input.payload, input.signatureHeader, input.eventHeader);
-
-    const accountLogin = body.installation?.account?.login ?? body.repository?.owner?.login ?? body.organization?.login ?? null;
-    const accountType = body.installation?.account?.type ?? (body.organization?.login ? "Organization" : null);
-    if (!accountLogin) {
-      githubWebhookLogger.info(
-        {
-          event,
-          action: body.action ?? null,
-          reason: "missing_installation_account",
-        },
-        "ignored",
-      );
-      return { ok: true };
-    }
-
-    const kind: FoundryOrganization["kind"] = accountType === "User" ? "personal" : "organization";
-    const organizationId = organizationOrganizationId(kind, accountLogin);
-    const receivedAt = Date.now();
-    const organization = await getOrCreateOrganization(c, organizationId);
-    await organization.send(
-      organizationWorkflowQueueName("organization.command.github.webhook_receipt.record"),
-      { organizationId, event, action: body.action ?? null, receivedAt },
-      { wait: false },
-    );
-    const githubData = await getOrCreateGithubData(c, organizationId);
-
-    if (event === "installation" && (body.action === "created" || body.action === "deleted" || body.action === "suspend" || body.action === "unsuspend")) {
-      githubWebhookLogger.info(
-        {
-          event,
-          action: body.action,
-          accountLogin,
-          organizationId,
-        },
-        "installation_event",
-      );
-      if (body.action === "deleted") {
-        await githubData.send(
-          githubDataWorkflowQueueName("githubData.command.clearState"),
-          { connectedAccount: accountLogin, installationStatus: "install_required", installationId: null, label: "GitHub App installation removed" },
-          { wait: false },
-        );
-      } else if (body.action === "created") {
-        void githubData
-          .send(
-            githubDataWorkflowQueueName("githubData.command.syncRepos"),
-            {
-              connectedAccount: accountLogin,
-              installationStatus: "connected",
-              installationId: body.installation?.id ?? null,
-              githubLogin: accountLogin,
-              kind,
-              label: "Syncing GitHub data from installation webhook...",
-            },
-            { wait: false },
-          )
-          .catch(() => {});
-      } else if (body.action === "suspend") {
-        await githubData.send(
-          githubDataWorkflowQueueName("githubData.command.clearState"),
-          {
-            connectedAccount: accountLogin,
-            installationStatus: "reconnect_required",
-            installationId: body.installation?.id ?? null,
-            label: "GitHub App installation suspended",
-          },
-          { wait: false },
-        );
-      } else if (body.action === "unsuspend") {
-        void githubData
-          .send(
-            githubDataWorkflowQueueName("githubData.command.syncRepos"),
-            {
-              connectedAccount: accountLogin,
-              installationStatus: "connected",
-              installationId: body.installation?.id ?? null,
-              githubLogin: accountLogin,
-              kind,
-              label: "Resyncing GitHub data after unsuspend...",
-            },
-            { wait: false },
-          )
-          .catch(() => {});
-      }
-      return { ok: true };
-    }
-
-    if (event === "installation_repositories") {
-      githubWebhookLogger.info(
-        {
-          event,
-          action: body.action ?? null,
-          accountLogin,
-          organizationId,
-          repositoriesAdded: body.repositories_added?.length ?? 0,
-          repositoriesRemoved: body.repositories_removed?.length ?? 0,
-        },
-        "repository_membership_changed",
-      );
-      void githubData
-        .send(
-          githubDataWorkflowQueueName("githubData.command.syncRepos"),
-          {
-            connectedAccount: accountLogin,
-            installationStatus: "connected",
-            installationId: body.installation?.id ?? null,
-            githubLogin: accountLogin,
-            kind,
-            label: "Resyncing GitHub data after repository access change...",
-          },
-          { wait: false },
-        )
-        .catch(() => {});
-      return { ok: true };
-    }
-
-    if (
-      event === "push" ||
-      event === "pull_request" ||
-      event === "pull_request_review" ||
-      event === "pull_request_review_comment" ||
-      event === "check_run" ||
-      event === "check_suite" ||
-      event === "status" ||
-      event === "create" ||
-      event === "delete"
-    ) {
-      const repoFullName = body.repository?.full_name;
-      if (repoFullName) {
-        githubWebhookLogger.info(
-          {
-            event,
-            action: body.action ?? null,
-            accountLogin,
-            organizationId,
-            repoFullName,
-          },
-          "repository_event",
-        );
-        if (event === "pull_request" && body.repository?.clone_url && body.pull_request) {
-          await githubData.send(
-            githubDataWorkflowQueueName("githubData.command.handlePullRequestWebhook"),
-            {
-              connectedAccount: accountLogin,
-              installationStatus: "connected",
-              installationId: body.installation?.id ?? null,
-              repository: {
-                fullName: body.repository.full_name,
-                cloneUrl: body.repository.clone_url,
-                private: Boolean(body.repository.private),
-              },
-              pullRequest: {
-                number: body.pull_request.number,
-                status: body.pull_request.draft ? "draft" : "ready",
-                title: body.pull_request.title ?? "",
-                body: body.pull_request.body ?? null,
-                state: body.pull_request.state ?? "open",
-                url: body.pull_request.html_url ?? `https://github.com/${body.repository.full_name}/pull/${body.pull_request.number}`,
-                headRefName: body.pull_request.head?.ref ?? "",
-                baseRefName: body.pull_request.base?.ref ?? "",
-                authorLogin: body.pull_request.user?.login ?? null,
-                isDraft: Boolean(body.pull_request.draft),
-                merged: Boolean(body.pull_request.merged),
-              },
-            },
-            { wait: false },
-          );
-        }
-      }
-      return { ok: true };
-    }
-
-    githubWebhookLogger.info(
-      {
-        event,
-        action: body.action ?? null,
-        accountLogin,
-        organizationId,
-      },
-      "unhandled_event",
-    );
-    return { ok: true };
-  },
-};
-
-export async function syncOrganizationShellFromGithubMutation(
-  c: any,
-  input: {
-    userId: string;
-    userName: string;
-    userEmail: string;
-    githubUserLogin: string;
-    githubAccountId: string;
-    githubLogin: string;
-    githubAccountType: string;
-    kind: FoundryOrganization["kind"];
-    displayName: string;
-    installationId: number | null;
-    appConfigured: boolean;
-  },
-): Promise<{ organizationId: string }> {
-  assertOrganizationShell(c);
-  const now = Date.now();
-  const existing = await readOrganizationProfileRow(c);
-  const slug = existing?.slug ?? slugify(input.githubLogin);
-  const organizationId = organizationOrganizationId(input.kind, input.githubLogin);
-  if (organizationId !== c.state.organizationId) {
-    throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} github=${organizationId}`);
-  }
-
-  const installationStatus =
-    input.kind === "personal" ? "connected" : input.installationId ? "connected" : input.appConfigured ? "install_required" : "reconnect_required";
-  const syncStatus = existing?.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(existing?.repoImportStatus);
-  const lastSyncLabel =
-    syncStatus === "synced"
-      ? existing.githubLastSyncLabel
-      : installationStatus === "connected"
-        ? "Waiting for first import"
-        : installationStatus === "install_required"
-          ? "GitHub App installation required"
-          : "GitHub App configuration incomplete";
-  const hasStripeBillingState = Boolean(existing?.stripeCustomerId || existing?.stripeSubscriptionId || existing?.stripePriceId);
-  const defaultBillingPlanId = input.kind === "personal" || !hasStripeBillingState ? "free" : (existing?.billingPlanId ?? "team");
-  const defaultSeatsIncluded = input.kind === "personal" || !hasStripeBillingState ? 1 : (existing?.billingSeatsIncluded ?? 5);
-  const defaultPaymentMethodLabel =
-    input.kind === "personal"
-      ? "No card required"
-      : hasStripeBillingState
-        ? (existing?.billingPaymentMethodLabel ?? "Payment method on file")
-        : "No payment method on file";
-
-  await c.db
-    .insert(organizationProfile)
-    .values({
-      id: PROFILE_ROW_ID,
-      kind: input.kind,
-      githubAccountId: input.githubAccountId,
-      githubLogin: input.githubLogin,
-      githubAccountType: input.githubAccountType,
-      displayName: input.displayName,
-      slug,
-      defaultModel: existing?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID,
-      primaryDomain: existing?.primaryDomain ?? (input.kind === "personal" ? "personal" : `${slug}.github`),
-      autoImportRepos: existing?.autoImportRepos ?? 1,
-      repoImportStatus: existing?.repoImportStatus ?? "not_started",
-      githubConnectedAccount: input.githubLogin,
-      githubInstallationStatus: installationStatus,
-      githubSyncStatus: syncStatus,
-      githubInstallationId: input.installationId,
-      githubLastSyncLabel: lastSyncLabel,
-      githubLastSyncAt: existing?.githubLastSyncAt ?? null,
-      githubSyncGeneration: existing?.githubSyncGeneration ?? 0,
-      githubSyncPhase: existing?.githubSyncPhase ?? null,
-      githubProcessedRepositoryCount: existing?.githubProcessedRepositoryCount ?? 0,
-      githubTotalRepositoryCount: existing?.githubTotalRepositoryCount ?? 0,
-      stripeCustomerId: existing?.stripeCustomerId ?? null,
-      stripeSubscriptionId: existing?.stripeSubscriptionId ?? null,
-      stripePriceId: existing?.stripePriceId ?? null,
-      billingPlanId: defaultBillingPlanId,
-      billingStatus: existing?.billingStatus ?? "active",
-      billingSeatsIncluded: defaultSeatsIncluded,
-      billingTrialEndsAt: existing?.billingTrialEndsAt ?? null,
-      billingRenewalAt: existing?.billingRenewalAt ?? null,
-      billingPaymentMethodLabel: defaultPaymentMethodLabel,
-      createdAt: existing?.createdAt ?? now,
-      updatedAt: now,
-    })
-    .onConflictDoUpdate({
-      target: organizationProfile.id,
-      set: {
-        kind: input.kind,
-        githubAccountId: input.githubAccountId,
-        githubLogin: input.githubLogin,
-        githubAccountType: input.githubAccountType,
-        displayName: input.displayName,
-        githubConnectedAccount: input.githubLogin,
-        githubInstallationStatus: installationStatus,
-        githubSyncStatus: syncStatus,
-        githubInstallationId: input.installationId,
-        githubLastSyncLabel: lastSyncLabel,
-        githubLastSyncAt: existing?.githubLastSyncAt ?? null,
-        githubSyncGeneration: existing?.githubSyncGeneration ?? 0,
-        githubSyncPhase: existing?.githubSyncPhase ?? null,
-        githubProcessedRepositoryCount: existing?.githubProcessedRepositoryCount ?? 0,
-        githubTotalRepositoryCount: existing?.githubTotalRepositoryCount ?? 0,
-        billingPlanId: defaultBillingPlanId,
-        billingSeatsIncluded: defaultSeatsIncluded,
-        billingPaymentMethodLabel: defaultPaymentMethodLabel,
-        updatedAt: now,
-      },
-    })
-    .run();
-
-  await c.db
-    .insert(organizationMembers)
-    .values({
-      id: input.userId,
-      name: input.userName,
-      email: input.userEmail,
-      role: input.kind === "personal" ? "owner" : "admin",
-      state: "active",
-      updatedAt: now,
-    })
-    .onConflictDoUpdate({
-      target: organizationMembers.id,
-      set: {
-        name: input.userName,
-        email: input.userEmail,
-        role: input.kind === "personal" ? "owner" : "admin",
-        state: "active",
-        updatedAt: now,
-      },
-    })
-    .run();
-
-  // Auto-trigger github-data sync when the org has a connected installation
-  // but hasn't synced yet. This handles the common case where a personal
-  // account or an org with an existing GitHub App installation signs in for
-  // the first time on a fresh DB — the installation webhook already fired
-  // before the org actor existed, so we kick off the sync here instead.
-  const needsInitialSync = installationStatus === "connected" && syncStatus === "pending";
-  if (needsInitialSync) {
-    const githubData = await getOrCreateGithubData(c, organizationId);
-    void githubData
-      .send(
-        githubDataWorkflowQueueName("githubData.command.syncRepos"),
-        {
-          connectedAccount: input.githubLogin,
-          installationStatus: "connected",
-          installationId: input.installationId,
-          githubLogin: input.githubLogin,
-          kind: input.kind,
-          label: "Initial repository sync...",
-        },
-        { wait: false },
-      )
-      .catch(() => {});
-  }
-
-  return { organizationId };
-}
-
-export async function updateOrganizationShellProfileMutation(
-  c: any,
-  input: Pick,
-): Promise {
-  assertOrganizationShell(c);
-  const existing = await requireOrganizationProfileRow(c);
-  await c.db
-    .update(organizationProfile)
-    .set({
-      displayName: input.displayName.trim() || existing.displayName,
-      slug: input.slug.trim() || existing.slug,
-      primaryDomain: input.primaryDomain.trim() || existing.primaryDomain,
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function markOrganizationSyncStartedMutation(c: any, input: { label: string }): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .update(organizationProfile)
-    .set({
-      githubSyncStatus: "syncing",
-      githubLastSyncLabel: input.label,
-      githubSyncPhase: "discovering_repositories",
-      githubProcessedRepositoryCount: 0,
-      githubTotalRepositoryCount: 0,
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function applyOrganizationStripeCustomerMutation(c: any, input: { customerId: string }): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .update(organizationProfile)
-    .set({
-      stripeCustomerId: input.customerId,
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function applyOrganizationStripeSubscriptionMutation(
-  c: any,
-  input: {
-    subscription: {
-      id: string;
-      customerId: string;
-      priceId: string | null;
-      status: string;
-      cancelAtPeriodEnd: boolean;
-      currentPeriodEnd: number | null;
-      trialEnd: number | null;
-      defaultPaymentMethodLabel: string;
-    };
-    fallbackPlanId: FoundryBillingPlanId;
-  },
-): Promise {
-  assertOrganizationShell(c);
-  const { appShell } = getActorRuntimeContext();
-  const planId = appShell.stripe.planIdForPriceId(input.subscription.priceId ?? "") ?? input.fallbackPlanId;
-  await c.db
-    .update(organizationProfile)
-    .set({
-      stripeCustomerId: input.subscription.customerId || null,
-      stripeSubscriptionId: input.subscription.id || null,
-      stripePriceId: input.subscription.priceId,
-      billingPlanId: planId,
-      billingStatus: stripeStatusToBillingStatus(input.subscription.status, input.subscription.cancelAtPeriodEnd),
-      billingSeatsIncluded: seatsIncludedForPlan(planId),
-      billingTrialEndsAt: input.subscription.trialEnd ? new Date(input.subscription.trialEnd * 1000).toISOString() : null,
-      billingRenewalAt: input.subscription.currentPeriodEnd ? new Date(input.subscription.currentPeriodEnd * 1000).toISOString() : null,
-      billingPaymentMethodLabel: input.subscription.defaultPaymentMethodLabel || "Payment method on file",
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function applyOrganizationFreePlanMutation(c: any, input: { clearSubscription: boolean }): Promise {
-  assertOrganizationShell(c);
-  const patch: Record = {
-    billingPlanId: "free",
-    billingStatus: "active",
-    billingSeatsIncluded: 1,
-    billingTrialEndsAt: null,
-    billingRenewalAt: null,
-    billingPaymentMethodLabel: "No card required",
-    updatedAt: Date.now(),
-  };
-  if (input.clearSubscription) {
-    patch.stripeSubscriptionId = null;
-    patch.stripePriceId = null;
-  }
-  await c.db.update(organizationProfile).set(patch).where(eq(organizationProfile.id, PROFILE_ROW_ID)).run();
-}
-
-export async function setOrganizationBillingPaymentMethodMutation(c: any, input: { label: string }): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .update(organizationProfile)
-    .set({
-      billingPaymentMethodLabel: input.label,
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function setOrganizationBillingStatusMutation(c: any, input: { status: FoundryBillingState["status"] }): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .update(organizationProfile)
-    .set({
-      billingStatus: input.status,
-      updatedAt: Date.now(),
-    })
-    .where(eq(organizationProfile.id, PROFILE_ROW_ID))
-    .run();
-}
-
-export async function upsertOrganizationInvoiceMutation(
-  c: any,
-  input: { id: string; label: string; issuedAt: string; amountUsd: number; status: "paid" | "open" },
-): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .insert(invoices)
-    .values({
-      id: input.id,
-      label: input.label,
-      issuedAt: input.issuedAt,
-      amountUsd: input.amountUsd,
-      status: input.status,
-      createdAt: Date.now(),
-    })
-    .onConflictDoUpdate({
-      target: invoices.id,
-      set: {
-        label: input.label,
-        issuedAt: input.issuedAt,
-        amountUsd: input.amountUsd,
-        status: input.status,
-      },
-    })
-    .run();
-}
-
-export async function recordOrganizationSeatUsageMutation(c: any, input: { email: string }): Promise {
-  assertOrganizationShell(c);
-  await c.db
-    .insert(seatAssignments)
-    .values({
-      email: input.email,
-      createdAt: Date.now(),
-    })
-    .onConflictDoNothing()
-    .run();
-}
diff --git a/foundry/packages/backend/src/actors/organization/constants.ts b/foundry/packages/backend/src/actors/organization/constants.ts
deleted file mode 100644
index 0b8e3c0..0000000
--- a/foundry/packages/backend/src/actors/organization/constants.ts
+++ /dev/null
@@ -1 +0,0 @@
-export const APP_SHELL_ORGANIZATION_ID = "app";
diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle.config.ts b/foundry/packages/backend/src/actors/organization/db/drizzle.config.ts
deleted file mode 100644
index eb43667..0000000
--- a/foundry/packages/backend/src/actors/organization/db/drizzle.config.ts
+++ /dev/null
@@ -1,6 +0,0 @@
-import { defineConfig } from "rivetkit/db/drizzle";
-
-export default defineConfig({
-  out: "./src/actors/organization/db/drizzle",
-  schema: "./src/actors/organization/db/schema.ts",
-});
diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql b/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql
deleted file mode 100644
index fcd1b60..0000000
--- a/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql
+++ /dev/null
@@ -1,50 +0,0 @@
-CREATE TABLE IF NOT EXISTS `auth_session_index` (
-  `session_id` text PRIMARY KEY NOT NULL,
-  `session_token` text NOT NULL,
-  `user_id` text NOT NULL,
-  `created_at` integer NOT NULL,
-  `updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE IF NOT EXISTS `auth_email_index` (
-  `email` text PRIMARY KEY NOT NULL,
-  `user_id` text NOT NULL,
-  `updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE IF NOT EXISTS `auth_account_index` (
-  `id` text PRIMARY KEY NOT NULL,
-  `provider_id` text NOT NULL,
-  `account_id` text NOT NULL,
-  `user_id` text NOT NULL,
-  `updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE IF NOT EXISTS `auth_verification` (
-  `id` text PRIMARY KEY NOT NULL,
-  `identifier` text NOT NULL,
-  `value` text NOT NULL,
-  `expires_at` integer NOT NULL,
-  `created_at` integer NOT NULL,
-  `updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE IF NOT EXISTS `task_index` (
-  `task_id` text PRIMARY KEY NOT NULL,
-  `repo_id` text NOT NULL,
-  `branch_name` text,
-  `created_at` integer NOT NULL,
-  `updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE IF NOT EXISTS `task_summaries` (
-  `task_id` text PRIMARY KEY NOT NULL,
-  `repo_id` text NOT NULL,
-  `title` text NOT NULL,
-  `status` text NOT NULL,
-  `repo_name` text NOT NULL,
-  `updated_at_ms` integer NOT NULL,
-  `branch` text,
-  `pull_request_json` text,
-  `sessions_summary_json` text DEFAULT '[]' NOT NULL
-);
diff --git a/foundry/packages/backend/src/actors/organization/db/schema.ts b/foundry/packages/backend/src/actors/organization/db/schema.ts
deleted file mode 100644
index 3978a5f..0000000
--- a/foundry/packages/backend/src/actors/organization/db/schema.ts
+++ /dev/null
@@ -1,160 +0,0 @@
-import { check, integer, sqliteTable, text } from "rivetkit/db/drizzle";
-import { sql } from "drizzle-orm";
-import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared";
-
-// SQLite is per organization actor instance, so no organizationId column needed.
-
-/**
- * Coordinator index of TaskActor instances.
- * The organization actor is the direct coordinator for tasks (not a per-repo
- * actor) because the sidebar needs to query all tasks across all repos on
- * every snapshot. With many repos, fanning out to N repo actors on the hot
- * read path is too expensive — owning the index here keeps that a single
- * local table scan. Each row maps a taskId to its repo and immutable branch
- * name. Used for branch conflict checking (scoped by repoId) and
- * task-by-branch lookups.
- */
-export const taskIndex = sqliteTable("task_index", {
-  taskId: text("task_id").notNull().primaryKey(),
-  repoId: text("repo_id").notNull(),
-  branchName: text("branch_name"),
-  createdAt: integer("created_at").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-/**
- * Organization-owned materialized task summary projection.
- * Task actors push summary updates directly to the organization coordinator,
- * which keeps this table local for fast list/lookups without fan-out.
- * Same rationale as taskIndex: the sidebar repeatedly reads all tasks across
- * all repos, so the org must own the materialized view to avoid O(repos)
- * actor fan-out on the hot read path.
- */
-export const taskSummaries = sqliteTable("task_summaries", {
-  taskId: text("task_id").notNull().primaryKey(),
-  repoId: text("repo_id").notNull(),
-  title: text("title").notNull(),
-  status: text("status").notNull(),
-  repoName: text("repo_name").notNull(),
-  updatedAtMs: integer("updated_at_ms").notNull(),
-  branch: text("branch"),
-  pullRequestJson: text("pull_request_json"),
-  sessionsSummaryJson: text("sessions_summary_json").notNull().default("[]"),
-  primaryUserLogin: text("primary_user_login"),
-  primaryUserAvatarUrl: text("primary_user_avatar_url"),
-});
-
-export const organizationProfile = sqliteTable(
-  "organization_profile",
-  {
-    id: integer("id").primaryKey(),
-    kind: text("kind").notNull(),
-    githubAccountId: text("github_account_id").notNull(),
-    githubLogin: text("github_login").notNull(),
-    githubAccountType: text("github_account_type").notNull(),
-    displayName: text("display_name").notNull(),
-    slug: text("slug").notNull(),
-    defaultModel: text("default_model").notNull().default(DEFAULT_WORKSPACE_MODEL_ID),
-    primaryDomain: text("primary_domain").notNull(),
-    autoImportRepos: integer("auto_import_repos").notNull(),
-    repoImportStatus: text("repo_import_status").notNull(),
-    githubConnectedAccount: text("github_connected_account").notNull(),
-    githubInstallationStatus: text("github_installation_status").notNull(),
-    githubSyncStatus: text("github_sync_status").notNull(),
-    githubInstallationId: integer("github_installation_id"),
-    githubLastSyncLabel: text("github_last_sync_label").notNull(),
-    githubLastSyncAt: integer("github_last_sync_at"),
-    githubLastWebhookAt: integer("github_last_webhook_at"),
-    githubLastWebhookEvent: text("github_last_webhook_event"),
-    githubSyncGeneration: integer("github_sync_generation").notNull(),
-    githubSyncPhase: text("github_sync_phase"),
-    githubProcessedRepositoryCount: integer("github_processed_repository_count").notNull(),
-    githubTotalRepositoryCount: integer("github_total_repository_count").notNull(),
-    stripeCustomerId: text("stripe_customer_id"),
-    stripeSubscriptionId: text("stripe_subscription_id"),
-    stripePriceId: text("stripe_price_id"),
-    billingPlanId: text("billing_plan_id").notNull(),
-    billingStatus: text("billing_status").notNull(),
-    billingSeatsIncluded: integer("billing_seats_included").notNull(),
-    billingTrialEndsAt: text("billing_trial_ends_at"),
-    billingRenewalAt: text("billing_renewal_at"),
-    billingPaymentMethodLabel: text("billing_payment_method_label").notNull(),
-    createdAt: integer("created_at").notNull(),
-    updatedAt: integer("updated_at").notNull(),
-  },
-  (table) => [check("organization_profile_singleton_id_check", sql`${table.id} = 1`)],
-);
-
-export const organizationMembers = sqliteTable("organization_members", {
-  id: text("id").notNull().primaryKey(),
-  name: text("name").notNull(),
-  email: text("email").notNull(),
-  role: text("role").notNull(),
-  state: text("state").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-export const seatAssignments = sqliteTable("seat_assignments", {
-  email: text("email").notNull().primaryKey(),
-  createdAt: integer("created_at").notNull(),
-});
-
-export const invoices = sqliteTable("invoices", {
-  id: text("id").notNull().primaryKey(),
-  label: text("label").notNull(),
-  issuedAt: text("issued_at").notNull(),
-  amountUsd: integer("amount_usd").notNull(),
-  status: text("status").notNull(),
-  createdAt: integer("created_at").notNull(),
-});
-
-/**
- * Coordinator index of AuthUserActor instances — routes session token → userId.
- * Better Auth adapter uses this to resolve which user actor to query
- * before the user identity is known.
- */
-export const authSessionIndex = sqliteTable("auth_session_index", {
-  sessionId: text("session_id").notNull().primaryKey(),
-  sessionToken: text("session_token").notNull(),
-  userId: text("user_id").notNull(),
-  createdAt: integer("created_at").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-/**
- * Coordinator index of AuthUserActor instances — routes email → userId.
- * Better Auth adapter uses this to resolve which user actor to query.
- */
-export const authEmailIndex = sqliteTable("auth_email_index", {
-  email: text("email").notNull().primaryKey(),
-  userId: text("user_id").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-/**
- * Coordinator index of AuthUserActor instances — routes OAuth account → userId.
- * Better Auth adapter uses this to resolve which user actor to query.
- */
-export const authAccountIndex = sqliteTable("auth_account_index", {
-  id: text("id").notNull().primaryKey(),
-  providerId: text("provider_id").notNull(),
-  accountId: text("account_id").notNull(),
-  userId: text("user_id").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */
-export const authVerification = sqliteTable("auth_verification", {
-  id: text("id").notNull().primaryKey(),
-  identifier: text("identifier").notNull(),
-  value: text("value").notNull(),
-  expiresAt: integer("expires_at").notNull(),
-  createdAt: integer("created_at").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
-
-export const stripeLookup = sqliteTable("stripe_lookup", {
-  lookupKey: text("lookup_key").notNull().primaryKey(),
-  organizationId: text("organization_id").notNull(),
-  updatedAt: integer("updated_at").notNull(),
-});
diff --git a/foundry/packages/backend/src/actors/organization/index.ts b/foundry/packages/backend/src/actors/organization/index.ts
deleted file mode 100644
index 9ceb27f..0000000
--- a/foundry/packages/backend/src/actors/organization/index.ts
+++ /dev/null
@@ -1,23 +0,0 @@
-import { actor, queue } from "rivetkit";
-import { workflow } from "rivetkit/workflow";
-import { organizationDb } from "./db/db.js";
-import { organizationActions } from "./actions.js";
-import { runOrganizationWorkflow } from "./workflow.js";
-import { ORGANIZATION_QUEUE_NAMES } from "./queues.js";
-
-export const organization = actor({
-  db: organizationDb,
-  queues: Object.fromEntries(ORGANIZATION_QUEUE_NAMES.map((name) => [name, queue()])),
-  options: {
-    name: "Organization",
-    icon: "compass",
-    actionTimeout: 5 * 60_000,
-  },
-  createState: (_c, organizationId: string) => ({
-    organizationId,
-  }),
-  actions: {
-    ...organizationActions,
-  },
-  run: workflow(runOrganizationWorkflow),
-});
diff --git a/foundry/packages/backend/src/actors/organization/queues.ts b/foundry/packages/backend/src/actors/organization/queues.ts
deleted file mode 100644
index 2e67dc5..0000000
--- a/foundry/packages/backend/src/actors/organization/queues.ts
+++ /dev/null
@@ -1,26 +0,0 @@
-export const ORGANIZATION_QUEUE_NAMES = [
-  "organization.command.createTask",
-  "organization.command.materializeTask",
-  "organization.command.applyTaskSummaryUpdate",
-  "organization.command.removeTaskSummary",
-  "organization.command.refreshTaskSummaryForBranch",
-  "organization.command.snapshot.broadcast",
-  "organization.command.syncGithubSession",
-  "organization.command.github.organization_shell.sync_from_github",
-  "organization.command.github.sync_progress.apply",
-  "organization.command.github.webhook_receipt.record",
-  "organization.command.shell.sync_started.mark",
-  "organization.command.billing.stripe_customer.apply",
-  "organization.command.billing.stripe_subscription.apply",
-  "organization.command.billing.free_plan.apply",
-  "organization.command.billing.payment_method.set",
-  "organization.command.billing.status.set",
-  "organization.command.billing.invoice.upsert",
-  "organization.command.billing.seat_usage.record",
-] as const;
-
-export type OrganizationQueueName = (typeof ORGANIZATION_QUEUE_NAMES)[number];
-
-export function organizationWorkflowQueueName(name: OrganizationQueueName): OrganizationQueueName {
-  return name;
-}
diff --git a/foundry/packages/backend/src/actors/organization/workflow.ts b/foundry/packages/backend/src/actors/organization/workflow.ts
deleted file mode 100644
index e62e80d..0000000
--- a/foundry/packages/backend/src/actors/organization/workflow.ts
+++ /dev/null
@@ -1,164 +0,0 @@
-// @ts-nocheck
-/**
- * Organization workflow — queue-based command loop.
- *
- * Mutations are dispatched through named queues and processed inside workflow
- * steps so that every command appears in the RivetKit inspector's workflow
- * history. Read actions remain direct (no queue).
- *
- * Callers send commands directly via `.send()` to the appropriate queue name.
- */
-import { Loop } from "rivetkit/workflow";
-import { logActorWarning, resolveErrorMessage } from "../logging.js";
-import { ORGANIZATION_QUEUE_NAMES, type OrganizationQueueName } from "./queues.js";
-
-import { applyGithubSyncProgressMutation, recordGithubWebhookReceiptMutation, refreshOrganizationSnapshotMutation } from "./actions.js";
-import {
-  applyTaskSummaryUpdateMutation,
-  createTaskMutation,
-  refreshTaskSummaryForBranchMutation,
-  removeTaskSummaryMutation,
-} from "./actions/task-mutations.js";
-import {
-  applyOrganizationFreePlanMutation,
-  applyOrganizationStripeCustomerMutation,
-  applyOrganizationStripeSubscriptionMutation,
-  markOrganizationSyncStartedMutation,
-  recordOrganizationSeatUsageMutation,
-  setOrganizationBillingPaymentMethodMutation,
-  setOrganizationBillingStatusMutation,
-  syncOrganizationShellFromGithubMutation,
-  upsertOrganizationInvoiceMutation,
-} from "./app-shell.js";
-
-// ---------------------------------------------------------------------------
-// Workflow command loop — runs inside `run: workflow(runOrganizationWorkflow)`
-// ---------------------------------------------------------------------------
-
-type WorkflowHandler = (loopCtx: any, body: any) => Promise;
-
-/**
- * Maps queue names to their mutation handlers.
- * Each handler receives the workflow loop context and the message body,
- * executes the mutation, and returns the result (which is sent back via
- * msg.complete).
- */ -const COMMAND_HANDLERS: Record = { - // Task mutations - "organization.command.createTask": async (c, body) => createTaskMutation(c, body), - "organization.command.materializeTask": async (c, body) => createTaskMutation(c, body), - "organization.command.applyTaskSummaryUpdate": async (c, body) => { - await applyTaskSummaryUpdateMutation(c, body); - return { ok: true }; - }, - "organization.command.removeTaskSummary": async (c, body) => { - await removeTaskSummaryMutation(c, body); - return { ok: true }; - }, - "organization.command.refreshTaskSummaryForBranch": async (c, body) => { - await refreshTaskSummaryForBranchMutation(c, body); - return { ok: true }; - }, - "organization.command.snapshot.broadcast": async (c, _body) => { - await refreshOrganizationSnapshotMutation(c); - return { ok: true }; - }, - "organization.command.syncGithubSession": async (c, body) => { - const { syncGithubOrganizations } = await import("./app-shell.js"); - await syncGithubOrganizations(c, body); - return { ok: true }; - }, - - // GitHub organization shell sync (stays on queue) - "organization.command.github.organization_shell.sync_from_github": async (c, body) => syncOrganizationShellFromGithubMutation(c, body), - - // GitHub sync progress + webhook receipt - "organization.command.github.sync_progress.apply": async (c, body) => { - await applyGithubSyncProgressMutation(c, body); - return { ok: true }; - }, - "organization.command.github.webhook_receipt.record": async (c, body) => { - await recordGithubWebhookReceiptMutation(c, body); - return { ok: true }; - }, - "organization.command.shell.sync_started.mark": async (c, body) => { - await markOrganizationSyncStartedMutation(c, body); - return { ok: true }; - }, - - // Billing mutations - "organization.command.billing.stripe_customer.apply": async (c, body) => { - await applyOrganizationStripeCustomerMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.stripe_subscription.apply": async (c, body) => { - 
await applyOrganizationStripeSubscriptionMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.free_plan.apply": async (c, body) => { - await applyOrganizationFreePlanMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.payment_method.set": async (c, body) => { - await setOrganizationBillingPaymentMethodMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.status.set": async (c, body) => { - await setOrganizationBillingStatusMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.invoice.upsert": async (c, body) => { - await upsertOrganizationInvoiceMutation(c, body); - return { ok: true }; - }, - "organization.command.billing.seat_usage.record": async (c, body) => { - await recordOrganizationSeatUsageMutation(c, body); - return { ok: true }; - }, -}; - -export async function runOrganizationWorkflow(ctx: any): Promise { - await ctx.loop("organization-command-loop", async (loopCtx: any) => { - const msg = await loopCtx.queue.next("next-organization-command", { - names: [...ORGANIZATION_QUEUE_NAMES], - completable: true, - }); - - if (!msg) { - return Loop.continue(undefined); - } - - const handler = COMMAND_HANDLERS[msg.name as OrganizationQueueName]; - if (!handler) { - logActorWarning("organization", "unknown organization command", { command: msg.name }); - await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {}); - return Loop.continue(undefined); - } - - try { - // Wrap in a step so c.state and c.db are accessible inside mutation functions. 
- const result = await loopCtx.step({ - name: msg.name, - timeout: 10 * 60_000, - run: async () => handler(loopCtx, msg.body), - }); - try { - await msg.complete(result); - } catch (completeError) { - logActorWarning("organization", "organization workflow failed completing response", { - command: msg.name, - error: resolveErrorMessage(completeError), - }); - } - } catch (error) { - const message = resolveErrorMessage(error); - logActorWarning("organization", "organization workflow command failed", { - command: msg.name, - error: message, - }); - await msg.complete({ error: message }).catch(() => {}); - } - - return Loop.continue(undefined); - }); -} diff --git a/foundry/packages/backend/src/actors/project-branch-sync/index.ts b/foundry/packages/backend/src/actors/project-branch-sync/index.ts new file mode 100644 index 0000000..3b20941 --- /dev/null +++ b/foundry/packages/backend/src/actors/project-branch-sync/index.ts @@ -0,0 +1,178 @@ +import { actor, queue } from "rivetkit"; +import { workflow } from "rivetkit/workflow"; +import type { GitDriver } from "../../driver.js"; +import { getActorRuntimeContext } from "../context.js"; +import { getProject, selfProjectBranchSync } from "../handles.js"; +import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js"; +import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js"; +import { parentLookupFromStack } from "../project/stack-model.js"; +import { withRepoGitLock } from "../../services/repo-git-lock.js"; + +export interface ProjectBranchSyncInput { + workspaceId: string; + repoId: string; + repoPath: string; + intervalMs: number; +} + +interface SetIntervalCommand { + intervalMs: number; +} + +interface EnrichedBranchSnapshot { + branchName: string; + commitSha: string; + parentBranch: string | null; + trackedInStack: boolean; + diffStat: string | null; + hasUnpushed: boolean; + conflictsWithMain: boolean; +} + +interface ProjectBranchSyncState extends 
PollingControlState {
+  workspaceId: string;
+  repoId: string;
+  repoPath: string;
+}
+
+const CONTROL = {
+  start: "project.branch_sync.control.start",
+  stop: "project.branch_sync.control.stop",
+  setInterval: "project.branch_sync.control.set_interval",
+  force: "project.branch_sync.control.force",
+} as const;
+
+async function enrichBranches(workspaceId: string, repoId: string, repoPath: string, git: GitDriver): Promise<EnrichedBranchSnapshot[]> {
+  return await withRepoGitLock(repoPath, async () => {
+    await git.fetch(repoPath);
+    const branches = await git.listRemoteBranches(repoPath);
+    const { driver } = getActorRuntimeContext();
+    const stackEntries = await driver.stack.listStack(repoPath).catch(() => []);
+    const parentByBranch = parentLookupFromStack(stackEntries);
+    const enriched: EnrichedBranchSnapshot[] = [];
+
+    const baseRef = await git.remoteDefaultBaseRef(repoPath);
+    const baseSha = await git.revParse(repoPath, baseRef).catch(() => "");
+
+    for (const branch of branches) {
+      let branchDiffStat: string | null = null;
+      let branchHasUnpushed = false;
+      let branchConflicts = false;
+
+      try {
+        branchDiffStat = await git.diffStatForBranch(repoPath, branch.branchName);
+      } catch (error) {
+        logActorWarning("project-branch-sync", "diffStatForBranch failed", {
+          workspaceId,
+          repoId,
+          branchName: branch.branchName,
+          error: resolveErrorMessage(error),
+        });
+        branchDiffStat = null;
+      }
+
+      try {
+        const headSha = await git.revParse(repoPath, `origin/${branch.branchName}`);
+        branchHasUnpushed = Boolean(baseSha && headSha && headSha !== baseSha);
+      } catch (error) {
+        logActorWarning("project-branch-sync", "revParse failed", {
+          workspaceId,
+          repoId,
+          branchName: branch.branchName,
+          error: resolveErrorMessage(error),
+        });
+        branchHasUnpushed = false;
+      }
+
+      try {
+        branchConflicts = await git.conflictsWithMain(repoPath, branch.branchName);
+      } catch (error) {
+        logActorWarning("project-branch-sync", "conflictsWithMain failed", {
+          workspaceId,
+          repoId,
+          branchName: branch.branchName,
+          error: resolveErrorMessage(error),
+        });
+        branchConflicts = false;
+      }
+
+      enriched.push({
+        branchName: branch.branchName,
+        commitSha: branch.commitSha,
+        parentBranch: parentByBranch.get(branch.branchName) ?? null,
+        trackedInStack: parentByBranch.has(branch.branchName),
+        diffStat: branchDiffStat,
+        hasUnpushed: branchHasUnpushed,
+        conflictsWithMain: branchConflicts,
+      });
+    }
+
+    return enriched;
+  });
+}
+
+async function pollBranches(c: { state: ProjectBranchSyncState }): Promise<void> {
+  const { driver } = getActorRuntimeContext();
+  const enrichedItems = await enrichBranches(c.state.workspaceId, c.state.repoId, c.state.repoPath, driver.git);
+  const parent = getProject(c, c.state.workspaceId, c.state.repoId);
+  await parent.applyBranchSyncResult({ items: enrichedItems, at: Date.now() });
+}
+
+export const projectBranchSync = actor({
+  queues: {
+    [CONTROL.start]: queue(),
+    [CONTROL.stop]: queue(),
+    [CONTROL.setInterval]: queue(),
+    [CONTROL.force]: queue(),
+  },
+  options: {
+    name: "Project Branch Sync",
+    icon: "code-branch",
+    // Polling actors rely on timer-based wakeups; sleeping would pause the timer and stop polling.
+    noSleep: true,
+  },
+  createState: (_c, input: ProjectBranchSyncInput): ProjectBranchSyncState => ({
+    workspaceId: input.workspaceId,
+    repoId: input.repoId,
+    repoPath: input.repoPath,
+    intervalMs: input.intervalMs,
+    running: true,
+  }),
+  actions: {
+    async start(c): Promise<void> {
+      const self = selfProjectBranchSync(c);
+      await self.send(CONTROL.start, {}, { wait: true, timeout: 15_000 });
+    },
+
+    async stop(c): Promise<void> {
+      const self = selfProjectBranchSync(c);
+      await self.send(CONTROL.stop, {}, { wait: true, timeout: 15_000 });
+    },
+
+    async setIntervalMs(c, payload: SetIntervalCommand): Promise<void> {
+      const self = selfProjectBranchSync(c);
+      await self.send(CONTROL.setInterval, payload, { wait: true, timeout: 15_000 });
+    },
+
+    async force(c): Promise<void> {
+      const self = selfProjectBranchSync(c);
+      await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
+    },
+  },
+  run: workflow(async (ctx) => {
+    await runWorkflowPollingLoop(ctx, {
+      loopName: "project-branch-sync-loop",
+      control: CONTROL,
+      onPoll: async (loopCtx) => {
+        try {
+          await pollBranches(loopCtx);
+        } catch (error) {
+          logActorWarning("project-branch-sync", "poll failed", {
+            error: resolveErrorMessage(error),
+            stack: resolveErrorStack(error),
+          });
+        }
+      },
+    });
+  }),
+});
diff --git a/foundry/packages/backend/src/actors/project-pr-sync/index.ts b/foundry/packages/backend/src/actors/project-pr-sync/index.ts
new file mode 100644
index 0000000..f525d64
--- /dev/null
+++ b/foundry/packages/backend/src/actors/project-pr-sync/index.ts
@@ -0,0 +1,98 @@
+import { actor, queue } from "rivetkit";
+import { workflow } from "rivetkit/workflow";
+import { getActorRuntimeContext } from "../context.js";
+import { getProject, selfProjectPrSync } from "../handles.js";
+import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js";
+import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js";
+import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
+
+export interface ProjectPrSyncInput {
+  workspaceId: string;
+  repoId: string;
+  repoPath: string;
+  intervalMs: number;
+}
+
+interface SetIntervalCommand {
+  intervalMs: number;
+}
+
+interface ProjectPrSyncState extends PollingControlState {
+  workspaceId: string;
+  repoId: string;
+  repoPath: string;
+}
+
+const CONTROL = {
+  start: "project.pr_sync.control.start",
+  stop: "project.pr_sync.control.stop",
+  setInterval: "project.pr_sync.control.set_interval",
+  force: "project.pr_sync.control.force",
+} as const;
+
+async function pollPrs(c: { state: ProjectPrSyncState }): Promise<void> {
+  const { driver } = getActorRuntimeContext();
+  const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId);
+  const items = await driver.github.listPullRequests(c.state.repoPath, { githubToken: auth?.githubToken ?? null });
+  const parent = getProject(c, c.state.workspaceId, c.state.repoId);
+  await parent.applyPrSyncResult({ items, at: Date.now() });
+}
+
+export const projectPrSync = actor({
+  queues: {
+    [CONTROL.start]: queue(),
+    [CONTROL.stop]: queue(),
+    [CONTROL.setInterval]: queue(),
+    [CONTROL.force]: queue(),
+  },
+  options: {
+    name: "Project PR Sync",
+    icon: "code-merge",
+    // Polling actors rely on timer-based wakeups; sleeping would pause the timer and stop polling.
+    noSleep: true,
+  },
+  createState: (_c, input: ProjectPrSyncInput): ProjectPrSyncState => ({
+    workspaceId: input.workspaceId,
+    repoId: input.repoId,
+    repoPath: input.repoPath,
+    intervalMs: input.intervalMs,
+    running: true,
+  }),
+  actions: {
+    async start(c): Promise<void> {
+      const self = selfProjectPrSync(c);
+      await self.send(CONTROL.start, {}, { wait: true, timeout: 15_000 });
+    },
+
+    async stop(c): Promise<void> {
+      const self = selfProjectPrSync(c);
+      await self.send(CONTROL.stop, {}, { wait: true, timeout: 15_000 });
+    },
+
+    async setIntervalMs(c, payload: SetIntervalCommand): Promise<void> {
+      const self = selfProjectPrSync(c);
+      await self.send(CONTROL.setInterval, payload, { wait: true, timeout: 15_000 });
+    },
+
+    async force(c): Promise<void> {
+      const self = selfProjectPrSync(c);
+      await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
+    },
+  },
+  run: workflow(async (ctx) => {
+    await runWorkflowPollingLoop(ctx, {
+      loopName: "project-pr-sync-loop",
+      control: CONTROL,
+      onPoll: async (loopCtx) => {
+        try {
+          await pollPrs(loopCtx);
+        } catch (error) {
+          logActorWarning("project-pr-sync", "poll failed", {
+            error: resolveErrorMessage(error),
+            stack: resolveErrorStack(error),
+          });
+        }
+      },
+    });
+  }),
+});
diff --git a/foundry/packages/backend/src/actors/project/actions.ts b/foundry/packages/backend/src/actors/project/actions.ts
new file mode 100644
index 0000000..bcd8f36
--- /dev/null
+++ b/foundry/packages/backend/src/actors/project/actions.ts
@@ -0,0 +1,1353 @@
+// @ts-nocheck
+import { randomUUID } from "node:crypto";
+import { and, desc, eq, isNotNull, ne } from "drizzle-orm";
+import { Loop } from "rivetkit/workflow";
+import type { AgentType, TaskRecord, TaskSummary, ProviderId, RepoOverview, RepoStackAction, RepoStackActionResult } from "@sandbox-agent/foundry-shared";
+import { getActorRuntimeContext } from "../context.js";
+import { getTask, getOrCreateTask, getOrCreateHistory, getOrCreateProjectBranchSync, getOrCreateProjectPrSync,
selfProject } from "../handles.js"; +import { isActorNotFoundError, logActorWarning, resolveErrorMessage } from "../logging.js"; +import { foundryRepoClonePath } from "../../services/foundry-paths.js"; +import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js"; +import { expectQueueResponse } from "../../services/queue.js"; +import { withRepoGitLock } from "../../services/repo-git-lock.js"; +import { branches, taskIndex, prCache, repoActionJobs, repoMeta } from "./db/schema.js"; +import { deriveFallbackTitle } from "../../services/create-flow.js"; +import { normalizeBaseBranchName } from "../../integrations/git-spice/index.js"; +import { sortBranchesForOverview } from "./stack-model.js"; + +interface EnsureProjectCommand { + remoteUrl: string; +} + +interface EnsureProjectResult { + localPath: string; +} + +interface CreateTaskCommand { + task: string; + providerId: ProviderId; + agentType: AgentType | null; + explicitTitle: string | null; + explicitBranchName: string | null; + initialPrompt: string | null; + onBranch: string | null; +} + +interface HydrateTaskIndexCommand {} + +interface ListReservedBranchesCommand {} + +interface RegisterTaskBranchCommand { + taskId: string; + branchName: string; + requireExistingRemote?: boolean; +} + +interface ListTaskSummariesCommand { + includeArchived?: boolean; +} + +interface GetTaskEnrichedCommand { + taskId: string; +} + +interface GetPullRequestForBranchCommand { + branchName: string; +} + +interface PrSyncResult { + items: Array<{ + number: number; + headRefName: string; + state: string; + title: string; + url?: string; + author?: string; + isDraft?: boolean; + ciStatus?: string | null; + reviewStatus?: string | null; + reviewer?: string | null; + }>; + at: number; +} + +interface BranchSyncResult { + items: Array<{ + branchName: string; + commitSha: string; + parentBranch?: string | null; + trackedInStack?: boolean; + diffStat?: string | null; + hasUnpushed?: boolean; + conflictsWithMain?: boolean; + 
}>; + at: number; +} + +interface RepoOverviewCommand {} + +interface RunRepoStackActionCommand { + jobId?: string; + action: RepoStackAction; + branchName?: string; + parentBranch?: string; +} + +const PROJECT_QUEUE_NAMES = [ + "project.command.ensure", + "project.command.hydrateTaskIndex", + "project.command.createTask", + "project.command.registerTaskBranch", + "project.command.runRepoStackAction", + "project.command.applyPrSyncResult", + "project.command.applyBranchSyncResult", +] as const; + +type ProjectQueueName = (typeof PROJECT_QUEUE_NAMES)[number]; + +export { PROJECT_QUEUE_NAMES }; + +export function projectWorkflowQueueName(name: ProjectQueueName): ProjectQueueName { + return name; +} + +async function ensureLocalClone(c: any, remoteUrl: string): Promise { + const { config, driver } = getActorRuntimeContext(); + const localPath = foundryRepoClonePath(config, c.state.workspaceId, c.state.repoId); + const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId); + await driver.git.ensureCloned(remoteUrl, localPath, { githubToken: auth?.githubToken ?? 
null }); + c.state.localPath = localPath; + return localPath; +} + +async function ensureProjectSyncActors(c: any, localPath: string): Promise { + if (c.state.syncActorsStarted) { + return; + } + + const prSync = await getOrCreateProjectPrSync(c, c.state.workspaceId, c.state.repoId, localPath, 30_000); + await prSync.start(); + + const branchSync = await getOrCreateProjectBranchSync(c, c.state.workspaceId, c.state.repoId, localPath, 5_000); + await branchSync.start(); + + c.state.syncActorsStarted = true; +} + +async function ensureRepoActionJobsTable(c: any): Promise { + await c.db.execute(` + CREATE TABLE IF NOT EXISTS repo_action_jobs ( + job_id text PRIMARY KEY NOT NULL, + action text NOT NULL, + branch_name text, + parent_branch text, + status text NOT NULL, + message text NOT NULL, + created_at integer NOT NULL, + updated_at integer NOT NULL, + completed_at integer + ) + `); +} + +async function writeRepoActionJob( + c: any, + input: { + jobId: string; + action: RepoStackAction; + branchName: string | null; + parentBranch: string | null; + status: "queued" | "running" | "completed" | "error"; + message: string; + createdAt?: number; + completedAt?: number | null; + }, +): Promise { + await ensureRepoActionJobsTable(c); + const now = Date.now(); + await c.db + .insert(repoActionJobs) + .values({ + jobId: input.jobId, + action: input.action, + branchName: input.branchName, + parentBranch: input.parentBranch, + status: input.status, + message: input.message, + createdAt: input.createdAt ?? now, + updatedAt: now, + completedAt: input.completedAt ?? null, + }) + .onConflictDoUpdate({ + target: repoActionJobs.jobId, + set: { + status: input.status, + message: input.message, + updatedAt: now, + completedAt: input.completedAt ?? 
null, + }, + }) + .run(); +} + +async function listRepoActionJobRows(c: any): Promise< + Array<{ + jobId: string; + action: RepoStackAction; + branchName: string | null; + parentBranch: string | null; + status: "queued" | "running" | "completed" | "error"; + message: string; + createdAt: number; + updatedAt: number; + completedAt: number | null; + }> +> { + await ensureRepoActionJobsTable(c); + const rows = await c.db.select().from(repoActionJobs).orderBy(desc(repoActionJobs.updatedAt)).limit(20).all(); + return rows.map((row: any) => ({ + jobId: row.jobId, + action: row.action, + branchName: row.branchName ?? null, + parentBranch: row.parentBranch ?? null, + status: row.status, + message: row.message, + createdAt: row.createdAt, + updatedAt: row.updatedAt, + completedAt: row.completedAt ?? null, + })); +} + +async function deleteStaleTaskIndexRow(c: any, taskId: string): Promise { + try { + await c.db.delete(taskIndex).where(eq(taskIndex.taskId, taskId)).run(); + } catch { + // Best-effort cleanup only; preserve the original caller flow. + } +} + +function isStaleTaskReferenceError(error: unknown): boolean { + const message = resolveErrorMessage(error); + return isActorNotFoundError(error) || message.startsWith("Task not found:"); +} + +async function ensureTaskIndexHydrated(c: any): Promise { + if (c.state.taskIndexHydrated) { + return; + } + + const existing = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).limit(1).get(); + + if (existing) { + c.state.taskIndexHydrated = true; + return; + } + + // Migration path for old project actors that only tracked tasks in history. 
+ try { + const history = await getOrCreateHistory(c, c.state.workspaceId, c.state.repoId); + const rows = await history.list({ limit: 5_000 }); + const seen = new Set(); + let skippedMissingTaskActors = 0; + + for (const row of rows) { + if (!row.taskId || seen.has(row.taskId)) { + continue; + } + seen.add(row.taskId); + + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId); + await h.get(); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + skippedMissingTaskActors += 1; + continue; + } + throw error; + } + + await c.db + .insert(taskIndex) + .values({ + taskId: row.taskId, + branchName: row.branchName, + createdAt: row.createdAt, + updatedAt: row.createdAt, + }) + .onConflictDoNothing() + .run(); + } + + if (skippedMissingTaskActors > 0) { + logActorWarning("project", "skipped missing tasks while hydrating index", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + skippedMissingTaskActors, + }); + } + } catch (error) { + logActorWarning("project", "task index hydration from history failed", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + error: resolveErrorMessage(error), + }); + } + + c.state.taskIndexHydrated = true; +} + +async function ensureProjectReady(c: any): Promise { + if (!c.state.remoteUrl) { + throw new Error("project remoteUrl is not initialized"); + } + if (!c.state.localPath) { + await ensureLocalClone(c, c.state.remoteUrl); + } + if (!c.state.localPath) { + throw new Error("project local repo is not initialized"); + } + await ensureProjectSyncActors(c, c.state.localPath); + return c.state.localPath; +} + +async function ensureProjectReadyForRead(c: any): Promise { + if (!c.state.remoteUrl) { + throw new Error("project remoteUrl is not initialized"); + } + + if (!c.state.localPath || !c.state.syncActorsStarted) { + const result = await projectActions.ensure(c, { remoteUrl: c.state.remoteUrl }); + const localPath = result?.localPath ?? 
c.state.localPath; + if (!localPath) { + throw new Error("project local repo is not initialized"); + } + return localPath; + } + + return c.state.localPath; +} + +async function ensureTaskIndexHydratedForRead(c: any): Promise { + if (c.state.taskIndexHydrated) { + return; + } + await projectActions.hydrateTaskIndex(c, {}); +} + +async function forceProjectSync(c: any, localPath: string): Promise { + const prSync = await getOrCreateProjectPrSync(c, c.state.workspaceId, c.state.repoId, localPath, 30_000); + await prSync.force(); + + const branchSync = await getOrCreateProjectBranchSync(c, c.state.workspaceId, c.state.repoId, localPath, 5_000); + await branchSync.force(); +} + +async function enrichTaskRecord(c: any, record: TaskRecord): Promise { + const branchName = record.branchName; + const br = + branchName != null + ? await c.db + .select({ + diffStat: branches.diffStat, + hasUnpushed: branches.hasUnpushed, + conflictsWithMain: branches.conflictsWithMain, + parentBranch: branches.parentBranch, + }) + .from(branches) + .where(eq(branches.branchName, branchName)) + .get() + : null; + + const pr = + branchName != null + ? await c.db + .select({ + prUrl: prCache.prUrl, + prAuthor: prCache.prAuthor, + ciStatus: prCache.ciStatus, + reviewStatus: prCache.reviewStatus, + reviewer: prCache.reviewer, + }) + .from(prCache) + .where(eq(prCache.branchName, branchName)) + .get() + : null; + + return { + ...record, + diffStat: br?.diffStat ?? null, + hasUnpushed: br?.hasUnpushed != null ? String(br.hasUnpushed) : null, + conflictsWithMain: br?.conflictsWithMain != null ? String(br.conflictsWithMain) : null, + parentBranch: br?.parentBranch ?? null, + prUrl: pr?.prUrl ?? null, + prAuthor: pr?.prAuthor ?? null, + ciStatus: pr?.ciStatus ?? null, + reviewStatus: pr?.reviewStatus ?? null, + reviewer: pr?.reviewer ?? 
null, + }; +} + +async function reinsertTaskIndexRow(c: any, taskId: string, branchName: string | null, updatedAt: number): Promise { + const now = Date.now(); + await c.db + .insert(taskIndex) + .values({ + taskId, + branchName, + createdAt: updatedAt || now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskIndex.taskId, + set: { + branchName, + updatedAt: now, + }, + }) + .run(); +} + +async function ensureProjectMutation(c: any, cmd: EnsureProjectCommand): Promise { + c.state.remoteUrl = cmd.remoteUrl; + const localPath = await ensureLocalClone(c, cmd.remoteUrl); + + await c.db + .insert(repoMeta) + .values({ + id: 1, + remoteUrl: cmd.remoteUrl, + updatedAt: Date.now(), + }) + .onConflictDoUpdate({ + target: repoMeta.id, + set: { + remoteUrl: cmd.remoteUrl, + updatedAt: Date.now(), + }, + }) + .run(); + + await ensureProjectSyncActors(c, localPath); + return { localPath }; +} + +async function hydrateTaskIndexMutation(c: any, _cmd?: HydrateTaskIndexCommand): Promise { + await ensureTaskIndexHydrated(c); +} + +async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise { + const localPath = await ensureProjectReady(c); + const onBranch = cmd.onBranch?.trim() || null; + const initialBranchName = onBranch; + const initialTitle = onBranch ? deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? 
undefined) : null; + const taskId = randomUUID(); + + if (onBranch) { + const branchRow = await c.db.select({ branchName: branches.branchName }).from(branches).where(eq(branches.branchName, onBranch)).get(); + if (!branchRow) { + throw new Error(`Branch not found in repo snapshot: ${onBranch}`); + } + + await registerTaskBranchMutation(c, { + taskId, + branchName: onBranch, + requireExistingRemote: true, + }); + } + + let task: Awaited>; + try { + task = await getOrCreateTask(c, c.state.workspaceId, c.state.repoId, taskId, { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId, + repoRemote: c.state.remoteUrl, + repoLocalPath: localPath, + branchName: initialBranchName, + title: initialTitle, + task: cmd.task, + providerId: cmd.providerId, + agentType: cmd.agentType, + explicitTitle: onBranch ? null : cmd.explicitTitle, + explicitBranchName: onBranch ? null : cmd.explicitBranchName, + initialPrompt: cmd.initialPrompt, + }); + } catch (error) { + if (onBranch) { + await c.db + .delete(taskIndex) + .where(eq(taskIndex.taskId, taskId)) + .run() + .catch(() => {}); + } + throw error; + } + + if (!onBranch) { + const now = Date.now(); + await c.db + .insert(taskIndex) + .values({ + taskId, + branchName: initialBranchName, + createdAt: now, + updatedAt: now, + }) + .onConflictDoNothing() + .run(); + } + + const created = await task.initialize({ providerId: cmd.providerId }); + + const history = await getOrCreateHistory(c, c.state.workspaceId, c.state.repoId); + await history.append({ + kind: "task.created", + taskId, + payload: { + repoId: c.state.repoId, + providerId: cmd.providerId, + }, + }); + + return created; +} + +async function registerTaskBranchMutation(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> { + const localPath = await ensureProjectReady(c); + + const branchName = cmd.branchName.trim(); + const requireExistingRemote = cmd.requireExistingRemote === true; + if (!branchName) { + throw new 
Error("branchName is required"); + } + + await ensureTaskIndexHydrated(c); + + const existingOwner = await c.db + .select({ taskId: taskIndex.taskId }) + .from(taskIndex) + .where(and(eq(taskIndex.branchName, branchName), ne(taskIndex.taskId, cmd.taskId))) + .get(); + + if (existingOwner) { + let ownerMissing = false; + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, existingOwner.taskId); + await h.get(); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + ownerMissing = true; + await deleteStaleTaskIndexRow(c, existingOwner.taskId); + logActorWarning("project", "pruned stale task index row during branch registration", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: existingOwner.taskId, + branchName, + }); + } else { + throw error; + } + } + if (!ownerMissing) { + throw new Error(`branch is already assigned to a different task: ${branchName}`); + } + } + + const { driver } = getActorRuntimeContext(); + + let headSha = ""; + let trackedInStack = false; + let parentBranch: string | null = null; + const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId); + + await withRepoGitLock(localPath, async () => { + await driver.git.fetch(localPath, { githubToken: auth?.githubToken ?? null }); + const baseRef = await driver.git.remoteDefaultBaseRef(localPath); + const normalizedBase = normalizeBaseBranchName(baseRef); + + if (requireExistingRemote) { + try { + headSha = await driver.git.revParse(localPath, `origin/${branchName}`); + } catch { + throw new Error(`Remote branch not found: ${branchName}`); + } + } else { + await driver.git.ensureRemoteBranch(localPath, branchName, { githubToken: auth?.githubToken ?? null }); + await driver.git.fetch(localPath, { githubToken: auth?.githubToken ?? 
null }); + try { + headSha = await driver.git.revParse(localPath, `origin/${branchName}`); + } catch { + headSha = await driver.git.revParse(localPath, baseRef); + } + } + + if (await driver.stack.available(localPath).catch(() => false)) { + let stackRows = await driver.stack.listStack(localPath).catch(() => []); + let stackRow = stackRows.find((entry) => entry.branchName === branchName); + + if (!stackRow) { + try { + await driver.stack.trackBranch(localPath, branchName, normalizedBase); + } catch (error) { + logActorWarning("project", "stack track failed while registering branch", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + branchName, + error: resolveErrorMessage(error), + }); + } + stackRows = await driver.stack.listStack(localPath).catch(() => []); + stackRow = stackRows.find((entry) => entry.branchName === branchName); + } + + trackedInStack = Boolean(stackRow); + parentBranch = stackRow?.parentBranch ?? null; + } + }); + + const now = Date.now(); + await c.db + .insert(branches) + .values({ + branchName, + commitSha: headSha, + parentBranch, + trackedInStack: trackedInStack ? 1 : 0, + firstSeenAt: now, + lastSeenAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: branches.branchName, + set: { + commitSha: headSha, + parentBranch, + trackedInStack: trackedInStack ? 1 : 0, + lastSeenAt: now, + updatedAt: now, + }, + }) + .run(); + + await c.db + .insert(taskIndex) + .values({ + taskId: cmd.taskId, + branchName, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskIndex.taskId, + set: { + branchName, + updatedAt: now, + }, + }) + .run(); + + return { branchName, headSha }; +} + +async function runRepoStackActionMutation(c: any, cmd: RunRepoStackActionCommand): Promise { + const localPath = await ensureProjectReady(c); + await ensureTaskIndexHydrated(c); + + const { driver } = getActorRuntimeContext(); + const at = Date.now(); + const jobId = cmd.jobId ?? 
randomUUID(); + const action = cmd.action; + const branchName = cmd.branchName?.trim() || null; + const parentBranch = cmd.parentBranch?.trim() || null; + + await writeRepoActionJob(c, { + jobId, + action, + branchName, + parentBranch, + status: "running", + message: `Running ${action}`, + createdAt: at, + }); + + if (!(await driver.stack.available(localPath).catch(() => false))) { + await writeRepoActionJob(c, { + jobId, + action, + branchName, + parentBranch, + status: "error", + message: "git-spice is not available for this repo", + createdAt: at, + completedAt: Date.now(), + }); + return { + jobId, + action, + executed: false, + status: "error", + message: "git-spice is not available for this repo", + at, + }; + } + + if ((action === "restack_subtree" || action === "rebase_branch" || action === "reparent_branch") && !branchName) { + throw new Error(`branchName is required for action: ${action}`); + } + if (action === "reparent_branch" && !parentBranch) { + throw new Error("parentBranch is required for action: reparent_branch"); + } + + await forceProjectSync(c, localPath); + + if (branchName) { + const row = await c.db.select({ branchName: branches.branchName }).from(branches).where(eq(branches.branchName, branchName)).get(); + if (!row) { + throw new Error(`Branch not found in repo snapshot: ${branchName}`); + } + } + + if (action === "reparent_branch") { + if (!parentBranch) { + throw new Error("parentBranch is required for action: reparent_branch"); + } + if (parentBranch === branchName) { + throw new Error("parentBranch must be different from branchName"); + } + const parentRow = await c.db.select({ branchName: branches.branchName }).from(branches).where(eq(branches.branchName, parentBranch)).get(); + if (!parentRow) { + throw new Error(`Parent branch not found in repo snapshot: ${parentBranch}`); + } + } + + try { + await withRepoGitLock(localPath, async () => { + if (action === "sync_repo") { + await driver.stack.syncRepo(localPath); + } else if (action 
=== "restack_repo") { + await driver.stack.restackRepo(localPath); + } else if (action === "restack_subtree") { + await driver.stack.restackSubtree(localPath, branchName!); + } else if (action === "rebase_branch") { + await driver.stack.rebaseBranch(localPath, branchName!); + } else if (action === "reparent_branch") { + await driver.stack.reparentBranch(localPath, branchName!, parentBranch!); + } else { + throw new Error(`Unsupported repo stack action: ${action}`); + } + }); + + try { + const history = await getOrCreateHistory(c, c.state.workspaceId, c.state.repoId); + await history.append({ + kind: "repo.stack_action", + branchName: branchName ?? null, + payload: { + action, + branchName: branchName ?? null, + parentBranch: parentBranch ?? null, + jobId, + }, + }); + } catch (error) { + logActorWarning("project", "failed appending repo stack history event", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + action, + error: resolveErrorMessage(error), + }); + } + + await forceProjectSync(c, localPath); + + await writeRepoActionJob(c, { + jobId, + action, + branchName, + parentBranch, + status: "completed", + message: `Completed ${action}`, + createdAt: at, + completedAt: Date.now(), + }); + } catch (error) { + const message = resolveErrorMessage(error); + await writeRepoActionJob(c, { + jobId, + action, + branchName, + parentBranch, + status: "error", + message, + createdAt: at, + completedAt: Date.now(), + }); + throw error; + } + + return { + jobId, + action, + executed: true, + status: "completed", + message: `Completed ${action}`, + at, + }; +} + +async function applyPrSyncResultMutation(c: any, body: PrSyncResult): Promise { + await c.db.delete(prCache).run(); + + for (const item of body.items) { + await c.db + .insert(prCache) + .values({ + branchName: item.headRefName, + prNumber: item.number, + state: item.state, + title: item.title, + prUrl: item.url ?? null, + prAuthor: item.author ?? null, + isDraft: item.isDraft ? 
1 : 0, + ciStatus: item.ciStatus ?? null, + reviewStatus: item.reviewStatus ?? null, + reviewer: item.reviewer ?? null, + fetchedAt: body.at, + updatedAt: body.at, + }) + .onConflictDoUpdate({ + target: prCache.branchName, + set: { + prNumber: item.number, + state: item.state, + title: item.title, + prUrl: item.url ?? null, + prAuthor: item.author ?? null, + isDraft: item.isDraft ? 1 : 0, + ciStatus: item.ciStatus ?? null, + reviewStatus: item.reviewStatus ?? null, + reviewer: item.reviewer ?? null, + fetchedAt: body.at, + updatedAt: body.at, + }, + }) + .run(); + } + + for (const item of body.items) { + if (item.state !== "MERGED" && item.state !== "CLOSED") { + continue; + } + + const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.branchName, item.headRefName)).get(); + if (!row) { + continue; + } + + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId); + await h.archive({ reason: `PR ${item.state.toLowerCase()}` }); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + await deleteStaleTaskIndexRow(c, row.taskId); + logActorWarning("project", "pruned stale task index row during PR close archive", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + branchName: item.headRefName, + prState: item.state, + }); + continue; + } + logActorWarning("project", "failed to auto-archive task after PR close", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + branchName: item.headRefName, + prState: item.state, + error: resolveErrorMessage(error), + }); + } + } +} + +async function applyBranchSyncResultMutation(c: any, body: BranchSyncResult): Promise { + const incoming = new Set(body.items.map((item) => item.branchName)); + + for (const item of body.items) { + const existing = await c.db + .select({ + firstSeenAt: branches.firstSeenAt, + }) + .from(branches) + .where(eq(branches.branchName, item.branchName)) + .get(); + + await 
c.db + .insert(branches) + .values({ + branchName: item.branchName, + commitSha: item.commitSha, + parentBranch: item.parentBranch ?? null, + trackedInStack: item.trackedInStack ? 1 : 0, + diffStat: item.diffStat ?? null, + hasUnpushed: item.hasUnpushed ? 1 : 0, + conflictsWithMain: item.conflictsWithMain ? 1 : 0, + firstSeenAt: existing?.firstSeenAt ?? body.at, + lastSeenAt: body.at, + updatedAt: body.at, + }) + .onConflictDoUpdate({ + target: branches.branchName, + set: { + commitSha: item.commitSha, + parentBranch: item.parentBranch ?? null, + trackedInStack: item.trackedInStack ? 1 : 0, + diffStat: item.diffStat ?? null, + hasUnpushed: item.hasUnpushed ? 1 : 0, + conflictsWithMain: item.conflictsWithMain ? 1 : 0, + firstSeenAt: existing?.firstSeenAt ?? body.at, + lastSeenAt: body.at, + updatedAt: body.at, + }, + }) + .run(); + } + + const existingRows = await c.db.select({ branchName: branches.branchName }).from(branches).all(); + + for (const row of existingRows) { + if (incoming.has(row.branchName)) { + continue; + } + await c.db.delete(branches).where(eq(branches.branchName, row.branchName)).run(); + } +} + +export async function runProjectWorkflow(ctx: any): Promise { + await ctx.loop("project-command-loop", async (loopCtx: any) => { + const msg = await loopCtx.queue.next("next-project-command", { + names: [...PROJECT_QUEUE_NAMES], + completable: true, + }); + if (!msg) { + return Loop.continue(undefined); + } + + if (msg.name === "project.command.ensure") { + const result = await loopCtx.step({ + name: "project-ensure", + timeout: 5 * 60_000, + run: async () => ensureProjectMutation(loopCtx, msg.body as EnsureProjectCommand), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "project.command.hydrateTaskIndex") { + await loopCtx.step("project-hydrate-task-index", async () => hydrateTaskIndexMutation(loopCtx, msg.body as HydrateTaskIndexCommand)); + await msg.complete({ ok: true }); + return 
Loop.continue(undefined); + } + + if (msg.name === "project.command.createTask") { + const result = await loopCtx.step({ + name: "project-create-task", + timeout: 12 * 60_000, + run: async () => createTaskMutation(loopCtx, msg.body as CreateTaskCommand), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "project.command.registerTaskBranch") { + const result = await loopCtx.step({ + name: "project-register-task-branch", + timeout: 5 * 60_000, + run: async () => registerTaskBranchMutation(loopCtx, msg.body as RegisterTaskBranchCommand), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "project.command.runRepoStackAction") { + const result = await loopCtx.step({ + name: "project-run-repo-stack-action", + timeout: 12 * 60_000, + run: async () => runRepoStackActionMutation(loopCtx, msg.body as RunRepoStackActionCommand), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "project.command.applyPrSyncResult") { + await loopCtx.step({ + name: "project-apply-pr-sync-result", + timeout: 60_000, + run: async () => applyPrSyncResultMutation(loopCtx, msg.body as PrSyncResult), + }); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "project.command.applyBranchSyncResult") { + await loopCtx.step({ + name: "project-apply-branch-sync-result", + timeout: 60_000, + run: async () => applyBranchSyncResultMutation(loopCtx, msg.body as BranchSyncResult), + }); + await msg.complete({ ok: true }); + } + + return Loop.continue(undefined); + }); +} + +export const projectActions = { + async ensure(c: any, cmd: EnsureProjectCommand): Promise { + const self = selfProject(c); + return expectQueueResponse( + await self.send(projectWorkflowQueueName("project.command.ensure"), cmd, { + wait: true, + timeout: 5 * 60_000, + }), + ); + }, + + async createTask(c: any, cmd: CreateTaskCommand): Promise { + const self = 
selfProject(c); + return expectQueueResponse( + await self.send(projectWorkflowQueueName("project.command.createTask"), cmd, { + wait: true, + timeout: 12 * 60_000, + }), + ); + }, + + async listReservedBranches(c: any, _cmd?: ListReservedBranchesCommand): Promise { + await ensureTaskIndexHydratedForRead(c); + + const rows = await c.db.select({ branchName: taskIndex.branchName }).from(taskIndex).where(isNotNull(taskIndex.branchName)).all(); + + return rows.map((row) => row.branchName).filter((name): name is string => typeof name === "string" && name.trim().length > 0); + }, + + async registerTaskBranch(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> { + const self = selfProject(c); + return expectQueueResponse<{ branchName: string; headSha: string }>( + await self.send(projectWorkflowQueueName("project.command.registerTaskBranch"), cmd, { + wait: true, + timeout: 5 * 60_000, + }), + ); + }, + + async hydrateTaskIndex(c: any, cmd?: HydrateTaskIndexCommand): Promise { + const self = selfProject(c); + await self.send(projectWorkflowQueueName("project.command.hydrateTaskIndex"), cmd ?? {}, { + wait: true, + timeout: 60_000, + }); + }, + + async listTaskSummaries(c: any, cmd?: ListTaskSummariesCommand): Promise { + const body = cmd ?? 
{}; + const records: TaskSummary[] = []; + + await ensureTaskIndexHydratedForRead(c); + + const taskRows = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).orderBy(desc(taskIndex.updatedAt)).all(); + + for (const row of taskRows) { + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId); + const record = await h.get(); + + if (!body.includeArchived && record.status === "archived") { + continue; + } + + records.push({ + workspaceId: record.workspaceId, + repoId: record.repoId, + taskId: record.taskId, + branchName: record.branchName, + title: record.title, + status: record.status, + updatedAt: record.updatedAt, + }); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + await deleteStaleTaskIndexRow(c, row.taskId); + logActorWarning("project", "pruned stale task index row during summary listing", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + }); + continue; + } + logActorWarning("project", "failed loading task summary row", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + error: resolveErrorMessage(error), + }); + } + } + + records.sort((a, b) => b.updatedAt - a.updatedAt); + return records; + }, + + async getTaskEnriched(c: any, cmd: GetTaskEnrichedCommand): Promise { + await ensureTaskIndexHydratedForRead(c); + + const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.taskId, cmd.taskId)).get(); + if (!row) { + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, cmd.taskId); + const record = await h.get(); + await reinsertTaskIndexRow(c, cmd.taskId, record.branchName ?? null, record.updatedAt ?? 
Date.now()); + return await enrichTaskRecord(c, record); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + throw new Error(`Unknown task in repo ${c.state.repoId}: ${cmd.taskId}`); + } + throw error; + } + } + + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, cmd.taskId); + const record = await h.get(); + return await enrichTaskRecord(c, record); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + await deleteStaleTaskIndexRow(c, cmd.taskId); + throw new Error(`Unknown task in repo ${c.state.repoId}: ${cmd.taskId}`); + } + throw error; + } + }, + + async getRepoOverview(c: any, _cmd?: RepoOverviewCommand): Promise { + const localPath = await ensureProjectReadyForRead(c); + await ensureTaskIndexHydratedForRead(c); + + const { driver } = getActorRuntimeContext(); + const now = Date.now(); + const baseRef = await driver.git.remoteDefaultBaseRef(localPath).catch(() => null); + const stackAvailable = await driver.stack.available(localPath).catch(() => false); + + const branchRowsRaw = await c.db + .select({ + branchName: branches.branchName, + commitSha: branches.commitSha, + parentBranch: branches.parentBranch, + trackedInStack: branches.trackedInStack, + diffStat: branches.diffStat, + hasUnpushed: branches.hasUnpushed, + conflictsWithMain: branches.conflictsWithMain, + firstSeenAt: branches.firstSeenAt, + lastSeenAt: branches.lastSeenAt, + updatedAt: branches.updatedAt, + }) + .from(branches) + .all(); + + const taskRows = await c.db + .select({ + taskId: taskIndex.taskId, + branchName: taskIndex.branchName, + updatedAt: taskIndex.updatedAt, + }) + .from(taskIndex) + .all(); + + const taskMetaByBranch = new Map(); + + for (const row of taskRows) { + if (!row.branchName) { + continue; + } + try { + const h = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId); + const record = await h.get(); + taskMetaByBranch.set(row.branchName, { + taskId: row.taskId, + title: record.title ?? 
null, + status: record.status, + updatedAt: record.updatedAt, + }); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + await deleteStaleTaskIndexRow(c, row.taskId); + logActorWarning("project", "pruned stale task index row during repo overview", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + branchName: row.branchName, + }); + continue; + } + logActorWarning("project", "failed loading task while building repo overview", { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: row.taskId, + branchName: row.branchName, + error: resolveErrorMessage(error), + }); + } + } + + const prRows = await c.db + .select({ + branchName: prCache.branchName, + prNumber: prCache.prNumber, + prState: prCache.state, + prUrl: prCache.prUrl, + ciStatus: prCache.ciStatus, + reviewStatus: prCache.reviewStatus, + reviewer: prCache.reviewer, + }) + .from(prCache) + .all(); + const prByBranch = new Map(prRows.map((row) => [row.branchName, row])); + + const combinedRows = sortBranchesForOverview( + branchRowsRaw.map((row) => ({ + branchName: row.branchName, + parentBranch: row.parentBranch ?? null, + updatedAt: row.updatedAt, + })), + ); + + const detailByBranch = new Map(branchRowsRaw.map((row) => [row.branchName, row])); + + const branchRows = combinedRows.map((ordering) => { + const row = detailByBranch.get(ordering.branchName)!; + const taskMeta = taskMetaByBranch.get(row.branchName); + const pr = prByBranch.get(row.branchName); + return { + branchName: row.branchName, + commitSha: row.commitSha, + parentBranch: row.parentBranch ?? null, + trackedInStack: Boolean(row.trackedInStack), + diffStat: row.diffStat ?? null, + hasUnpushed: Boolean(row.hasUnpushed), + conflictsWithMain: Boolean(row.conflictsWithMain), + taskId: taskMeta?.taskId ?? null, + taskTitle: taskMeta?.title ?? null, + taskStatus: taskMeta?.status ?? null, + prNumber: pr?.prNumber ?? null, + prState: pr?.prState ?? null, + prUrl: pr?.prUrl ?? 
null, + ciStatus: pr?.ciStatus ?? null, + reviewStatus: pr?.reviewStatus ?? null, + reviewer: pr?.reviewer ?? null, + firstSeenAt: row.firstSeenAt ?? null, + lastSeenAt: row.lastSeenAt ?? null, + updatedAt: Math.max(row.updatedAt, taskMeta?.updatedAt ?? 0), + }; + }); + + const latestBranchSync = await c.db.select({ updatedAt: branches.updatedAt }).from(branches).orderBy(desc(branches.updatedAt)).limit(1).get(); + const latestPrSync = await c.db.select({ updatedAt: prCache.updatedAt }).from(prCache).orderBy(desc(prCache.updatedAt)).limit(1).get(); + + return { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + remoteUrl: c.state.remoteUrl, + baseRef, + stackAvailable, + fetchedAt: now, + branchSyncAt: latestBranchSync?.updatedAt ?? null, + prSyncAt: latestPrSync?.updatedAt ?? null, + branchSyncStatus: latestBranchSync ? "synced" : "pending", + prSyncStatus: latestPrSync ? "synced" : "pending", + repoActionJobs: await listRepoActionJobRows(c), + branches: branchRows, + }; + }, + + async getPullRequestForBranch(c: any, cmd: GetPullRequestForBranchCommand): Promise<{ number: number; status: "draft" | "ready" } | null> { + const branchName = cmd.branchName?.trim(); + if (!branchName) { + return null; + } + + const pr = await c.db + .select({ + prNumber: prCache.prNumber, + prState: prCache.state, + }) + .from(prCache) + .where(eq(prCache.branchName, branchName)) + .get(); + + if (!pr?.prNumber) { + return null; + } + + return { + number: pr.prNumber, + status: pr.prState === "draft" ? 
"draft" : "ready", + }; + }, + + async runRepoStackAction(c: any, cmd: RunRepoStackActionCommand): Promise { + const self = selfProject(c); + const jobId = randomUUID(); + const at = Date.now(); + const action = cmd.action; + const branchName = cmd.branchName?.trim() || null; + const parentBranch = cmd.parentBranch?.trim() || null; + + await writeRepoActionJob(c, { + jobId, + action, + branchName, + parentBranch, + status: "queued", + message: `Queued ${action}`, + createdAt: at, + }); + + await self.send( + projectWorkflowQueueName("project.command.runRepoStackAction"), + { + ...cmd, + jobId, + }, + { + wait: false, + }, + ); + + return { + jobId, + action, + executed: true, + status: "queued", + message: `Queued ${action}`, + at, + }; + }, + + async applyPrSyncResult(c: any, body: PrSyncResult): Promise { + const self = selfProject(c); + await self.send(projectWorkflowQueueName("project.command.applyPrSyncResult"), body, { + wait: true, + timeout: 5 * 60_000, + }); + }, + + async applyBranchSyncResult(c: any, body: BranchSyncResult): Promise { + const self = selfProject(c); + await self.send(projectWorkflowQueueName("project.command.applyBranchSyncResult"), body, { + wait: true, + timeout: 5 * 60_000, + }); + }, +}; diff --git a/foundry/packages/backend/src/actors/github-data/db/db.ts b/foundry/packages/backend/src/actors/project/db/db.ts similarity index 68% rename from foundry/packages/backend/src/actors/github-data/db/db.ts rename to foundry/packages/backend/src/actors/project/db/db.ts index 00e5a11..49b5b72 100644 --- a/foundry/packages/backend/src/actors/github-data/db/db.ts +++ b/foundry/packages/backend/src/actors/project/db/db.ts @@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle"; import * as schema from "./schema.js"; import migrations from "./migrations.js"; -export const githubDataDb = db({ schema, migrations }); +export const projectDb = db({ schema, migrations }); diff --git a/foundry/packages/backend/src/actors/project/db/drizzle.config.ts 
b/foundry/packages/backend/src/actors/project/db/drizzle.config.ts new file mode 100644 index 0000000..5f53fc9 --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/drizzle.config.ts @@ -0,0 +1,6 @@ +import { defineConfig } from "rivetkit/db/drizzle"; + +export default defineConfig({ + out: "./src/actors/project/db/drizzle", + schema: "./src/actors/project/db/schema.ts", +}); diff --git a/foundry/packages/backend/src/actors/project/db/drizzle/0000_useful_la_nuit.sql b/foundry/packages/backend/src/actors/project/db/drizzle/0000_useful_la_nuit.sql new file mode 100644 index 0000000..f4f23ff --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/drizzle/0000_useful_la_nuit.sql @@ -0,0 +1,40 @@ +CREATE TABLE `branches` ( + `branch_name` text PRIMARY KEY NOT NULL, + `commit_sha` text NOT NULL, + `parent_branch` text, + `tracked_in_stack` integer DEFAULT 0 NOT NULL, + `diff_stat` text, + `has_unpushed` integer DEFAULT 0 NOT NULL, + `conflicts_with_main` integer DEFAULT 0 NOT NULL, + `first_seen_at` integer, + `last_seen_at` integer, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE `pr_cache` ( + `branch_name` text PRIMARY KEY NOT NULL, + `pr_number` integer NOT NULL, + `state` text NOT NULL, + `title` text NOT NULL, + `pr_url` text, + `pr_author` text, + `is_draft` integer DEFAULT 0 NOT NULL, + `ci_status` text, + `review_status` text, + `reviewer` text, + `fetched_at` integer, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE `repo_meta` ( + `id` integer PRIMARY KEY NOT NULL, + `remote_url` text NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE `task_index` ( + `task_id` text PRIMARY KEY NOT NULL, + `branch_name` text, + `created_at` integer NOT NULL, + `updated_at` integer NOT NULL +); diff --git a/foundry/packages/backend/src/actors/project/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/project/db/drizzle/meta/0000_snapshot.json 
new file mode 100644 index 0000000..baf5913 --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/drizzle/meta/0000_snapshot.json @@ -0,0 +1,265 @@ +{ + "version": "6", + "dialect": "sqlite", + "id": "6ffd6acb-e737-46ee-a8fe-fcfddcdd6ea9", + "prevId": "00000000-0000-0000-0000-000000000000", + "tables": { + "branches": { + "name": "branches", + "columns": { + "branch_name": { + "name": "branch_name", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "commit_sha": { + "name": "commit_sha", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "parent_branch": { + "name": "parent_branch", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "tracked_in_stack": { + "name": "tracked_in_stack", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false, + "default": 0 + }, + "diff_stat": { + "name": "diff_stat", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "has_unpushed": { + "name": "has_unpushed", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false, + "default": 0 + }, + "conflicts_with_main": { + "name": "conflicts_with_main", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false, + "default": 0 + }, + "first_seen_at": { + "name": "first_seen_at", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "last_seen_at": { + "name": "last_seen_at", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "pr_cache": { + "name": "pr_cache", + "columns": { + 
"branch_name": { + "name": "branch_name", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "pr_number": { + "name": "pr_number", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "state": { + "name": "state", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "title": { + "name": "title", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "pr_url": { + "name": "pr_url", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "pr_author": { + "name": "pr_author", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "is_draft": { + "name": "is_draft", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false, + "default": 0 + }, + "ci_status": { + "name": "ci_status", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "review_status": { + "name": "review_status", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "reviewer": { + "name": "reviewer", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "fetched_at": { + "name": "fetched_at", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "repo_meta": { + "name": "repo_meta", + "columns": { + "id": { + "name": "id", + "type": "integer", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "remote_url": { + "name": "remote_url", + "type": "text", + "primaryKey": false, + "notNull": true, + 
"autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "task_index": { + "name": "task_index", + "columns": { + "task_id": { + "name": "task_id", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "branch_name": { + "name": "branch_name", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "created_at": { + "name": "created_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + } + }, + "views": {}, + "enums": {}, + "_meta": { + "schemas": {}, + "tables": {}, + "columns": {} + }, + "internal": { + "indexes": {} + } +} diff --git a/foundry/packages/backend/src/actors/project/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/project/db/drizzle/meta/_journal.json new file mode 100644 index 0000000..deebd86 --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/drizzle/meta/_journal.json @@ -0,0 +1,13 @@ +{ + "version": "7", + "dialect": "sqlite", + "entries": [ + { + "idx": 0, + "version": "6", + "when": 1773376221848, + "tag": "0000_useful_la_nuit", + "breakpoints": true + } + ] +} diff --git a/foundry/packages/backend/src/actors/project/db/migrations.ts b/foundry/packages/backend/src/actors/project/db/migrations.ts new file mode 100644 index 0000000..aa49fba --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/migrations.ts @@ -0,0 +1,61 @@ +// This file is generated by 
src/actors/_scripts/generate-actor-migrations.ts. +// Source of truth is drizzle-kit output under ./drizzle (meta/_journal.json + *.sql). +// Do not hand-edit this file. + +const journal = { + entries: [ + { + idx: 0, + when: 1773376221848, + tag: "0000_useful_la_nuit", + breakpoints: true, + }, + ], +} as const; + +export default { + journal, + migrations: { + m0000: `CREATE TABLE \`branches\` ( + \`branch_name\` text PRIMARY KEY NOT NULL, + \`commit_sha\` text NOT NULL, + \`parent_branch\` text, + \`tracked_in_stack\` integer DEFAULT 0 NOT NULL, + \`diff_stat\` text, + \`has_unpushed\` integer DEFAULT 0 NOT NULL, + \`conflicts_with_main\` integer DEFAULT 0 NOT NULL, + \`first_seen_at\` integer, + \`last_seen_at\` integer, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE \`pr_cache\` ( + \`branch_name\` text PRIMARY KEY NOT NULL, + \`pr_number\` integer NOT NULL, + \`state\` text NOT NULL, + \`title\` text NOT NULL, + \`pr_url\` text, + \`pr_author\` text, + \`is_draft\` integer DEFAULT 0 NOT NULL, + \`ci_status\` text, + \`review_status\` text, + \`reviewer\` text, + \`fetched_at\` integer, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE \`repo_meta\` ( + \`id\` integer PRIMARY KEY NOT NULL, + \`remote_url\` text NOT NULL, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE \`task_index\` ( + \`task_id\` text PRIMARY KEY NOT NULL, + \`branch_name\` text, + \`created_at\` integer NOT NULL, + \`updated_at\` integer NOT NULL +); +`, + } as const, +}; diff --git a/foundry/packages/backend/src/actors/project/db/schema.ts b/foundry/packages/backend/src/actors/project/db/schema.ts new file mode 100644 index 0000000..1ef4cee --- /dev/null +++ b/foundry/packages/backend/src/actors/project/db/schema.ts @@ -0,0 +1,56 @@ +import { integer, sqliteTable, text } from "rivetkit/db/drizzle"; + +// SQLite is per project actor instance (workspaceId+repoId), so no workspaceId/repoId columns needed. 
+ +export const branches = sqliteTable("branches", { + branchName: text("branch_name").notNull().primaryKey(), + commitSha: text("commit_sha").notNull(), + parentBranch: text("parent_branch"), + trackedInStack: integer("tracked_in_stack").notNull().default(0), + diffStat: text("diff_stat"), + hasUnpushed: integer("has_unpushed").notNull().default(0), + conflictsWithMain: integer("conflicts_with_main").notNull().default(0), + firstSeenAt: integer("first_seen_at"), + lastSeenAt: integer("last_seen_at"), + updatedAt: integer("updated_at").notNull(), +}); + +export const repoMeta = sqliteTable("repo_meta", { + id: integer("id").primaryKey(), + remoteUrl: text("remote_url").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const prCache = sqliteTable("pr_cache", { + branchName: text("branch_name").notNull().primaryKey(), + prNumber: integer("pr_number").notNull(), + state: text("state").notNull(), + title: text("title").notNull(), + prUrl: text("pr_url"), + prAuthor: text("pr_author"), + isDraft: integer("is_draft").notNull().default(0), + ciStatus: text("ci_status"), + reviewStatus: text("review_status"), + reviewer: text("reviewer"), + fetchedAt: integer("fetched_at"), + updatedAt: integer("updated_at").notNull(), +}); + +export const taskIndex = sqliteTable("task_index", { + taskId: text("task_id").notNull().primaryKey(), + branchName: text("branch_name"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const repoActionJobs = sqliteTable("repo_action_jobs", { + jobId: text("job_id").notNull().primaryKey(), + action: text("action").notNull(), + branchName: text("branch_name"), + parentBranch: text("parent_branch"), + status: text("status").notNull(), + message: text("message").notNull(), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + completedAt: integer("completed_at"), +}); diff --git a/foundry/packages/backend/src/actors/project/index.ts 
b/foundry/packages/backend/src/actors/project/index.ts new file mode 100644 index 0000000..c5ba8a7 --- /dev/null +++ b/foundry/packages/backend/src/actors/project/index.ts @@ -0,0 +1,30 @@ +import { actor, queue } from "rivetkit"; +import { workflow } from "rivetkit/workflow"; +import { projectDb } from "./db/db.js"; +import { PROJECT_QUEUE_NAMES, projectActions, runProjectWorkflow } from "./actions.js"; + +export interface ProjectInput { + workspaceId: string; + repoId: string; + remoteUrl: string; +} + +export const project = actor({ + db: projectDb, + queues: Object.fromEntries(PROJECT_QUEUE_NAMES.map((name) => [name, queue()])), + options: { + name: "Project", + icon: "folder", + actionTimeout: 5 * 60_000, + }, + createState: (_c, input: ProjectInput) => ({ + workspaceId: input.workspaceId, + repoId: input.repoId, + remoteUrl: input.remoteUrl, + localPath: null as string | null, + syncActorsStarted: false, + taskIndexHydrated: false, + }), + actions: projectActions, + run: workflow(runProjectWorkflow), +}); diff --git a/foundry/packages/backend/src/actors/project/stack-model.ts b/foundry/packages/backend/src/actors/project/stack-model.ts new file mode 100644 index 0000000..78c9888 --- /dev/null +++ b/foundry/packages/backend/src/actors/project/stack-model.ts @@ -0,0 +1,69 @@ +export interface StackEntry { + branchName: string; + parentBranch: string | null; +} + +export interface OrderedBranchRow { + branchName: string; + parentBranch: string | null; + updatedAt: number; +} + +export function normalizeParentBranch(branchName: string, parentBranch: string | null | undefined): string | null { + const parent = parentBranch?.trim() || null; + if (!parent || parent === branchName) { + return null; + } + return parent; +} + +export function parentLookupFromStack(entries: StackEntry[]): Map { + const lookup = new Map(); + for (const entry of entries) { + const branchName = entry.branchName.trim(); + if (!branchName) { + continue; + } + lookup.set(branchName, 
normalizeParentBranch(branchName, entry.parentBranch)); + } + return lookup; +} + +export function sortBranchesForOverview(rows: OrderedBranchRow[]): OrderedBranchRow[] { + const byName = new Map(rows.map((row) => [row.branchName, row])); + const depthMemo = new Map<string, number>(); + const computing = new Set<string>(); + + const depthFor = (branchName: string): number => { + const cached = depthMemo.get(branchName); + if (cached != null) { + return cached; + } + if (computing.has(branchName)) { + return 999; + } + + computing.add(branchName); + const row = byName.get(branchName); + const parent = row?.parentBranch; + let depth = 0; + if (parent && parent !== branchName && byName.has(parent)) { + depth = Math.min(998, depthFor(parent) + 1); + } + computing.delete(branchName); + depthMemo.set(branchName, depth); + return depth; + }; + + return [...rows].sort((a, b) => { + const da = depthFor(a.branchName); + const db = depthFor(b.branchName); + if (da !== db) { + return da - db; + } + if (a.updatedAt !== b.updatedAt) { + return b.updatedAt - a.updatedAt; + } + return a.branchName.localeCompare(b.branchName); + }); +} diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/db.ts b/foundry/packages/backend/src/actors/sandbox-instance/db/db.ts new file mode 100644 index 0000000..0251c43 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/db.ts @@ -0,0 +1,5 @@ +import { db } from "rivetkit/db/drizzle"; +import * as schema from "./schema.js"; +import migrations from "./migrations.js"; + +export const sandboxInstanceDb = db({ schema, migrations }); diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle.config.ts b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle.config.ts new file mode 100644 index 0000000..b09d4cb --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle.config.ts @@ -0,0 +1,6 @@ +import { defineConfig } from "rivetkit/db/drizzle"; + +export default defineConfig({ + out:
"./src/actors/sandbox-instance/db/drizzle", + schema: "./src/actors/sandbox-instance/db/schema.ts", +}); diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/0000_smooth_sauron.sql b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/0000_smooth_sauron.sql new file mode 100644 index 0000000..20b3180 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/0000_smooth_sauron.sql @@ -0,0 +1,27 @@ +CREATE TABLE `sandbox_instance` ( + `id` integer PRIMARY KEY NOT NULL, + `metadata_json` text NOT NULL, + `status` text NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE `sandbox_session_events` ( + `id` text PRIMARY KEY NOT NULL, + `session_id` text NOT NULL, + `event_index` integer NOT NULL, + `created_at` integer NOT NULL, + `connection_id` text NOT NULL, + `sender` text NOT NULL, + `payload_json` text NOT NULL +); +--> statement-breakpoint +CREATE UNIQUE INDEX `sandbox_session_events_session_id_event_index_unique` ON `sandbox_session_events` (`session_id`,`event_index`);--> statement-breakpoint +CREATE TABLE `sandbox_sessions` ( + `id` text PRIMARY KEY NOT NULL, + `agent` text NOT NULL, + `agent_session_id` text NOT NULL, + `last_connection_id` text NOT NULL, + `created_at` integer NOT NULL, + `destroyed_at` integer, + `session_init_json` text +); diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/0000_snapshot.json new file mode 100644 index 0000000..d3e09c6 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/0000_snapshot.json @@ -0,0 +1,180 @@ +{ + "version": "6", + "dialect": "sqlite", + "id": "130486c5-6208-4d00-b367-e02b9def953a", + "prevId": "00000000-0000-0000-0000-000000000000", + "tables": { + "sandbox_instance": { + "name": "sandbox_instance", + "columns": { + "id": { + "name": "id", + "type": "integer", + 
"primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "metadata_json": { + "name": "metadata_json", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "status": { + "name": "status", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "sandbox_session_events": { + "name": "sandbox_session_events", + "columns": { + "id": { + "name": "id", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "session_id": { + "name": "session_id", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "event_index": { + "name": "event_index", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "created_at": { + "name": "created_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "connection_id": { + "name": "connection_id", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "sender": { + "name": "sender", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "payload_json": { + "name": "payload_json", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": { + "sandbox_session_events_session_id_event_index_unique": { + "name": "sandbox_session_events_session_id_event_index_unique", + "columns": ["session_id", "event_index"], + "isUnique": true + } + }, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "sandbox_sessions": { + "name": "sandbox_sessions", + "columns": { + "id": { + 
"name": "id", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "agent": { + "name": "agent", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "agent_session_id": { + "name": "agent_session_id", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "last_connection_id": { + "name": "last_connection_id", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "created_at": { + "name": "created_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "destroyed_at": { + "name": "destroyed_at", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "session_init_json": { + "name": "session_init_json", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + } + }, + "views": {}, + "enums": {}, + "_meta": { + "schemas": {}, + "tables": {}, + "columns": {} + }, + "internal": { + "indexes": {} + } +} diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/_journal.json new file mode 100644 index 0000000..fe993c2 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/drizzle/meta/_journal.json @@ -0,0 +1,13 @@ +{ + "version": "7", + "dialect": "sqlite", + "entries": [ + { + "idx": 0, + "version": "6", + "when": 1773376224446, + "tag": "0000_smooth_sauron", + "breakpoints": true + } + ] +} diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/migrations.ts b/foundry/packages/backend/src/actors/sandbox-instance/db/migrations.ts new file mode 100644 index 0000000..4db8b1b --- /dev/null +++ 
b/foundry/packages/backend/src/actors/sandbox-instance/db/migrations.ts @@ -0,0 +1,48 @@ +// This file is generated by src/actors/_scripts/generate-actor-migrations.ts. +// Source of truth is drizzle-kit output under ./drizzle (meta/_journal.json + *.sql). +// Do not hand-edit this file. + +const journal = { + entries: [ + { + idx: 0, + when: 1773376224446, + tag: "0000_smooth_sauron", + breakpoints: true, + }, + ], +} as const; + +export default { + journal, + migrations: { + m0000: `CREATE TABLE \`sandbox_instance\` ( + \`id\` integer PRIMARY KEY NOT NULL, + \`metadata_json\` text NOT NULL, + \`status\` text NOT NULL, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE \`sandbox_session_events\` ( + \`id\` text PRIMARY KEY NOT NULL, + \`session_id\` text NOT NULL, + \`event_index\` integer NOT NULL, + \`created_at\` integer NOT NULL, + \`connection_id\` text NOT NULL, + \`sender\` text NOT NULL, + \`payload_json\` text NOT NULL +); +--> statement-breakpoint +CREATE UNIQUE INDEX \`sandbox_session_events_session_id_event_index_unique\` ON \`sandbox_session_events\` (\`session_id\`,\`event_index\`);--> statement-breakpoint +CREATE TABLE \`sandbox_sessions\` ( + \`id\` text PRIMARY KEY NOT NULL, + \`agent\` text NOT NULL, + \`agent_session_id\` text NOT NULL, + \`last_connection_id\` text NOT NULL, + \`created_at\` integer NOT NULL, + \`destroyed_at\` integer, + \`session_init_json\` text +); +`, + } as const, +}; diff --git a/foundry/packages/backend/src/actors/sandbox-instance/db/schema.ts b/foundry/packages/backend/src/actors/sandbox-instance/db/schema.ts new file mode 100644 index 0000000..06ce05a --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/db/schema.ts @@ -0,0 +1,38 @@ +import { integer, sqliteTable, text, uniqueIndex } from "rivetkit/db/drizzle"; + +// SQLite is per sandbox-instance actor instance. 
+export const sandboxInstance = sqliteTable("sandbox_instance", { + id: integer("id").primaryKey(), + // Structured by the provider/runtime metadata serializer for this actor. + metadataJson: text("metadata_json").notNull(), + status: text("status").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +// Persist sandbox-agent sessions/events in SQLite instead of actor state so they survive +// serverless actor evictions and backend restarts. +export const sandboxSessions = sqliteTable("sandbox_sessions", { + id: text("id").notNull().primaryKey(), + agent: text("agent").notNull(), + agentSessionId: text("agent_session_id").notNull(), + lastConnectionId: text("last_connection_id").notNull(), + createdAt: integer("created_at").notNull(), + destroyedAt: integer("destroyed_at"), + // Structured by the sandbox-agent ACP session bootstrap payload. + sessionInitJson: text("session_init_json"), +}); + +export const sandboxSessionEvents = sqliteTable( + "sandbox_session_events", + { + id: text("id").notNull().primaryKey(), + sessionId: text("session_id").notNull(), + eventIndex: integer("event_index").notNull(), + createdAt: integer("created_at").notNull(), + connectionId: text("connection_id").notNull(), + sender: text("sender").notNull(), + // Structured by the sandbox-agent session event envelope. 
+ payloadJson: text("payload_json").notNull(), + }, + (table) => [uniqueIndex("sandbox_session_events_session_id_event_index_unique").on(table.sessionId, table.eventIndex)], +); diff --git a/foundry/packages/backend/src/actors/sandbox-instance/index.ts b/foundry/packages/backend/src/actors/sandbox-instance/index.ts new file mode 100644 index 0000000..566a378 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/index.ts @@ -0,0 +1,640 @@ +import { setTimeout as delay } from "node:timers/promises"; +import { eq } from "drizzle-orm"; +import { actor, queue } from "rivetkit"; +import { Loop, workflow } from "rivetkit/workflow"; +import type { ProviderId } from "@sandbox-agent/foundry-shared"; +import type { + ProcessCreateRequest, + ProcessInfo, + ProcessLogFollowQuery, + ProcessLogsResponse, + ProcessSignalQuery, + SessionEvent, + SessionRecord, +} from "sandbox-agent"; +import { sandboxInstanceDb } from "./db/db.js"; +import { sandboxInstance as sandboxInstanceTable } from "./db/schema.js"; +import { SandboxInstancePersistDriver } from "./persist.js"; +import { getActorRuntimeContext } from "../context.js"; +import { selfSandboxInstance } from "../handles.js"; +import { logActorWarning, resolveErrorMessage } from "../logging.js"; +import { expectQueueResponse } from "../../services/queue.js"; + +export interface SandboxInstanceInput { + workspaceId: string; + providerId: ProviderId; + sandboxId: string; +} + +interface SandboxAgentConnection { + endpoint: string; + token?: string; +} + +const SANDBOX_ROW_ID = 1; +const CREATE_SESSION_MAX_ATTEMPTS = 3; +const CREATE_SESSION_RETRY_BASE_MS = 1_000; +const CREATE_SESSION_STEP_TIMEOUT_MS = 10 * 60_000; + +function normalizeStatusFromEventPayload(payload: unknown): "running" | "idle" | "error" | null { + if (payload && typeof payload === "object") { + const envelope = payload as { + error?: unknown; + method?: unknown; + result?: unknown; + }; + + if (envelope.error) { + return "error"; + } + + if 
(envelope.result && typeof envelope.result === "object") { + const stopReason = (envelope.result as { stopReason?: unknown }).stopReason; + if (typeof stopReason === "string" && stopReason.length > 0) { + return "idle"; + } + } + + if (typeof envelope.method === "string") { + const lowered = envelope.method.toLowerCase(); + if (lowered.includes("error") || lowered.includes("failed")) { + return "error"; + } + if (lowered.includes("ended") || lowered.includes("complete") || lowered.includes("stopped")) { + return "idle"; + } + } + } + + return null; +} + +function stringifyJson(value: unknown): string { + return JSON.stringify(value, (_key, item) => { + if (typeof item === "bigint") return item.toString(); + return item; + }); +} + +function parseMetadata(metadataJson: string): Record<string, unknown> { + try { + const parsed = JSON.parse(metadataJson) as unknown; + if (parsed && typeof parsed === "object") return parsed as Record<string, unknown>; + return {}; + } catch { + return {}; + } +} + +async function loadPersistedAgentConfig(c: any): Promise<SandboxAgentConnection | null> { + try { + const row = await c.db + .select({ metadataJson: sandboxInstanceTable.metadataJson }) + .from(sandboxInstanceTable) + .where(eq(sandboxInstanceTable.id, SANDBOX_ROW_ID)) + .get(); + + if (row?.metadataJson) { + const metadata = parseMetadata(row.metadataJson); + const endpoint = typeof metadata.agentEndpoint === "string" ? metadata.agentEndpoint.trim() : ""; + const token = typeof metadata.agentToken === "string" ? metadata.agentToken.trim() : ""; + if (endpoint) { + return token ? { endpoint, token } : { endpoint }; + } + } + } catch { + return null; + } + return null; +} + +async function loadFreshDaytonaAgentConfig(c: any): Promise<SandboxAgentConnection> { + const { config, driver } = getActorRuntimeContext(); + const daytona = driver.daytona.createClient({ + apiUrl: config.providers.daytona.endpoint, + apiKey: config.providers.daytona.apiKey, + }); + const sandbox = await daytona.getSandbox(c.state.sandboxId); + const state = String(sandbox.state ??
"unknown").toLowerCase(); + if (state !== "started" && state !== "running") { + await daytona.startSandbox(c.state.sandboxId, 60); + } + const preview = await daytona.getPreviewEndpoint(c.state.sandboxId, 2468); + return preview.token ? { endpoint: preview.url, token: preview.token } : { endpoint: preview.url }; +} + +async function loadFreshProviderAgentConfig(c: any): Promise { + const { providers } = getActorRuntimeContext(); + const provider = providers.get(c.state.providerId); + return await provider.ensureSandboxAgent({ + workspaceId: c.state.workspaceId, + sandboxId: c.state.sandboxId, + }); +} + +async function loadAgentConfig(c: any): Promise { + const persisted = await loadPersistedAgentConfig(c); + if (c.state.providerId === "daytona") { + // Keep one stable signed preview endpoint per sandbox-instance actor. + // Rotating preview URLs on every call fragments SDK client state (sessions/events) + // because client caching keys by endpoint. + if (persisted) { + return persisted; + } + return await loadFreshDaytonaAgentConfig(c); + } + + // Local sandboxes are tied to the current backend process, so the sandbox-agent + // token can rotate on restart. Always refresh from the provider instead of + // trusting persisted metadata. 
+ if (c.state.providerId === "local") { + return await loadFreshProviderAgentConfig(c); + } + + if (persisted) { + return persisted; + } + + return await loadFreshProviderAgentConfig(c); +} + +async function derivePersistedSessionStatus( + persist: SandboxInstancePersistDriver, + sessionId: string, +): Promise<{ id: string; status: "running" | "idle" | "error" }> { + const session = await persist.getSession(sessionId); + if (!session) { + return { id: sessionId, status: "error" }; + } + + if (session.destroyedAt) { + return { id: sessionId, status: "idle" }; + } + + const events = await persist.listEvents({ + sessionId, + limit: 25, + }); + + for (let index = events.items.length - 1; index >= 0; index -= 1) { + const event = events.items[index]; + if (!event) continue; + const status = normalizeStatusFromEventPayload(event.payload); + if (status) { + return { id: sessionId, status }; + } + } + + return { id: sessionId, status: "idle" }; +} + +function isTransientSessionCreateError(detail: string): boolean { + const lowered = detail.toLowerCase(); + if (lowered.includes("timed out") || lowered.includes("timeout") || lowered.includes("504") || lowered.includes("gateway timeout")) { + // ACP timeout errors are expensive and usually deterministic for the same + // request; immediate retries spawn additional sessions/processes and make + // recovery harder. 
+ return false; + } + + return ( + lowered.includes("502") || lowered.includes("503") || lowered.includes("bad gateway") || lowered.includes("econnreset") || lowered.includes("econnrefused") + ); +} + +interface EnsureSandboxCommand { + metadata: Record<string, unknown>; + status: string; + agentEndpoint?: string; + agentToken?: string; +} + +interface HealthSandboxCommand { + status: string; + message: string; +} + +interface CreateSessionCommand { + prompt: string; + cwd?: string; + agent?: "claude" | "codex" | "opencode"; +} + +interface CreateSessionResult { + id: string | null; + status: "running" | "idle" | "error"; + error?: string; +} + +interface ListSessionsCommand { + cursor?: string; + limit?: number; +} + +interface ListSessionEventsCommand { + sessionId: string; + cursor?: string; + limit?: number; +} + +interface SendPromptCommand { + sessionId: string; + prompt: string; + notification?: boolean; +} + +interface SessionStatusCommand { + sessionId: string; +} + +interface SessionControlCommand { + sessionId: string; +} + +const SANDBOX_INSTANCE_QUEUE_NAMES = [ + "sandboxInstance.command.ensure", + "sandboxInstance.command.updateHealth", + "sandboxInstance.command.destroy", + "sandboxInstance.command.createSession", + "sandboxInstance.command.sendPrompt", + "sandboxInstance.command.cancelSession", + "sandboxInstance.command.destroySession", +] as const; + +type SandboxInstanceQueueName = (typeof SANDBOX_INSTANCE_QUEUE_NAMES)[number]; + +function sandboxInstanceWorkflowQueueName(name: SandboxInstanceQueueName): SandboxInstanceQueueName { + return name; +} + +async function getSandboxAgentClient(c: any) { + const { driver } = getActorRuntimeContext(); + const persist = new SandboxInstancePersistDriver(c.db); + const { endpoint, token } = await loadAgentConfig(c); + return driver.sandboxAgent.createClient({ + endpoint, + token, + persist, + }); +} + +async function broadcastProcessesUpdated(c: any): Promise<void> { + const client = await getSandboxAgentClient(c); + const {
processes } = await client.listProcesses(); + c.broadcast("processesUpdated", { + type: "processesUpdated", + processes, + }); +} + +async function ensureSandboxMutation(c: any, command: EnsureSandboxCommand): Promise<void> { + const now = Date.now(); + const metadata = { + ...command.metadata, + agentEndpoint: command.agentEndpoint ?? null, + agentToken: command.agentToken ?? null, + }; + + const metadataJson = stringifyJson(metadata); + await c.db + .insert(sandboxInstanceTable) + .values({ + id: SANDBOX_ROW_ID, + metadataJson, + status: command.status, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: sandboxInstanceTable.id, + set: { + metadataJson, + status: command.status, + updatedAt: now, + }, + }) + .run(); +} + +async function updateHealthMutation(c: any, command: HealthSandboxCommand): Promise<void> { + await c.db + .update(sandboxInstanceTable) + .set({ + status: `${command.status}:${command.message}`, + updatedAt: Date.now(), + }) + .where(eq(sandboxInstanceTable.id, SANDBOX_ROW_ID)) + .run(); +} + +async function destroySandboxMutation(c: any): Promise<void> { + await c.db.delete(sandboxInstanceTable).where(eq(sandboxInstanceTable.id, SANDBOX_ROW_ID)).run(); +} + +async function createSessionMutation(c: any, command: CreateSessionCommand): Promise<CreateSessionResult> { + let lastDetail = "sandbox-agent createSession failed"; + let attemptsMade = 0; + + for (let attempt = 1; attempt <= CREATE_SESSION_MAX_ATTEMPTS; attempt += 1) { + attemptsMade = attempt; + try { + const client = await getSandboxAgentClient(c); + + const session = await client.createSession({ + prompt: command.prompt, + cwd: command.cwd, + agent: command.agent, + }); + + return { id: session.id, status: session.status }; + } catch (error) { + const detail = error instanceof Error ?
error.message : String(error); + lastDetail = detail; + const retryable = isTransientSessionCreateError(detail); + const canRetry = retryable && attempt < CREATE_SESSION_MAX_ATTEMPTS; + + if (!canRetry) { + break; + } + + const waitMs = CREATE_SESSION_RETRY_BASE_MS * attempt; + logActorWarning("sandbox-instance", "createSession transient failure; retrying", { + workspaceId: c.state.workspaceId, + providerId: c.state.providerId, + sandboxId: c.state.sandboxId, + attempt, + maxAttempts: CREATE_SESSION_MAX_ATTEMPTS, + waitMs, + error: detail, + }); + await delay(waitMs); + } + } + + const attemptLabel = attemptsMade === 1 ? "attempt" : "attempts"; + return { + id: null, + status: "error", + error: `sandbox-agent createSession failed after ${attemptsMade} ${attemptLabel}: ${lastDetail}`, + }; +} + +async function sendPromptMutation(c: any, command: SendPromptCommand): Promise<void> { + const client = await getSandboxAgentClient(c); + await client.sendPrompt({ + sessionId: command.sessionId, + prompt: command.prompt, + notification: command.notification, + }); +} + +async function cancelSessionMutation(c: any, command: SessionControlCommand): Promise<void> { + const client = await getSandboxAgentClient(c); + await client.cancelSession(command.sessionId); +} + +async function destroySessionMutation(c: any, command: SessionControlCommand): Promise<void> { + const client = await getSandboxAgentClient(c); + await client.destroySession(command.sessionId); +} + +async function runSandboxInstanceWorkflow(ctx: any): Promise<void> { + await ctx.loop("sandbox-instance-command-loop", async (loopCtx: any) => { + const msg = await loopCtx.queue.next("next-sandbox-instance-command", { + names: [...SANDBOX_INSTANCE_QUEUE_NAMES], + completable: true, + }); + if (!msg) { + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.ensure") { + await loopCtx.step("sandbox-instance-ensure", async () => ensureSandboxMutation(loopCtx, msg.body as EnsureSandboxCommand)); + await
msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.updateHealth") { + await loopCtx.step("sandbox-instance-update-health", async () => updateHealthMutation(loopCtx, msg.body as HealthSandboxCommand)); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.destroy") { + await loopCtx.step("sandbox-instance-destroy", async () => destroySandboxMutation(loopCtx)); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.createSession") { + const result = await loopCtx.step({ + name: "sandbox-instance-create-session", + timeout: CREATE_SESSION_STEP_TIMEOUT_MS, + run: async () => createSessionMutation(loopCtx, msg.body as CreateSessionCommand), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.sendPrompt") { + await loopCtx.step("sandbox-instance-send-prompt", async () => sendPromptMutation(loopCtx, msg.body as SendPromptCommand)); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.cancelSession") { + await loopCtx.step("sandbox-instance-cancel-session", async () => cancelSessionMutation(loopCtx, msg.body as SessionControlCommand)); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "sandboxInstance.command.destroySession") { + await loopCtx.step("sandbox-instance-destroy-session", async () => destroySessionMutation(loopCtx, msg.body as SessionControlCommand)); + await msg.complete({ ok: true }); + } + + return Loop.continue(undefined); + }); +} + +export const sandboxInstance = actor({ + db: sandboxInstanceDb, + queues: Object.fromEntries(SANDBOX_INSTANCE_QUEUE_NAMES.map((name) => [name, queue()])), + options: { + name: "Sandbox Instance", + icon: "box", + actionTimeout: 5 * 60_000, + }, + 
createState: (_c, input: SandboxInstanceInput) => ({ + workspaceId: input.workspaceId, + providerId: input.providerId, + sandboxId: input.sandboxId, + }), + actions: { + async sandboxAgentConnection(c: any): Promise<SandboxAgentConnection> { + return await loadAgentConfig(c); + }, + + async createProcess(c: any, request: ProcessCreateRequest): Promise<ProcessInfo> { + const client = await getSandboxAgentClient(c); + const created = await client.createProcess(request); + await broadcastProcessesUpdated(c); + return created; + }, + + async listProcesses(c: any): Promise<{ processes: ProcessInfo[] }> { + const client = await getSandboxAgentClient(c); + return await client.listProcesses(); + }, + + async getProcessLogs(c: any, request: { processId: string; query?: ProcessLogFollowQuery }): Promise<ProcessLogsResponse> { + const client = await getSandboxAgentClient(c); + return await client.getProcessLogs(request.processId, request.query); + }, + + async stopProcess(c: any, request: { processId: string; query?: ProcessSignalQuery }): Promise<ProcessInfo> { + const client = await getSandboxAgentClient(c); + const stopped = await client.stopProcess(request.processId, request.query); + await broadcastProcessesUpdated(c); + return stopped; + }, + + async killProcess(c: any, request: { processId: string; query?: ProcessSignalQuery }): Promise<ProcessInfo> { + const client = await getSandboxAgentClient(c); + const killed = await client.killProcess(request.processId, request.query); + await broadcastProcessesUpdated(c); + return killed; + }, + + async deleteProcess(c: any, request: { processId: string }): Promise<void> { + const client = await getSandboxAgentClient(c); + await client.deleteProcess(request.processId); + await broadcastProcessesUpdated(c); + }, + + async providerState(c: any): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }> { + const at = Date.now(); + const { config, driver } = getActorRuntimeContext(); + + if (c.state.providerId === "daytona") { + const daytona = driver.daytona.createClient({ + apiUrl:
config.providers.daytona.endpoint, + apiKey: config.providers.daytona.apiKey, + }); + const sandbox = await daytona.getSandbox(c.state.sandboxId); + const state = String(sandbox.state ?? "unknown").toLowerCase(); + return { providerId: c.state.providerId, sandboxId: c.state.sandboxId, state, at }; + } + + return { + providerId: c.state.providerId, + sandboxId: c.state.sandboxId, + state: "unknown", + at, + }; + }, + + async ensure(c, command: EnsureSandboxCommand): Promise<void> { + const self = selfSandboxInstance(c); + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.ensure"), command, { + wait: true, + timeout: 60_000, + }); + }, + + async updateHealth(c, command: HealthSandboxCommand): Promise<void> { + const self = selfSandboxInstance(c); + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.updateHealth"), command, { + wait: true, + timeout: 60_000, + }); + }, + + async destroy(c): Promise<void> { + const self = selfSandboxInstance(c); + await self.send( + sandboxInstanceWorkflowQueueName("sandboxInstance.command.destroy"), + {}, + { + wait: true, + timeout: 60_000, + }, + ); + }, + + async createSession(c: any, command: CreateSessionCommand): Promise<CreateSessionResult> { + const self = selfSandboxInstance(c); + return expectQueueResponse( + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.createSession"), command, { + wait: true, + timeout: 5 * 60_000, + }), + ); + }, + + async listSessions(c: any, command?: ListSessionsCommand): Promise<{ items: SessionRecord[]; nextCursor?: string }> { + const persist = new SandboxInstancePersistDriver(c.db); + try { + const client = await getSandboxAgentClient(c); + + const page = await client.listSessions({ + cursor: command?.cursor, + limit: command?.limit, + }); + + return { + items: page.items, + nextCursor: page.nextCursor, + }; + } catch (error) { + logActorWarning("sandbox-instance", "listSessions remote read failed; using persisted fallback", { + workspaceId: c.state.workspaceId,
+ providerId: c.state.providerId, + sandboxId: c.state.sandboxId, + error: resolveErrorMessage(error), + }); + return await persist.listSessions({ + cursor: command?.cursor, + limit: command?.limit, + }); + } + }, + + async listSessionEvents(c: any, command: ListSessionEventsCommand): Promise<{ items: SessionEvent[]; nextCursor?: string }> { + const persist = new SandboxInstancePersistDriver(c.db); + return await persist.listEvents({ + sessionId: command.sessionId, + cursor: command.cursor, + limit: command.limit, + }); + }, + + async sendPrompt(c, command: SendPromptCommand): Promise<void> { + const self = selfSandboxInstance(c); + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.sendPrompt"), command, { + wait: true, + timeout: 5 * 60_000, + }); + }, + + async cancelSession(c, command: SessionControlCommand): Promise<void> { + const self = selfSandboxInstance(c); + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.cancelSession"), command, { + wait: true, + timeout: 60_000, + }); + }, + + async destroySession(c, command: SessionControlCommand): Promise<void> { + const self = selfSandboxInstance(c); + await self.send(sandboxInstanceWorkflowQueueName("sandboxInstance.command.destroySession"), command, { + wait: true, + timeout: 60_000, + }); + }, + + async sessionStatus(c, command: SessionStatusCommand): Promise<{ id: string; status: "running" | "idle" | "error" }> { + return await derivePersistedSessionStatus(new SandboxInstancePersistDriver(c.db), command.sessionId); + }, + }, + run: workflow(runSandboxInstanceWorkflow), +}); diff --git a/foundry/packages/backend/src/actors/sandbox-instance/persist.ts b/foundry/packages/backend/src/actors/sandbox-instance/persist.ts new file mode 100644 index 0000000..5400e30 --- /dev/null +++ b/foundry/packages/backend/src/actors/sandbox-instance/persist.ts @@ -0,0 +1,266 @@ +import { and, asc, count, eq } from "drizzle-orm"; +import type { ListEventsRequest, ListPage, ListPageRequest,
SessionEvent, SessionPersistDriver, SessionRecord } from "sandbox-agent"; +import { sandboxSessionEvents, sandboxSessions } from "./db/schema.js"; + +const DEFAULT_MAX_SESSIONS = 1024; +const DEFAULT_MAX_EVENTS_PER_SESSION = 500; +const DEFAULT_LIST_LIMIT = 100; + +function normalizeCap(value: number | undefined, fallback: number): number { + if (!Number.isFinite(value) || (value ?? 0) < 1) { + return fallback; + } + return Math.floor(value as number); +} + +function parseCursor(cursor: string | undefined): number { + if (!cursor) return 0; + const parsed = Number.parseInt(cursor, 10); + if (!Number.isFinite(parsed) || parsed < 0) return 0; + return parsed; +} + +export function resolveEventListOffset(params: { cursor?: string; total: number; limit: number }): number { + if (params.cursor != null) { + return parseCursor(params.cursor); + } + return Math.max(0, params.total - params.limit); +} + +function safeStringify(value: unknown): string { + return JSON.stringify(value, (_key, item) => { + if (typeof item === "bigint") return item.toString(); + return item; + }); +} + +function safeParseJson<T>(value: string | null | undefined, fallback: T): T { + if (!value) return fallback; + try { + return JSON.parse(value) as T; + } catch { + return fallback; + } +} + +export interface SandboxInstancePersistDriverOptions { + maxSessions?: number; + maxEventsPerSession?: number; +} + +export class SandboxInstancePersistDriver implements SessionPersistDriver { + private readonly maxSessions: number; + private readonly maxEventsPerSession: number; + + constructor( + private readonly db: any, + options: SandboxInstancePersistDriverOptions = {}, + ) { + this.maxSessions = normalizeCap(options.maxSessions, DEFAULT_MAX_SESSIONS); + this.maxEventsPerSession = normalizeCap(options.maxEventsPerSession, DEFAULT_MAX_EVENTS_PER_SESSION); + } + + async getSession(id: string): Promise<SessionRecord | null> { + const row = await this.db + .select({ + id: sandboxSessions.id, + agent: sandboxSessions.agent, +
agentSessionId: sandboxSessions.agentSessionId, + lastConnectionId: sandboxSessions.lastConnectionId, + createdAt: sandboxSessions.createdAt, + destroyedAt: sandboxSessions.destroyedAt, + sessionInitJson: sandboxSessions.sessionInitJson, + }) + .from(sandboxSessions) + .where(eq(sandboxSessions.id, id)) + .get(); + + if (!row) return null; + + return { + id: row.id, + agent: row.agent, + agentSessionId: row.agentSessionId, + lastConnectionId: row.lastConnectionId, + createdAt: row.createdAt, + destroyedAt: row.destroyedAt ?? undefined, + sessionInit: safeParseJson(row.sessionInitJson, undefined), + }; + } + + async listSessions(request: ListPageRequest = {}): Promise> { + const offset = parseCursor(request.cursor); + const limit = normalizeCap(request.limit, DEFAULT_LIST_LIMIT); + + const rows = await this.db + .select({ + id: sandboxSessions.id, + agent: sandboxSessions.agent, + agentSessionId: sandboxSessions.agentSessionId, + lastConnectionId: sandboxSessions.lastConnectionId, + createdAt: sandboxSessions.createdAt, + destroyedAt: sandboxSessions.destroyedAt, + sessionInitJson: sandboxSessions.sessionInitJson, + }) + .from(sandboxSessions) + .orderBy(asc(sandboxSessions.createdAt), asc(sandboxSessions.id)) + .limit(limit) + .offset(offset) + .all(); + + const items = rows.map((row) => ({ + id: row.id, + agent: row.agent, + agentSessionId: row.agentSessionId, + lastConnectionId: row.lastConnectionId, + createdAt: row.createdAt, + destroyedAt: row.destroyedAt ?? undefined, + sessionInit: safeParseJson(row.sessionInitJson, undefined), + })); + + const totalRow = await this.db.select({ c: count() }).from(sandboxSessions).get(); + const total = Number(totalRow?.c ?? 0); + + const nextOffset = offset + items.length; + return { + items, + nextCursor: nextOffset < total ? 
String(nextOffset) : undefined, + }; + } + + async updateSession(session: SessionRecord): Promise { + const now = Date.now(); + await this.db + .insert(sandboxSessions) + .values({ + id: session.id, + agent: session.agent, + agentSessionId: session.agentSessionId, + lastConnectionId: session.lastConnectionId, + createdAt: session.createdAt ?? now, + destroyedAt: session.destroyedAt ?? null, + sessionInitJson: session.sessionInit ? safeStringify(session.sessionInit) : null, + }) + .onConflictDoUpdate({ + target: sandboxSessions.id, + set: { + agent: session.agent, + agentSessionId: session.agentSessionId, + lastConnectionId: session.lastConnectionId, + createdAt: session.createdAt ?? now, + destroyedAt: session.destroyedAt ?? null, + sessionInitJson: session.sessionInit ? safeStringify(session.sessionInit) : null, + }, + }) + .run(); + + // Evict oldest sessions beyond cap. + const totalRow = await this.db.select({ c: count() }).from(sandboxSessions).get(); + const total = Number(totalRow?.c ?? 0); + const overflow = total - this.maxSessions; + if (overflow <= 0) return; + + const toRemove = await this.db + .select({ id: sandboxSessions.id }) + .from(sandboxSessions) + .orderBy(asc(sandboxSessions.createdAt), asc(sandboxSessions.id)) + .limit(overflow) + .all(); + + for (const row of toRemove) { + await this.db.delete(sandboxSessionEvents).where(eq(sandboxSessionEvents.sessionId, row.id)).run(); + await this.db.delete(sandboxSessions).where(eq(sandboxSessions.id, row.id)).run(); + } + } + + async listEvents(request: ListEventsRequest): Promise> { + const limit = normalizeCap(request.limit, DEFAULT_LIST_LIMIT); + const totalRow = await this.db.select({ c: count() }).from(sandboxSessionEvents).where(eq(sandboxSessionEvents.sessionId, request.sessionId)).get(); + const total = Number(totalRow?.c ?? 
0); + const offset = resolveEventListOffset({ + cursor: request.cursor, + total, + limit, + }); + + const rows = await this.db + .select({ + id: sandboxSessionEvents.id, + sessionId: sandboxSessionEvents.sessionId, + eventIndex: sandboxSessionEvents.eventIndex, + createdAt: sandboxSessionEvents.createdAt, + connectionId: sandboxSessionEvents.connectionId, + sender: sandboxSessionEvents.sender, + payloadJson: sandboxSessionEvents.payloadJson, + }) + .from(sandboxSessionEvents) + .where(eq(sandboxSessionEvents.sessionId, request.sessionId)) + .orderBy(asc(sandboxSessionEvents.eventIndex), asc(sandboxSessionEvents.id)) + .limit(limit) + .offset(offset) + .all(); + + const items: SessionEvent[] = rows.map((row) => ({ + id: row.id, + eventIndex: row.eventIndex, + sessionId: row.sessionId, + createdAt: row.createdAt, + connectionId: row.connectionId, + sender: row.sender as any, + payload: safeParseJson(row.payloadJson, null), + })); + + const nextOffset = offset + items.length; + return { + items, + nextCursor: nextOffset < total ? String(nextOffset) : undefined, + }; + } + + async insertEvent(event: SessionEvent): Promise { + await this.db + .insert(sandboxSessionEvents) + .values({ + id: event.id, + sessionId: event.sessionId, + eventIndex: event.eventIndex, + createdAt: event.createdAt, + connectionId: event.connectionId, + sender: event.sender, + payloadJson: safeStringify(event.payload), + }) + .onConflictDoUpdate({ + target: sandboxSessionEvents.id, + set: { + sessionId: event.sessionId, + eventIndex: event.eventIndex, + createdAt: event.createdAt, + connectionId: event.connectionId, + sender: event.sender, + payloadJson: safeStringify(event.payload), + }, + }) + .run(); + + // Trim oldest events beyond cap. + const totalRow = await this.db.select({ c: count() }).from(sandboxSessionEvents).where(eq(sandboxSessionEvents.sessionId, event.sessionId)).get(); + const total = Number(totalRow?.c ?? 
0); + const overflow = total - this.maxEventsPerSession; + if (overflow <= 0) return; + + const toRemove = await this.db + .select({ id: sandboxSessionEvents.id }) + .from(sandboxSessionEvents) + .where(eq(sandboxSessionEvents.sessionId, event.sessionId)) + .orderBy(asc(sandboxSessionEvents.eventIndex), asc(sandboxSessionEvents.id)) + .limit(overflow) + .all(); + + for (const row of toRemove) { + await this.db + .delete(sandboxSessionEvents) + .where(and(eq(sandboxSessionEvents.sessionId, event.sessionId), eq(sandboxSessionEvents.id, row.id))) + .run(); + } + } +} diff --git a/foundry/packages/backend/src/actors/sandbox/index.ts b/foundry/packages/backend/src/actors/sandbox/index.ts deleted file mode 100644 index 0444d9b..0000000 --- a/foundry/packages/backend/src/actors/sandbox/index.ts +++ /dev/null @@ -1,646 +0,0 @@ -// @ts-nocheck -import { actor, queue } from "rivetkit"; -import { workflow, Loop } from "rivetkit/workflow"; -import { e2b, sandboxActor } from "rivetkit/sandbox"; -import { existsSync } from "node:fs"; -import Dockerode from "dockerode"; -import { DEFAULT_WORKSPACE_MODEL_GROUPS, workspaceModelGroupsFromSandboxAgents, type WorkspaceModelGroup } from "@sandbox-agent/foundry-shared"; -import { SandboxAgent } from "sandbox-agent"; -import { getActorRuntimeContext } from "../context.js"; -import { organizationKey } from "../keys.js"; -import { selfTaskSandbox } from "../handles.js"; -import { logActorWarning, resolveErrorMessage } from "../logging.js"; -import { expectQueueResponse } from "../../services/queue.js"; -import { resolveSandboxProviderId } from "../../sandbox-config.js"; - -/** - * Default repo CWD inside the sandbox. The actual path is resolved dynamically - * via `$HOME/repo` because different sandbox providers run as different users - * (e.g. E2B uses `/home/user`, local Docker uses `/home/sandbox`). 
- */
-const DEFAULT_SANDBOX_REPO_CWD = "/home/user/repo";
-const DEFAULT_LOCAL_SANDBOX_IMAGE = "rivetdev/sandbox-agent:foundry-base-latest";
-const DEFAULT_LOCAL_SANDBOX_PORT = 2468;
-const dockerClient = new Dockerode({ socketPath: "/var/run/docker.sock" });
-
-function parseTaskSandboxKey(key: readonly string[]): { organizationId: string; taskId: string } {
-	if (key.length !== 4 || key[0] !== "org" || key[2] !== "sandbox") {
-		throw new Error(`Invalid task sandbox key: ${JSON.stringify(key)}`);
-	}
-
-	return {
-		organizationId: key[1]!,
-		taskId: key[3]!,
-	};
-}
-
-function preferredDockerHost(): string {
-	if (process.env.FOUNDRY_DOCKER_HOST?.trim()) {
-		return process.env.FOUNDRY_DOCKER_HOST.trim();
-	}
-
-	return existsSync("/.dockerenv") ? "host.docker.internal" : "127.0.0.1";
-}
-
-function preferredPublicDockerHost(): string {
-	if (process.env.FOUNDRY_PUBLIC_SANDBOX_HOST?.trim()) {
-		return process.env.FOUNDRY_PUBLIC_SANDBOX_HOST.trim();
-	}
-
-	return "127.0.0.1";
-}
-
-function localSandboxAgentPort(): number {
-	const raw = process.env.FOUNDRY_LOCAL_SANDBOX_PORT?.trim() ?? process.env.HF_LOCAL_SANDBOX_PORT?.trim() ?? "";
-	const parsed = Number(raw);
-	if (Number.isInteger(parsed) && parsed > 0 && parsed <= 65535) {
-		return parsed;
-	}
-	return DEFAULT_LOCAL_SANDBOX_PORT;
-}
-
-function sandboxEnvPairs(): string[] {
-	const openAiApiKey = process.env.OPENAI_API_KEY;
-	const entries = [
-		["ANTHROPIC_API_KEY", process.env.ANTHROPIC_API_KEY],
-		["CLAUDE_API_KEY", process.env.CLAUDE_API_KEY ?? process.env.ANTHROPIC_API_KEY],
-		["OPENAI_API_KEY", openAiApiKey],
-		// Codex ACP prefers CODEX_API_KEY when present. In dev we want that to be the
-		// actual OpenAI API key, not an unrelated local Codex auth token.
-		["CODEX_API_KEY", openAiApiKey ?? process.env.CODEX_API_KEY],
-		["GH_TOKEN", process.env.GH_TOKEN ?? process.env.GITHUB_TOKEN],
-		["GITHUB_TOKEN", process.env.GITHUB_TOKEN ?? process.env.GH_TOKEN],
-		["E2B_API_KEY", process.env.E2B_API_KEY],
-	];
-
-	return entries
-		.filter((entry): entry is [string, string] => typeof entry[1] === "string" && entry[1].trim().length > 0)
-		.map(([key, value]) => `${key}=${value}`);
-}
-
-function sandboxEnvObject(): Record<string, string> {
-	return Object.fromEntries(
-		sandboxEnvPairs().map((entry) => {
-			const [key, ...rest] = entry.split("=");
-			return [key!, rest.join("=")];
-		}),
-	);
-}
-
-function modeIdForAgent(agent?: string | null): string | null {
-	switch (agent) {
-		case "codex":
-			return "full-access";
-		case "claude":
-			return "acceptEdits";
-		default:
-			return null;
-	}
-}
-
-async function getPublishedDockerPort(sandboxId: string, containerPort: number): Promise<number> {
-	const info = await dockerClient.getContainer(sandboxId).inspect();
-	const hostPort = info.NetworkSettings?.Ports?.[`${containerPort}/tcp`]?.[0]?.HostPort;
-	if (!hostPort) {
-		throw new Error(`docker sandbox-agent port ${containerPort} is not published`);
-	}
-	return Number(hostPort);
-}
-
-function createLocalSandboxProvider(image: string): any {
-	const agentPort = localSandboxAgentPort();
-	const backendHost = preferredDockerHost();
-	const publicHost = preferredPublicDockerHost();
-
-	return {
-		name: "docker",
-
-		async create(_context: any): Promise<string> {
-			const container = await dockerClient.createContainer({
-				Image: image,
-				Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", String(agentPort)],
-				Env: sandboxEnvPairs(),
-				ExposedPorts: {
-					[`${agentPort}/tcp`]: {},
-				},
-				HostConfig: {
-					AutoRemove: true,
-					PortBindings: {
-						[`${agentPort}/tcp`]: [{ HostPort: "0" }],
-					},
-				},
-			});
-
-			await container.start();
-			return container.id;
-		},
-
-		async destroy(sandboxId: string): Promise<void> {
-			const container = dockerClient.getContainer(sandboxId);
-			try {
-				await container.stop({ t: 5 });
-			} catch {}
-			try {
-				await container.remove({ force: true });
-			} catch {}
-		},
-
-		async getUrl(sandboxId: string): Promise<string> {
-			const hostPort = await getPublishedDockerPort(sandboxId, agentPort);
-			return `http://${publicHost}:${hostPort}`;
-		},
-
-		async connectAgent(sandboxId: string, connectOptions: any): Promise<any> {
-			const hostPort = await getPublishedDockerPort(sandboxId, agentPort);
-			return await SandboxAgent.connect({
-				baseUrl: `http://${backendHost}:${hostPort}`,
-				...connectOptions,
-			});
-		},
-	};
-}
-
-function sanitizeActorResult(value: unknown, seen = new WeakSet()): unknown {
-	if (typeof value === "function" || value === undefined) {
-		return undefined;
-	}
-
-	if (value && typeof value === "object") {
-		const maybeToRecord = (value as { toRecord?: unknown }).toRecord;
-		if (typeof maybeToRecord === "function") {
-			return sanitizeActorResult(maybeToRecord.call(value), seen);
-		}
-	}
-
-	if (value === null || typeof value !== "object") {
-		return value;
-	}
-
-	if (value instanceof Date) {
-		return value.toISOString();
-	}
-
-	if (Array.isArray(value)) {
-		return value.map((entry) => sanitizeActorResult(entry, seen)).filter((entry) => entry !== undefined);
-	}
-
-	if (seen.has(value)) {
-		return undefined;
-	}
-	seen.add(value);
-
-	const next: Record<string, unknown> = {};
-	for (const [key, entry] of Object.entries(value)) {
-		const sanitized = sanitizeActorResult(entry, seen);
-		if (sanitized !== undefined) {
-			next[key] = sanitized;
-		}
-	}
-	return next;
-}
-
-const baseTaskSandbox = sandboxActor({
-	createProvider: async (c) => {
-		const { config } = getActorRuntimeContext();
-		const { organizationId, taskId } = parseTaskSandboxKey(c.key);
-		const organization = await c.client().organization.getOrCreate(organizationKey(organizationId), {
-			createWithInput: organizationId,
-		});
-		const task = await organization.getTask({ organizationId, taskId });
-		const sandboxProviderId = resolveSandboxProviderId(config, task.sandboxProviderId);
-
-		if (sandboxProviderId === "e2b") {
-			return e2b({
-				create: () => ({
-					template: config.sandboxProviders.e2b.template ?? "sandbox-agent-full-0.5.x",
-					envs: sandboxEnvObject(),
-					// TEMPORARY: Default E2B timeout is 5 minutes which is too short.
-					// Set to 1 hour as a stopgap. Remove this once the E2B provider in
-					// sandbox-agent uses betaCreate + autoPause (see
-					// .context/proposal-rivetkit-sandbox-resilience.md). At that point
-					// the provider handles timeout/pause lifecycle and this override is
-					// unnecessary.
-					timeoutMs: 60 * 60 * 1000,
-				}),
-				installAgents: ["claude", "codex"],
-			});
-		}
-
-		return createLocalSandboxProvider(config.sandboxProviders.local.image ?? process.env.HF_LOCAL_SANDBOX_IMAGE ?? DEFAULT_LOCAL_SANDBOX_IMAGE);
-	},
-});
-
-async function broadcastProcesses(c: any, actions: Record<string, (...args: any[]) => Promise<any>>): Promise<void> {
-	try {
-		const listed = await actions.listProcesses(c);
-		c.broadcast("processesUpdated", {
-			type: "processesUpdated",
-			processes: listed.processes ?? [],
-		});
-	} catch (error) {
-		// Process broadcasts are best-effort. Callers still receive the primary action result.
-		logActorWarning("taskSandbox", "broadcastProcesses failed", {
-			sandboxId: c.state?.sandboxId,
-			error: resolveErrorMessage(error),
-		});
-	}
-}
-
-async function providerForConnection(c: any): Promise<any> {
-	if (c.state.sandboxDestroyed || !c.state.sandboxId) {
-		return null;
-	}
-
-	if (c.vars.provider) {
-		return c.vars.provider;
-	}
-
-	const providerFactory = baseTaskSandbox.config.actions as Record<string, unknown>;
-	void providerFactory;
-	const { config } = getActorRuntimeContext();
-	const { organizationId, taskId } = parseTaskSandboxKey(c.key);
-	const organization = await c.client().organization.getOrCreate(organizationKey(organizationId), {
-		createWithInput: organizationId,
-	});
-	const task = await organization.getTask({ organizationId, taskId });
-	const sandboxProviderId = resolveSandboxProviderId(config, task.sandboxProviderId);
-
-	const provider =
-		sandboxProviderId === "e2b"
-			? e2b({
-					create: () => ({
-						template: config.sandboxProviders.e2b.template ?? "sandbox-agent-full-0.5.x",
-						envs: sandboxEnvObject(),
-					}),
-					installAgents: ["claude", "codex"],
-				})
-			: createLocalSandboxProvider(config.sandboxProviders.local.image ?? process.env.HF_LOCAL_SANDBOX_IMAGE ?? DEFAULT_LOCAL_SANDBOX_IMAGE);
-
-	c.vars.provider = provider;
-	return provider;
-}
-
-async function listWorkspaceModelGroupsForSandbox(c: any): Promise<WorkspaceModelGroup[]> {
-	const provider = await providerForConnection(c);
-	if (!provider || !c.state.sandboxId || typeof provider.connectAgent !== "function") {
-		return DEFAULT_WORKSPACE_MODEL_GROUPS;
-	}
-
-	try {
-		const client = await provider.connectAgent(c.state.sandboxId, {
-			waitForHealth: {
-				timeoutMs: 15_000,
-			},
-		});
-		const listed = await client.listAgents({ config: true });
-		const groups = workspaceModelGroupsFromSandboxAgents(Array.isArray(listed?.agents) ? listed.agents : []);
-		return groups.length > 0 ? groups : DEFAULT_WORKSPACE_MODEL_GROUPS;
-	} catch {
-		return DEFAULT_WORKSPACE_MODEL_GROUPS;
-	}
-}
-
-const baseActions = baseTaskSandbox.config.actions as Record<string, (...args: any[]) => Promise<any>>;
-
-// ---------------------------------------------------------------------------
-// Dynamic repo CWD resolution
-// ---------------------------------------------------------------------------
-
-let cachedRepoCwd: string | null = null;
-
-/**
- * Resolve the repo CWD inside the sandbox by querying `$HOME`.
- * Different providers run as different users (E2B: `/home/user`, local Docker:
- * `/home/sandbox`), so the path must be resolved dynamically. The result is
- * cached for the lifetime of this sandbox actor instance.
- */
-async function resolveRepoCwd(c: any): Promise<string> {
-	if (cachedRepoCwd) return cachedRepoCwd;
-
-	try {
-		const result = await baseActions.runProcess(c, {
-			command: "bash",
-			args: ["-lc", "echo $HOME"],
-			cwd: "/",
-			timeoutMs: 10_000,
-		});
-		const home = (result.stdout ?? result.result ?? "").trim();
-		if (home && home.startsWith("/")) {
-			cachedRepoCwd = `${home}/repo`;
-			return cachedRepoCwd;
-		}
-	} catch (error) {
-		logActorWarning("taskSandbox", "failed to resolve $HOME, using default", {
-			error: resolveErrorMessage(error),
-		});
-	}
-
-	cachedRepoCwd = DEFAULT_SANDBOX_REPO_CWD;
-	return cachedRepoCwd;
-}
-
-// ---------------------------------------------------------------------------
-// Queue names for sandbox actor
-// ---------------------------------------------------------------------------
-
-const SANDBOX_QUEUE_NAMES = [
-	"sandbox.command.createSession",
-	"sandbox.command.resumeOrCreateSession",
-	"sandbox.command.destroySession",
-	"sandbox.command.createProcess",
-	"sandbox.command.stopProcess",
-	"sandbox.command.killProcess",
-	"sandbox.command.deleteProcess",
-] as const;
-
-type SandboxQueueName = (typeof SANDBOX_QUEUE_NAMES)[number];
-
-function sandboxWorkflowQueueName(name: SandboxQueueName): SandboxQueueName {
-	return name;
-}
-
-// ---------------------------------------------------------------------------
-// Mutation handlers — executed inside the workflow command loop
-// ---------------------------------------------------------------------------
-
-async function createSessionMutation(c: any, request: any): Promise<unknown> {
-	const session = await baseActions.createSession(c, request);
-	const sessionId = typeof request?.id === "string" && request.id.length > 0 ? request.id : session?.id;
-	const modeId = modeIdForAgent(request?.agent);
-	if (sessionId && modeId) {
-		try {
-			await baseActions.rawSendSessionMethod(c, sessionId, "session/set_mode", { modeId });
-		} catch {
-			// Session mode updates are best-effort.
-		}
-	}
-	return sanitizeActorResult(session);
-}
-
-async function resumeOrCreateSessionMutation(c: any, request: any): Promise<unknown> {
-	return sanitizeActorResult(await baseActions.resumeOrCreateSession(c, request));
-}
-
-async function destroySessionMutation(c: any, sessionId: string): Promise<unknown> {
-	return sanitizeActorResult(await baseActions.destroySession(c, sessionId));
-}
-
-async function createProcessMutation(c: any, request: any): Promise<unknown> {
-	const created = await baseActions.createProcess(c, request);
-	await broadcastProcesses(c, baseActions);
-	return created;
-}
-
-async function runProcessMutation(c: any, request: any): Promise<unknown> {
-	const result = await baseActions.runProcess(c, request);
-	await broadcastProcesses(c, baseActions);
-	return result;
-}
-
-async function stopProcessMutation(c: any, processId: string, query?: any): Promise<unknown> {
-	const stopped = await baseActions.stopProcess(c, processId, query);
-	await broadcastProcesses(c, baseActions);
-	return stopped;
-}
-
-async function killProcessMutation(c: any, processId: string, query?: any): Promise<unknown> {
-	const killed = await baseActions.killProcess(c, processId, query);
-	await broadcastProcesses(c, baseActions);
-	return killed;
-}
-
-async function deleteProcessMutation(c: any, processId: string): Promise<void> {
-	await baseActions.deleteProcess(c, processId);
-	await broadcastProcesses(c, baseActions);
-}
-
-// ---------------------------------------------------------------------------
-// Workflow command loop
-// ---------------------------------------------------------------------------
-
-type SandboxWorkflowHandler = (loopCtx: any, body: any) => Promise<unknown>;
-
-const SANDBOX_COMMAND_HANDLERS: Record<SandboxQueueName, SandboxWorkflowHandler> = {
-	"sandbox.command.createSession": async (c, body) => createSessionMutation(c, body),
-	"sandbox.command.resumeOrCreateSession": async (c, body) => resumeOrCreateSessionMutation(c, body),
-	"sandbox.command.destroySession": async (c, body) => destroySessionMutation(c, body?.sessionId),
-	"sandbox.command.createProcess": async (c, body) => createProcessMutation(c, body),
-	"sandbox.command.stopProcess": async (c, body) => stopProcessMutation(c, body?.processId, body?.query),
-	"sandbox.command.killProcess": async (c, body) => killProcessMutation(c, body?.processId, body?.query),
-	"sandbox.command.deleteProcess": async (c, body) => {
-		await deleteProcessMutation(c, body?.processId);
-		return { ok: true };
-	},
-};
-
-async function runSandboxWorkflow(ctx: any): Promise<void> {
-	await ctx.loop("sandbox-command-loop", async (loopCtx: any) => {
-		const msg = await loopCtx.queue.next("next-sandbox-command", {
-			names: [...SANDBOX_QUEUE_NAMES],
-			completable: true,
-		});
-
-		if (!msg) {
-			return Loop.continue(undefined);
-		}
-
-		const handler = SANDBOX_COMMAND_HANDLERS[msg.name as SandboxQueueName];
-		if (!handler) {
-			logActorWarning("taskSandbox", "unknown sandbox command", { command: msg.name });
-			await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
-			return Loop.continue(undefined);
-		}
-
-		try {
-			// Wrap in a step so c.state and c.db are accessible inside mutation functions.
-			const result = await loopCtx.step({
-				name: msg.name,
-				timeout: 10 * 60_000,
-				run: async () => handler(loopCtx, msg.body),
-			});
-			try {
-				await msg.complete(result);
-			} catch (completeError) {
-				logActorWarning("taskSandbox", "sandbox workflow failed completing response", {
-					command: msg.name,
-					error: resolveErrorMessage(completeError),
-				});
-			}
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("taskSandbox", "sandbox workflow command failed", {
-				command: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch(() => {});
-		}
-
-		return Loop.continue(undefined);
-	});
-}
-
-// ---------------------------------------------------------------------------
-// Actor definition
-// ---------------------------------------------------------------------------
-
-export const taskSandbox = actor({
-	...baseTaskSandbox.config,
-	queues: Object.fromEntries(SANDBOX_QUEUE_NAMES.map((name) => [name, queue()])),
-	options: {
-		...baseTaskSandbox.config.options,
-		actionTimeout: 10 * 60_000,
-	},
-	actions: {
-		...baseActions,
-
-		// Read actions — direct (no queue)
-		async resumeSession(c: any, sessionId: string): Promise<unknown> {
-			return sanitizeActorResult(await baseActions.resumeSession(c, sessionId));
-		},
-
-		async getSession(c: any, sessionId: string): Promise<unknown> {
-			return sanitizeActorResult(await baseActions.getSession(c, sessionId));
-		},
-
-		async listSessions(c: any, query?: any): Promise<unknown> {
-			return sanitizeActorResult(await baseActions.listSessions(c, query));
-		},
-
-		async listProcesses(c: any): Promise<any> {
-			try {
-				return await baseActions.listProcesses(c);
-			} catch (error) {
-				// Sandbox may be gone (E2B timeout, destroyed, etc.) — degrade to empty
-				logActorWarning("taskSandbox", "listProcesses failed, sandbox may be expired", {
-					sandboxId: c.state.sandboxId,
-					error: resolveErrorMessage(error),
-				});
-				return { processes: [] };
-			}
-		},
-
-		async sandboxAgentConnection(c: any): Promise<{ endpoint: string; token?: string }> {
-			const provider = await providerForConnection(c);
-			if (!provider || !c.state.sandboxId) {
-				return { endpoint: "mock://terminal-unavailable" };
-			}
-
-			try {
-				return {
-					endpoint: await provider.getUrl(c.state.sandboxId),
-				};
-			} catch {
-				return { endpoint: "mock://terminal-unavailable" };
-			}
-		},
-
-		async listWorkspaceModelGroups(c: any): Promise<WorkspaceModelGroup[]> {
-			return await listWorkspaceModelGroupsForSandbox(c);
-		},
-
-		async providerState(c: any): Promise<{ sandboxProviderId: "e2b" | "local"; sandboxId: string; state: string; at: number }> {
-			const { config } = getActorRuntimeContext();
-			const { taskId } = parseTaskSandboxKey(c.key);
-			const at = Date.now();
-			const sandboxProviderId = resolveSandboxProviderId(config, c.state.providerName === "e2b" ? "e2b" : c.state.providerName === "docker" ? "local" : null);
-
-			if (c.state.sandboxDestroyed) {
-				return { sandboxProviderId, sandboxId: taskId, state: "destroyed", at };
-			}
-
-			if (!c.state.sandboxId) {
-				return { sandboxProviderId, sandboxId: taskId, state: "pending", at };
-			}
-
-			try {
-				const health = await baseActions.getHealth(c);
-				return {
-					sandboxProviderId,
-					sandboxId: taskId,
-					state: health.status === "ok" ? "running" : "degraded",
-					at,
-				};
-			} catch {
-				return {
-					sandboxProviderId,
-					sandboxId: taskId,
-					state: "error",
-					at,
-				};
-			}
-		},
-
-		async repoCwd(c: any): Promise<{ cwd: string }> {
-			const resolved = await resolveRepoCwd(c);
-			return { cwd: resolved };
-		},
-
-		// Long-running action — kept as direct action to avoid blocking the
-		// workflow loop (prompt responses can take minutes).
-		async sendPrompt(c: any, request: { sessionId: string; prompt: string }): Promise<unknown> {
-			const text = typeof request?.prompt === "string" ? request.prompt.trim() : "";
-			if (!text) {
-				return null;
-			}
-
-			const session = await baseActions.resumeSession(c, request.sessionId);
-			if (!session || typeof session.prompt !== "function") {
-				throw new Error(`session '${request.sessionId}' not found`);
-			}
-
-			return sanitizeActorResult(await session.prompt([{ type: "text", text }]));
-		},
-
-		// Mutation actions — self-send to queue for workflow history
-		async createSession(c: any, request: any): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.createSession"), request ?? {}, { wait: true, timeout: 10_000 }));
-		},
-
-		async resumeOrCreateSession(c: any, request: any): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(
-				await self.send(sandboxWorkflowQueueName("sandbox.command.resumeOrCreateSession"), request ?? {}, { wait: true, timeout: 10_000 }),
-			);
-		},
-
-		async destroySession(c: any, sessionId: string): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.destroySession"), { sessionId }, { wait: true, timeout: 10_000 }));
-		},
-
-		async createProcess(c: any, request: any): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.createProcess"), request ?? {}, { wait: true, timeout: 10_000 }));
-		},
-
-		// runProcess kept as direct action — response can exceed 128KB queue limit
-		async runProcess(c: any, request: any): Promise<unknown> {
-			const result = await baseActions.runProcess(c, request);
-			await broadcastProcesses(c, baseActions);
-			return result;
-		},
-
-		async stopProcess(c: any, processId: string, query?: any): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(
-				await self.send(sandboxWorkflowQueueName("sandbox.command.stopProcess"), { processId, query }, { wait: true, timeout: 10_000 }),
-			);
-		},
-
-		async killProcess(c: any, processId: string, query?: any): Promise<unknown> {
-			const self = selfTaskSandbox(c);
-			return expectQueueResponse(
-				await self.send(sandboxWorkflowQueueName("sandbox.command.killProcess"), { processId, query }, { wait: true, timeout: 10_000 }),
-			);
-		},
-
-		async deleteProcess(c: any, processId: string): Promise<void> {
-			const self = selfTaskSandbox(c);
-			await self.send(sandboxWorkflowQueueName("sandbox.command.deleteProcess"), { processId }, { wait: false });
-		},
-	},
-	run: workflow(runSandboxWorkflow),
-});
-
-export { DEFAULT_SANDBOX_REPO_CWD, resolveRepoCwd };
diff --git a/foundry/packages/backend/src/actors/task-status-sync/index.ts b/foundry/packages/backend/src/actors/task-status-sync/index.ts
new file mode 100644
index 0000000..759cbe4
--- /dev/null
+++ b/foundry/packages/backend/src/actors/task-status-sync/index.ts
@@ -0,0 +1,110 @@
+import { actor, queue } from "rivetkit";
+import { workflow } from "rivetkit/workflow";
+import type { ProviderId } from "@sandbox-agent/foundry-shared";
+import { getTask, getSandboxInstance, selfTaskStatusSync } from "../handles.js";
+import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js";
+import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js";
+
+export interface TaskStatusSyncInput {
+	workspaceId: string;
+	repoId: string;
+	taskId: string;
+	providerId: ProviderId;
+	sandboxId: string;
+	sessionId: string;
+	intervalMs: number;
+}
+
+interface SetIntervalCommand {
+	intervalMs: number;
+}
+
+interface TaskStatusSyncState extends PollingControlState {
+	workspaceId: string;
+	repoId: string;
+	taskId: string;
+	providerId: ProviderId;
+	sandboxId: string;
+	sessionId: string;
+}
+
+const CONTROL = {
+	start: "task.status_sync.control.start",
+	stop: "task.status_sync.control.stop",
+	setInterval: "task.status_sync.control.set_interval",
+	force: "task.status_sync.control.force",
+} as const;
+
+async function pollSessionStatus(c: { state: TaskStatusSyncState }): Promise<void> {
+	const sandboxInstance = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, c.state.sandboxId);
+	const status = await sandboxInstance.sessionStatus({ sessionId: c.state.sessionId });
+
+	const parent = getTask(c, c.state.workspaceId, c.state.repoId, c.state.taskId);
+	await parent.syncWorkbenchSessionStatus({
+		sessionId: c.state.sessionId,
+		status: status.status,
+		at: Date.now(),
+	});
+}
+
+export const taskStatusSync = actor({
+	queues: {
+		[CONTROL.start]: queue(),
+		[CONTROL.stop]: queue(),
+		[CONTROL.setInterval]: queue(),
+		[CONTROL.force]: queue(),
+	},
+	options: {
+		name: "Task Status Sync",
+		icon: "signal",
+		// Polling actors rely on timer-based wakeups; sleeping would pause the timer and stop polling.
+		noSleep: true,
+	},
+	createState: (_c, input: TaskStatusSyncInput): TaskStatusSyncState => ({
+		workspaceId: input.workspaceId,
+		repoId: input.repoId,
+		taskId: input.taskId,
+		providerId: input.providerId,
+		sandboxId: input.sandboxId,
+		sessionId: input.sessionId,
+		intervalMs: input.intervalMs,
+		running: true,
+	}),
+	actions: {
+		async start(c): Promise<void> {
+			const self = selfTaskStatusSync(c);
+			await self.send(CONTROL.start, {}, { wait: true, timeout: 15_000 });
+		},
+
+		async stop(c): Promise<void> {
+			const self = selfTaskStatusSync(c);
+			await self.send(CONTROL.stop, {}, { wait: true, timeout: 15_000 });
+		},
+
+		async setIntervalMs(c, payload: SetIntervalCommand): Promise<void> {
+			const self = selfTaskStatusSync(c);
+			await self.send(CONTROL.setInterval, payload, { wait: true, timeout: 15_000 });
+		},
+
+		async force(c): Promise<void> {
+			const self = selfTaskStatusSync(c);
+			await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
+		},
+	},
+	run: workflow(async (ctx) => {
+		await runWorkflowPollingLoop(ctx, {
+			loopName: "task-status-sync-loop",
+			control: CONTROL,
+			onPoll: async (loopCtx) => {
+				try {
+					await pollSessionStatus(loopCtx);
+				} catch (error) {
+					logActorWarning("task-status-sync", "poll failed", {
+						error: resolveErrorMessage(error),
+						stack: resolveErrorStack(error),
+					});
+				}
+			},
+		});
+	}),
+});
diff --git a/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql b/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql
index c6a346a..b9ef95a 100644
--- a/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql
+++ b/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql
@@ -3,9 +3,10 @@ CREATE TABLE `task` (
 	`branch_name` text,
 	`title` text,
 	`task` text NOT NULL,
-	`sandbox_provider_id` text NOT NULL,
+	`provider_id` text NOT NULL,
 	`status` text NOT NULL,
-	`pull_request_json` text,
+	`agent_type` text DEFAULT 'claude',
+	`pr_submitted` integer DEFAULT 0,
 	`created_at` integer NOT NULL,
 	`updated_at` integer NOT NULL,
 	CONSTRAINT "task_singleton_id_check" CHECK("task"."id" = 1)
@@ -14,33 +15,33 @@ CREATE TABLE `task_runtime` (
 	`id` integer PRIMARY KEY NOT NULL,
 	`active_sandbox_id` text,
+	`active_session_id` text,
 	`active_switch_target` text,
 	`active_cwd` text,
-	`git_state_json` text,
-	`git_state_updated_at` integer,
+	`status_message` text,
 	`updated_at` integer NOT NULL,
 	CONSTRAINT "task_runtime_singleton_id_check" CHECK("task_runtime"."id" = 1)
 );
 --> statement-breakpoint
 CREATE TABLE `task_sandboxes` (
 	`sandbox_id` text PRIMARY KEY NOT NULL,
-	`sandbox_provider_id` text NOT NULL,
+	`provider_id` text NOT NULL,
 	`sandbox_actor_id` text,
 	`switch_target` text NOT NULL,
 	`cwd` text,
+	`status_message` text,
 	`created_at` integer NOT NULL,
 	`updated_at` integer NOT NULL
 );
 --> statement-breakpoint
-CREATE TABLE `task_workspace_sessions` (
+CREATE TABLE `task_workbench_sessions` (
 	`session_id` text PRIMARY KEY NOT NULL,
-	`sandbox_session_id` text,
 	`session_name` text NOT NULL,
 	`model` text NOT NULL,
-	`status` text DEFAULT 'ready' NOT NULL,
-	`error_message` text,
-	`transcript_json` text DEFAULT '[]' NOT NULL,
-	`transcript_updated_at` integer,
+	`unread` integer DEFAULT 0 NOT NULL,
+	`draft_text` text DEFAULT '' NOT NULL,
+	`draft_attachments_json` text DEFAULT '[]' NOT NULL,
+	`draft_updated_at` integer,
 	`created` integer DEFAULT 1 NOT NULL,
 	`closed` integer DEFAULT 0 NOT NULL,
 	`thinking_since_ms` integer,
diff --git a/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json
index 7397b89..b8a5879 100644
--- a/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json
+++ b/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json
@@ -35,8 +35,8 @@
           "notNull": true,
           "autoincrement": false
         },
-        "sandbox_provider_id": {
-          "name": "sandbox_provider_id",
+        "provider_id": {
+          "name": "provider_id",
           "type": "text",
           "primaryKey": false,
           "notNull": true,
@@ -49,12 +49,21 @@
           "notNull": true,
           "autoincrement": false
         },
-        "pull_request_json": {
-          "name": "pull_request_json",
+        "agent_type": {
+          "name": "agent_type",
           "type": "text",
           "primaryKey": false,
           "notNull": false,
-          "autoincrement": false
+          "autoincrement": false,
+          "default": "'claude'"
+        },
+        "pr_submitted": {
+          "name": "pr_submitted",
+          "type": "integer",
+          "primaryKey": false,
+          "notNull": false,
+          "autoincrement": false,
+          "default": 0
         },
         "created_at": {
           "name": "created_at",
@@ -99,6 +108,13 @@
           "notNull": false,
           "autoincrement": false
         },
+        "active_session_id": {
+          "name": "active_session_id",
+          "type": "text",
+          "primaryKey": false,
+          "notNull": false,
+          "autoincrement": false
+        },
         "active_switch_target": {
           "name": "active_switch_target",
           "type": "text",
@@ -113,20 +129,13 @@
           "notNull": false,
           "autoincrement": false
         },
-        "git_state_json": {
-          "name": "git_state_json",
+        "status_message": {
+          "name": "status_message",
           "type": "text",
           "primaryKey": false,
           "notNull": false,
           "autoincrement": false
         },
-        "git_state_updated_at": {
-          "name": "git_state_updated_at",
-          "type": "integer",
-          "primaryKey": false,
-          "notNull": false,
-          "autoincrement": false
-        },
         "updated_at": {
           "name": "updated_at",
           "type": "integer",
@@ -156,8 +165,8 @@
           "notNull": true,
           "autoincrement": false
         },
-        "sandbox_provider_id": {
-          "name": "sandbox_provider_id",
+        "provider_id": {
+          "name": "provider_id",
           "type": "text",
           "primaryKey": false,
           "notNull": true,
@@ -184,6 +193,13 @@
           "notNull": false,
           "autoincrement": false
         },
+        "status_message": {
+          "name": "status_message",
+          "type": "text",
+          "primaryKey": false,
+          "notNull": false,
+          "autoincrement": false
+        },
         "created_at": {
           "name": "created_at",
           "type": "integer",
@@ -205,8 +221,8 @@
       "uniqueConstraints": {},
       "checkConstraints": {}
     },
-    "task_workspace_sessions": {
-      "name": "task_workspace_sessions",
+    "task_workbench_sessions": {
+      "name": 
"task_workbench_sessions", "columns": { "session_id": { "name": "session_id", @@ -215,13 +231,6 @@ "notNull": true, "autoincrement": false }, - "sandbox_session_id": { - "name": "sandbox_session_id", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, "session_name": { "name": "session_name", "type": "text", @@ -236,31 +245,32 @@ "notNull": true, "autoincrement": false }, - "status": { - "name": "status", + "unread": { + "name": "unread", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false, + "default": 0 + }, + "draft_text": { + "name": "draft_text", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, - "default": "'ready'" + "default": "''" }, - "error_message": { - "name": "error_message", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, - "transcript_json": { - "name": "transcript_json", + "draft_attachments_json": { + "name": "draft_attachments_json", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'[]'" }, - "transcript_updated_at": { - "name": "transcript_updated_at", + "draft_updated_at": { + "name": "draft_updated_at", "type": "integer", "primaryKey": false, "notNull": false, diff --git a/foundry/packages/backend/src/actors/task/db/migrations.ts b/foundry/packages/backend/src/actors/task/db/migrations.ts index 61b0dff..4d4630b 100644 --- a/foundry/packages/backend/src/actors/task/db/migrations.ts +++ b/foundry/packages/backend/src/actors/task/db/migrations.ts @@ -10,12 +10,6 @@ const journal = { tag: "0000_charming_maestro", breakpoints: true, }, - { - idx: 1, - when: 1773984000000, - tag: "0001_add_task_owner", - breakpoints: true, - }, ], } as const; @@ -27,9 +21,10 @@ export default { \`branch_name\` text, \`title\` text, \`task\` text NOT NULL, - \`sandbox_provider_id\` text NOT NULL, + \`provider_id\` text NOT NULL, \`status\` text NOT NULL, - \`pull_request_json\` 
text, + \`agent_type\` text DEFAULT 'claude', + \`pr_submitted\` integer DEFAULT 0, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL, CONSTRAINT "task_singleton_id_check" CHECK("task"."id" = 1) @@ -38,49 +33,39 @@ export default { CREATE TABLE \`task_runtime\` ( \`id\` integer PRIMARY KEY NOT NULL, \`active_sandbox_id\` text, + \`active_session_id\` text, \`active_switch_target\` text, \`active_cwd\` text, - \`git_state_json\` text, - \`git_state_updated_at\` integer, + \`status_message\` text, \`updated_at\` integer NOT NULL, CONSTRAINT "task_runtime_singleton_id_check" CHECK("task_runtime"."id" = 1) ); --> statement-breakpoint CREATE TABLE \`task_sandboxes\` ( \`sandbox_id\` text PRIMARY KEY NOT NULL, - \`sandbox_provider_id\` text NOT NULL, + \`provider_id\` text NOT NULL, \`sandbox_actor_id\` text, \`switch_target\` text NOT NULL, \`cwd\` text, + \`status_message\` text, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); --> statement-breakpoint -CREATE TABLE \`task_workspace_sessions\` ( +CREATE TABLE \`task_workbench_sessions\` ( \`session_id\` text PRIMARY KEY NOT NULL, - \`sandbox_session_id\` text, \`session_name\` text NOT NULL, \`model\` text NOT NULL, - \`status\` text DEFAULT 'ready' NOT NULL, - \`error_message\` text, - \`transcript_json\` text DEFAULT '[]' NOT NULL, - \`transcript_updated_at\` integer, + \`unread\` integer DEFAULT 0 NOT NULL, + \`draft_text\` text DEFAULT '' NOT NULL, + \`draft_attachments_json\` text DEFAULT '[]' NOT NULL, + \`draft_updated_at\` integer, \`created\` integer DEFAULT 1 NOT NULL, \`closed\` integer DEFAULT 0 NOT NULL, \`thinking_since_ms\` integer, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); -`, - m0001: `CREATE TABLE \`task_owner\` ( - \`id\` integer PRIMARY KEY NOT NULL, - \`primary_user_id\` text, - \`primary_github_login\` text, - \`primary_github_email\` text, - \`primary_github_avatar_url\` text, - \`updated_at\` integer NOT NULL, - CONSTRAINT 
"task_owner_singleton_id_check" CHECK("task_owner"."id" = 1) -); `, } as const, }; diff --git a/foundry/packages/backend/src/actors/task/db/schema.ts b/foundry/packages/backend/src/actors/task/db/schema.ts index bdb7cf7..2b59f4b 100644 --- a/foundry/packages/backend/src/actors/task/db/schema.ts +++ b/foundry/packages/backend/src/actors/task/db/schema.ts @@ -9,9 +9,10 @@ export const task = sqliteTable( branchName: text("branch_name"), title: text("title"), task: text("task").notNull(), - sandboxProviderId: text("sandbox_provider_id").notNull(), + providerId: text("provider_id").notNull(), status: text("status").notNull(), - pullRequestJson: text("pull_request_json"), + agentType: text("agent_type").default("claude"), + prSubmitted: integer("pr_submitted").default(0), createdAt: integer("created_at").notNull(), updatedAt: integer("updated_at").notNull(), }, @@ -23,55 +24,31 @@ export const taskRuntime = sqliteTable( { id: integer("id").primaryKey(), activeSandboxId: text("active_sandbox_id"), + activeSessionId: text("active_session_id"), activeSwitchTarget: text("active_switch_target"), activeCwd: text("active_cwd"), + statusMessage: text("status_message"), gitStateJson: text("git_state_json"), gitStateUpdatedAt: integer("git_state_updated_at"), + provisionStage: text("provision_stage"), + provisionStageUpdatedAt: integer("provision_stage_updated_at"), updatedAt: integer("updated_at").notNull(), }, (table) => [check("task_runtime_singleton_id_check", sql`${table.id} = 1`)], ); -/** - * Coordinator index of SandboxInstanceActor instances. - * Tracks all sandbox instances provisioned for this task. Only one - * is active at a time (referenced by taskRuntime.activeSandboxId). 
- */ export const taskSandboxes = sqliteTable("task_sandboxes", { sandboxId: text("sandbox_id").notNull().primaryKey(), - sandboxProviderId: text("sandbox_provider_id").notNull(), + providerId: text("provider_id").notNull(), sandboxActorId: text("sandbox_actor_id"), switchTarget: text("switch_target").notNull(), cwd: text("cwd"), + statusMessage: text("status_message"), createdAt: integer("created_at").notNull(), updatedAt: integer("updated_at").notNull(), }); -/** - * Single-row table tracking the primary user (owner) of this task. - * The owner's GitHub OAuth credentials are injected into the sandbox - * for git operations. Updated when a different user sends a message. - */ -export const taskOwner = sqliteTable( - "task_owner", - { - id: integer("id").primaryKey(), - primaryUserId: text("primary_user_id"), - primaryGithubLogin: text("primary_github_login"), - primaryGithubEmail: text("primary_github_email"), - primaryGithubAvatarUrl: text("primary_github_avatar_url"), - updatedAt: integer("updated_at").notNull(), - }, - (table) => [check("task_owner_singleton_id_check", sql`${table.id} = 1`)], -); - -/** - * Coordinator index of workspace sessions within this task. - * The task actor is the coordinator for sessions. Each row holds session - * metadata, model, status, transcript, and draft state. Sessions are - * sub-entities of the task — no separate session actor in the DB. 
- */ -export const taskWorkspaceSessions = sqliteTable("task_workspace_sessions", { +export const taskWorkbenchSessions = sqliteTable("task_workbench_sessions", { sessionId: text("session_id").notNull().primaryKey(), sandboxSessionId: text("sandbox_session_id"), sessionName: text("session_name").notNull(), @@ -80,6 +57,11 @@ export const taskWorkspaceSessions = sqliteTable("task_workspace_sessions", { errorMessage: text("error_message"), transcriptJson: text("transcript_json").notNull().default("[]"), transcriptUpdatedAt: integer("transcript_updated_at"), + unread: integer("unread").notNull().default(0), + draftText: text("draft_text").notNull().default(""), + // Structured by the workbench composer attachment payload format. + draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"), + draftUpdatedAt: integer("draft_updated_at"), created: integer("created").notNull().default(1), closed: integer("closed").notNull().default(0), thinkingSinceMs: integer("thinking_since_ms"), diff --git a/foundry/packages/backend/src/actors/task/index.ts b/foundry/packages/backend/src/actors/task/index.ts index 68bee1c..8d9f418 100644 --- a/foundry/packages/backend/src/actors/task/index.ts +++ b/foundry/packages/backend/src/actors/task/index.ts @@ -1,31 +1,112 @@ import { actor, queue } from "rivetkit"; import { workflow } from "rivetkit/workflow"; -import type { TaskRecord } from "@sandbox-agent/foundry-shared"; +import type { + AgentType, + TaskRecord, + TaskWorkbenchChangeModelInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchUpdateDraftInput, + ProviderId, +} from "@sandbox-agent/foundry-shared"; +import { expectQueueResponse } from "../../services/queue.js"; +import { selfTask } from "../handles.js"; import { taskDb } from "./db/db.js"; import { getCurrentRecord } from "./workflow/common.js"; import { - changeWorkspaceModel, + changeWorkbenchModel, + 
closeWorkbenchSession, + createWorkbenchSession, getSessionDetail, getTaskDetail, getTaskSummary, - markWorkspaceUnread, - refreshWorkspaceDerivedState, - refreshWorkspaceSessionTranscript, - renameWorkspaceSession, - renameWorkspaceTask, - selectWorkspaceSession, - setWorkspaceSessionUnread, - syncTaskPullRequest, - syncWorkspaceSessionStatus, - updateWorkspaceDraft, -} from "./workspace.js"; -import { runTaskWorkflow } from "./workflow/index.js"; -import { TASK_QUEUE_NAMES } from "./workflow/queue.js"; + markWorkbenchUnread, + publishWorkbenchPr, + renameWorkbenchBranch, + renameWorkbenchTask, + renameWorkbenchSession, + revertWorkbenchFile, + sendWorkbenchMessage, + syncWorkbenchSessionStatus, + setWorkbenchSessionUnread, + stopWorkbenchSession, + updateWorkbenchDraft, +} from "./workbench.js"; +import { TASK_QUEUE_NAMES, taskWorkflowQueueName, runTaskWorkflow } from "./workflow/index.js"; export interface TaskInput { - organizationId: string; + workspaceId: string; repoId: string; taskId: string; + repoRemote: string; + repoLocalPath: string; + branchName: string | null; + title: string | null; + task: string; + providerId: ProviderId; + agentType: AgentType | null; + explicitTitle: string | null; + explicitBranchName: string | null; + initialPrompt: string | null; +} + +interface InitializeCommand { + providerId?: ProviderId; +} + +interface TaskActionCommand { + reason?: string; +} + +interface TaskTabCommand { + tabId: string; +} + +interface TaskStatusSyncCommand { + sessionId: string; + status: "running" | "idle" | "error"; + at: number; +} + +interface TaskWorkbenchValueCommand { + value: string; +} + +interface TaskWorkbenchSessionTitleCommand { + sessionId: string; + title: string; +} + +interface TaskWorkbenchSessionUnreadCommand { + sessionId: string; + unread: boolean; +} + +interface TaskWorkbenchUpdateDraftCommand { + sessionId: string; + text: string; + attachments: Array<unknown>; +} + +interface TaskWorkbenchChangeModelCommand { + sessionId: string; +
model: string; +} + +interface TaskWorkbenchSendMessageCommand { + sessionId: string; + text: string; + attachments: Array<unknown>; +} + +interface TaskWorkbenchCreateSessionCommand { + model?: string; +} + +interface TaskWorkbenchSessionCommand { + sessionId: string; } export const task = actor({ @@ -34,66 +115,264 @@ export const task = actor({ options: { name: "Task", icon: "wrench", - actionTimeout: 10 * 60_000, + actionTimeout: 5 * 60_000, }, createState: (_c, input: TaskInput) => ({ - organizationId: input.organizationId, + workspaceId: input.workspaceId, repoId: input.repoId, taskId: input.taskId, + repoRemote: input.repoRemote, + repoLocalPath: input.repoLocalPath, + branchName: input.branchName, + title: input.title, + task: input.task, + providerId: input.providerId, + agentType: input.agentType, + explicitTitle: input.explicitTitle, + explicitBranchName: input.explicitBranchName, + initialPrompt: input.initialPrompt, + initialized: false, + previousStatus: null as string | null, }), actions: { + async initialize(c, cmd: InitializeCommand): Promise<TaskRecord> { + const self = selfTask(c); + const result = await self.send(taskWorkflowQueueName("task.command.initialize"), cmd ?? {}, { + wait: true, + timeout: 60_000, + }); + return expectQueueResponse<TaskRecord>(result); + }, + + async provision(c, cmd: InitializeCommand): Promise<{ ok: true }> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.provision"), cmd ?? {}, { + wait: false, + }); + return { ok: true }; + }, + + async attach(c, cmd?: TaskActionCommand): Promise<{ target: string; sessionId: string | null }> { + const self = selfTask(c); + const result = await self.send(taskWorkflowQueueName("task.command.attach"), cmd ??
{}, { + wait: true, + timeout: 20_000, + }); + return expectQueueResponse<{ target: string; sessionId: string | null }>(result); + }, + + async switch(c): Promise<{ switchTarget: string }> { + const self = selfTask(c); + const result = await self.send( + taskWorkflowQueueName("task.command.switch"), + {}, + { + wait: true, + timeout: 20_000, + }, + ); + return expectQueueResponse<{ switchTarget: string }>(result); + }, + + async push(c, cmd?: TaskActionCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.push"), cmd ?? {}, { + wait: false, + }); + }, + + async sync(c, cmd?: TaskActionCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.sync"), cmd ?? {}, { + wait: false, + }); + }, + + async merge(c, cmd?: TaskActionCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.merge"), cmd ?? {}, { + wait: false, + }); + }, + + async archive(c, cmd?: TaskActionCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.archive"), cmd ?? {}, { + wait: false, + }); + }, + + async kill(c, cmd?: TaskActionCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.kill"), cmd ??
{}, { + wait: false, + }); + }, + async get(c): Promise<TaskRecord> { - return await getCurrentRecord(c); + return await getCurrentRecord({ db: c.db, state: c.state }); }, async getTaskSummary(c) { return await getTaskSummary(c); }, - async getTaskDetail(c, input?: { authSessionId?: string }) { - return await getTaskDetail(c, input?.authSessionId); + async getTaskDetail(c) { + return await getTaskDetail(c); }, - async getSessionDetail(c, input: { sessionId: string; authSessionId?: string }) { - return await getSessionDetail(c, input.sessionId, input.authSessionId); + async getSessionDetail(c, input: { sessionId: string }) { + return await getSessionDetail(c, input.sessionId); }, - // Direct actions migrated from queue: - async markUnread(c, input: { authSessionId?: string }) { - await markWorkspaceUnread(c, input?.authSessionId); + async markWorkbenchUnread(c): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.mark_unread"), + {}, + { + wait: true, + timeout: 20_000, + }, + ); }, - async renameTask(c, input: { value: string }) { - await renameWorkspaceTask(c, input.value); + + async renameWorkbenchTask(c, input: TaskWorkbenchRenameInput): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.rename_task"), { value: input.value } satisfies TaskWorkbenchValueCommand, { + wait: true, + timeout: 20_000, + }); }, - async renameSession(c, input: { sessionId: string; title: string }) { - await renameWorkspaceSession(c, input.sessionId, input.title); + + async renameWorkbenchBranch(c, input: TaskWorkbenchRenameInput): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.rename_branch"), { value: input.value } satisfies TaskWorkbenchValueCommand, { + wait: false, + }); }, - async selectSession(c, input: { sessionId: string; authSessionId?: string }) { - await selectWorkspaceSession(c, input.sessionId, input?.authSessionId); + + async
createWorkbenchSession(c, input?: { model?: string }): Promise<{ tabId: string }> { + const self = selfTask(c); + const result = await self.send( + taskWorkflowQueueName("task.command.workbench.create_session"), + { ...(input?.model ? { model: input.model } : {}) } satisfies TaskWorkbenchCreateSessionCommand, + { + wait: true, + timeout: 5 * 60_000, + }, + ); + return expectQueueResponse<{ tabId: string }>(result); }, - async setSessionUnread(c, input: { sessionId: string; unread: boolean; authSessionId?: string }) { - await setWorkspaceSessionUnread(c, input.sessionId, input.unread, input?.authSessionId); + + async renameWorkbenchSession(c, input: TaskWorkbenchRenameSessionInput): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.rename_session"), + { sessionId: input.tabId, title: input.title } satisfies TaskWorkbenchSessionTitleCommand, + { + wait: true, + timeout: 20_000, + }, + ); }, - async updateDraft(c, input: { sessionId: string; text: string; attachments: any[]; authSessionId?: string }) { - await updateWorkspaceDraft(c, input.sessionId, input.text, input.attachments, input?.authSessionId); + + async setWorkbenchSessionUnread(c, input: TaskWorkbenchSetSessionUnreadInput): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.set_session_unread"), + { sessionId: input.tabId, unread: input.unread } satisfies TaskWorkbenchSessionUnreadCommand, + { + wait: true, + timeout: 20_000, + }, + ); }, - async changeModel(c, input: { sessionId: string; model: string; authSessionId?: string }) { - await changeWorkspaceModel(c, input.sessionId, input.model, input?.authSessionId); + + async updateWorkbenchDraft(c, input: TaskWorkbenchUpdateDraftInput): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.update_draft"), + { + sessionId: input.tabId, + text: input.text, + attachments: input.attachments, + } satisfies
TaskWorkbenchUpdateDraftCommand, + { + wait: true, + timeout: 20_000, + }, + ); }, - async refreshSessionTranscript(c, input: { sessionId: string }) { - await refreshWorkspaceSessionTranscript(c, input.sessionId); + + async changeWorkbenchModel(c, input: TaskWorkbenchChangeModelInput): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.change_model"), + { sessionId: input.tabId, model: input.model } satisfies TaskWorkbenchChangeModelCommand, + { + wait: true, + timeout: 20_000, + }, + ); }, - async refreshDerived(c) { - await refreshWorkspaceDerivedState(c); + + async sendWorkbenchMessage(c, input: TaskWorkbenchSendMessageInput): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.send_message"), + { + sessionId: input.tabId, + text: input.text, + attachments: input.attachments, + } satisfies TaskWorkbenchSendMessageCommand, + { + wait: false, + }, + ); }, - async syncSessionStatus(c, input: { sessionId: string; status: "running" | "idle" | "error"; at: number }) { - await syncWorkspaceSessionStatus(c, input.sessionId, input.status, input.at); + + async stopWorkbenchSession(c, input: TaskTabCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.stop_session"), { sessionId: input.tabId } satisfies TaskWorkbenchSessionCommand, { + wait: false, + }); }, - async syncPullRequest(c, input: { pullRequest: any }) { - await syncTaskPullRequest(c, input?.pullRequest ??
null); + + async syncWorkbenchSessionStatus(c, input: TaskStatusSyncCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.sync_session_status"), input, { + wait: true, + timeout: 20_000, + }); + }, + + async closeWorkbenchSession(c, input: TaskTabCommand): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.close_session"), { sessionId: input.tabId } satisfies TaskWorkbenchSessionCommand, { + wait: false, + }); + }, + + async publishWorkbenchPr(c): Promise<void> { + const self = selfTask(c); + await self.send( + taskWorkflowQueueName("task.command.workbench.publish_pr"), + {}, + { + wait: false, + }, + ); + }, + + async revertWorkbenchFile(c, input: { path: string }): Promise<void> { + const self = selfTask(c); + await self.send(taskWorkflowQueueName("task.command.workbench.revert_file"), input, { + wait: false, + }); }, }, run: workflow(runTaskWorkflow), }); -export { taskWorkflowQueueName } from "./workflow/index.js"; +export { TASK_QUEUE_NAMES }; diff --git a/foundry/packages/backend/src/actors/task/workbench.ts b/foundry/packages/backend/src/actors/task/workbench.ts new file mode 100644 index 0000000..0d00e77 --- /dev/null +++ b/foundry/packages/backend/src/actors/task/workbench.ts @@ -0,0 +1,1256 @@ +// @ts-nocheck +import { randomUUID } from "node:crypto"; +import { basename } from "node:path"; +import { asc, eq } from "drizzle-orm"; +import { getActorRuntimeContext } from "../context.js"; +import { getOrCreateTaskStatusSync, getOrCreateProject, getOrCreateWorkspace, getSandboxInstance, selfTask } from "../handles.js"; +import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js"; +import { task as taskTable, taskRuntime, taskWorkbenchSessions } from "./db/schema.js"; +import { getCurrentRecord } from "./workflow/common.js"; +import { taskWorkflowQueueName } from "./workflow/queue.js"; + +const STATUS_SYNC_INTERVAL_MS = 1_000; + +function
emptyGitState() { + return { + fileChanges: [], + diffs: {}, + fileTree: [], + updatedAt: null as number | null, + }; +} + +async function ensureWorkbenchSessionTable(c: any): Promise<void> { + await c.db.execute(` + CREATE TABLE IF NOT EXISTS task_workbench_sessions ( + session_id text PRIMARY KEY NOT NULL, + sandbox_session_id text, + session_name text NOT NULL, + model text NOT NULL, + status text DEFAULT 'ready' NOT NULL, + error_message text, + transcript_json text DEFAULT '[]' NOT NULL, + transcript_updated_at integer, + unread integer DEFAULT 0 NOT NULL, + draft_text text DEFAULT '' NOT NULL, + draft_attachments_json text DEFAULT '[]' NOT NULL, + draft_updated_at integer, + created integer DEFAULT 1 NOT NULL, + closed integer DEFAULT 0 NOT NULL, + thinking_since_ms integer, + created_at integer NOT NULL, + updated_at integer NOT NULL + ) + `); + await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN sandbox_session_id text`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN status text DEFAULT 'ready' NOT NULL`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN error_message text`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN transcript_json text DEFAULT '[]' NOT NULL`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN transcript_updated_at integer`).catch(() => {}); +} + +async function ensureTaskRuntimeCacheColumns(c: any): Promise<void> { + await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_json text`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_updated_at integer`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage text`).catch(() => {}); + await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage_updated_at integer`).catch(() => {}); +} + +function defaultModelForAgent(agentType: string | null |
undefined) { + return agentType === "codex" ? "gpt-4o" : "claude-sonnet-4"; +} + +function agentKindForModel(model: string) { + if (model === "gpt-4o" || model === "o3") { + return "Codex"; + } + return "Claude"; +} + +export function agentTypeForModel(model: string) { + if (model === "gpt-4o" || model === "o3") { + return "codex"; + } + return "claude"; +} + +function repoLabelFromRemote(remoteUrl: string): string { + const trimmed = remoteUrl.trim(); + try { + const url = new URL(trimmed.startsWith("http") ? trimmed : `https://${trimmed}`); + const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); + if (parts.length >= 2) { + return `${parts[0]}/${(parts[1] ?? "").replace(/\.git$/, "")}`; + } + } catch { + // ignore + } + + return basename(trimmed.replace(/\.git$/, "")); +} + +function parseDraftAttachments(value: string | null | undefined): Array<unknown> { + if (!value) { + return []; + } + + try { + const parsed = JSON.parse(value) as unknown; + return Array.isArray(parsed) ? parsed : []; + } catch { + return []; + } +} + +function parseTranscript(value: string | null | undefined): Array<unknown> { + if (!value) { + return []; + } + + try { + const parsed = JSON.parse(value) as unknown; + return Array.isArray(parsed) ? parsed : []; + } catch { + return []; + } +} + +function parseGitState(value: string | null | undefined): { fileChanges: Array<unknown>; diffs: Record<string, unknown>; fileTree: Array<unknown> } { + if (!value) { + return emptyGitState(); + } + + try { + const parsed = JSON.parse(value) as { + fileChanges?: unknown; + diffs?: unknown; + fileTree?: unknown; + }; + return { + fileChanges: Array.isArray(parsed.fileChanges) ? parsed.fileChanges : [], + diffs: parsed.diffs && typeof parsed.diffs === "object" ? (parsed.diffs as Record<string, unknown>) : {}, + fileTree: Array.isArray(parsed.fileTree) ?
parsed.fileTree : [], + }; + } catch { + return emptyGitState(); + } +} + +export function shouldMarkSessionUnreadForStatus(meta: { thinkingSinceMs?: number | null }, status: "running" | "idle" | "error"): boolean { + if (status === "running") { + return false; + } + + // Only mark unread when we observe the transition out of an active thinking state. + // Repeated idle polls for an already-finished session must not flip unread back on. + return Boolean(meta.thinkingSinceMs); +} + +async function listSessionMetaRows(c: any, options?: { includeClosed?: boolean }): Promise<Array<any>> { + await ensureWorkbenchSessionTable(c); + const rows = await c.db.select().from(taskWorkbenchSessions).orderBy(asc(taskWorkbenchSessions.createdAt)).all(); + const mapped = rows.map((row: any) => ({ + ...row, + id: row.sessionId, + sessionId: row.sandboxSessionId ?? null, + tabId: row.sessionId, + sandboxSessionId: row.sandboxSessionId ?? null, + status: row.status ?? "ready", + errorMessage: row.errorMessage ?? null, + transcript: parseTranscript(row.transcriptJson), + transcriptUpdatedAt: row.transcriptUpdatedAt ?? null, + draftAttachments: parseDraftAttachments(row.draftAttachmentsJson), + draftUpdatedAtMs: row.draftUpdatedAt ?? null, + unread: row.unread === 1, + created: row.created === 1, + closed: row.closed === 1, + })); + + if (options?.includeClosed === true) { + return mapped; + } + + return mapped.filter((row: any) => row.closed !== true); +} + +async function nextSessionName(c: any): Promise<string> { + const rows = await listSessionMetaRows(c, { includeClosed: true }); + return `Session ${rows.length + 1}`; +} + +async function readSessionMeta(c: any, sessionId: string): Promise<any> { + await ensureWorkbenchSessionTable(c); + const row = await c.db.select().from(taskWorkbenchSessions).where(eq(taskWorkbenchSessions.sessionId, sessionId)).get(); + + if (!row) { + return null; + } + + return { + ...row, + id: row.sessionId, + sessionId: row.sandboxSessionId ??
null, + tabId: row.sessionId, + sandboxSessionId: row.sandboxSessionId ?? null, + status: row.status ?? "ready", + errorMessage: row.errorMessage ?? null, + transcript: parseTranscript(row.transcriptJson), + transcriptUpdatedAt: row.transcriptUpdatedAt ?? null, + draftAttachments: parseDraftAttachments(row.draftAttachmentsJson), + draftUpdatedAtMs: row.draftUpdatedAt ?? null, + unread: row.unread === 1, + created: row.created === 1, + closed: row.closed === 1, + }; +} + +async function ensureSessionMeta( + c: any, + params: { + tabId: string; + sandboxSessionId?: string | null; + model?: string; + sessionName?: string; + unread?: boolean; + created?: boolean; + status?: "pending_provision" | "pending_session_create" | "ready" | "error"; + errorMessage?: string | null; + }, +): Promise<any> { + await ensureWorkbenchSessionTable(c); + const existing = await readSessionMeta(c, params.tabId); + if (existing) { + return existing; + } + + const now = Date.now(); + const sessionName = params.sessionName ?? (await nextSessionName(c)); + const model = params.model ?? defaultModelForAgent(c.state.agentType); + const unread = params.unread ?? false; + + await c.db + .insert(taskWorkbenchSessions) + .values({ + sessionId: params.tabId, + sandboxSessionId: params.sandboxSessionId ?? null, + sessionName, + model, + status: params.status ?? "ready", + errorMessage: params.errorMessage ?? null, + transcriptJson: "[]", + transcriptUpdatedAt: null, + unread: unread ? 1 : 0, + draftText: "", + draftAttachmentsJson: "[]", + draftUpdatedAt: null, + created: params.created === false ?
0 : 1, + closed: 0, + thinkingSinceMs: null, + createdAt: now, + updatedAt: now, + }) + .run(); + + return await readSessionMeta(c, params.tabId); +} + +async function updateSessionMeta(c: any, tabId: string, values: Record): Promise { + await ensureSessionMeta(c, { tabId }); + await c.db + .update(taskWorkbenchSessions) + .set({ + ...values, + updatedAt: Date.now(), + }) + .where(eq(taskWorkbenchSessions.sessionId, tabId)) + .run(); + return await readSessionMeta(c, tabId); +} + +async function readSessionMetaBySandboxSessionId(c: any, sandboxSessionId: string): Promise { + await ensureWorkbenchSessionTable(c); + const row = await c.db.select().from(taskWorkbenchSessions).where(eq(taskWorkbenchSessions.sandboxSessionId, sandboxSessionId)).get(); + if (!row) { + return null; + } + return await readSessionMeta(c, row.sessionId); +} + +async function requireReadySessionMeta(c: any, tabId: string): Promise { + const meta = await readSessionMeta(c, tabId); + if (!meta) { + throw new Error(`Unknown workbench tab: ${tabId}`); + } + if (meta.status !== "ready" || !meta.sandboxSessionId) { + throw new Error(meta.errorMessage ?? 
"This workbench tab is still preparing"); + } + return meta; +} + +function shellFragment(parts: string[]): string { + return parts.join(" && "); +} + +async function executeInSandbox( + c: any, + params: { + sandboxId: string; + cwd: string; + command: string; + label: string; + }, +): Promise<{ exitCode: number; result: string }> { + const { providers } = getActorRuntimeContext(); + const provider = providers.get(c.state.providerId); + return await provider.executeCommand({ + workspaceId: c.state.workspaceId, + sandboxId: params.sandboxId, + command: `bash -lc ${JSON.stringify(shellFragment([`cd ${JSON.stringify(params.cwd)}`, params.command]))}`, + label: params.label, + }); +} + +function parseGitStatus(output: string): Array<{ path: string; type: "M" | "A" | "D" }> { + return output + .split("\n") + .map((line) => line.trimEnd()) + .filter(Boolean) + .map((line) => { + const status = line.slice(0, 2).trim(); + const rawPath = line.slice(3).trim(); + const path = rawPath.includes(" -> ") ? (rawPath.split(" -> ").pop() ?? rawPath) : rawPath; + const type = status.includes("D") ? "D" : status.includes("A") || status === "??" ? "A" : "M"; + return { path, type }; + }); +} + +function parseNumstat(output: string): Map { + const map = new Map(); + for (const line of output.split("\n")) { + const trimmed = line.trim(); + if (!trimmed) continue; + const [addedRaw, removedRaw, ...pathParts] = trimmed.split("\t"); + const path = pathParts.join("\t").trim(); + if (!path) continue; + map.set(path, { + added: Number.parseInt(addedRaw ?? "0", 10) || 0, + removed: Number.parseInt(removedRaw ?? "0", 10) || 0, + }); + } + return map; +} + +function buildFileTree(paths: string[]): Array { + const root = { + children: new Map(), + }; + + for (const path of paths) { + const parts = path.split("/").filter(Boolean); + let current = root; + let currentPath = ""; + + for (let index = 0; index < parts.length; index += 1) { + const part = parts[index]!; + currentPath = currentPath ? 
`${currentPath}/${part}` : part; + const isDir = index < parts.length - 1; + let node = current.children.get(part); + if (!node) { + node = { + name: part, + path: currentPath, + isDir, + children: isDir ? new Map() : undefined, + }; + current.children.set(part, node); + } else if (isDir && !(node.children instanceof Map)) { + node.children = new Map(); + } + current = node; + } + } + + function sortNodes(nodes: Iterable): Array { + return [...nodes] + .map((node) => + node.isDir + ? { + name: node.name, + path: node.path, + isDir: true, + children: sortNodes(node.children?.values?.() ?? []), + } + : { + name: node.name, + path: node.path, + isDir: false, + }, + ) + .sort((left, right) => { + if (left.isDir !== right.isDir) { + return left.isDir ? -1 : 1; + } + return left.path.localeCompare(right.path); + }); + } + + return sortNodes(root.children.values()); +} + +async function collectWorkbenchGitState(c: any, record: any) { + const activeSandboxId = record.activeSandboxId; + const activeSandbox = activeSandboxId != null ? ((record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === activeSandboxId) ?? null) : null; + const cwd = activeSandbox?.cwd ?? record.sandboxes?.[0]?.cwd ?? 
null; + if (!activeSandboxId || !cwd) { + return { + fileChanges: [], + diffs: {}, + fileTree: [], + }; + } + + const statusResult = await executeInSandbox(c, { + sandboxId: activeSandboxId, + cwd, + command: "git status --porcelain=v1 -uall", + label: "git status", + }); + if (statusResult.exitCode !== 0) { + return { + fileChanges: [], + diffs: {}, + fileTree: [], + }; + } + + const statusRows = parseGitStatus(statusResult.result); + const numstatResult = await executeInSandbox(c, { + sandboxId: activeSandboxId, + cwd, + command: "git diff --numstat", + label: "git diff numstat", + }); + const numstat = parseNumstat(numstatResult.result); + + const filesResult = await executeInSandbox(c, { + sandboxId: activeSandboxId, + cwd, + command: "git ls-files --cached --others --exclude-standard", + label: "git ls-files", + }); + const allPaths = filesResult.result + .split("\n") + .map((line) => line.trim()) + .filter(Boolean); + + const diffs: Record = {}; + for (const row of statusRows) { + const diffResult = await executeInSandbox(c, { + sandboxId: activeSandboxId, + cwd, + command: `git diff -- ${JSON.stringify(row.path)}`, + label: `git diff ${row.path}`, + }); + diffs[row.path] = diffResult.exitCode === 0 ? diffResult.result : ""; + } + + return { + fileChanges: statusRows.map((row) => { + const counts = numstat.get(row.path) ?? 
{ added: 0, removed: 0 }; + return { + path: row.path, + added: counts.added, + removed: counts.removed, + type: row.type, + }; + }), + diffs, + fileTree: buildFileTree(allPaths), + }; +} + +async function readCachedGitState(c: any): Promise<{ fileChanges: Array; diffs: Record; fileTree: Array; updatedAt: number | null }> { + await ensureTaskRuntimeCacheColumns(c); + const row = await c.db + .select({ + gitStateJson: taskRuntime.gitStateJson, + gitStateUpdatedAt: taskRuntime.gitStateUpdatedAt, + }) + .from(taskRuntime) + .where(eq(taskRuntime.id, 1)) + .get(); + const parsed = parseGitState(row?.gitStateJson); + return { + ...parsed, + updatedAt: row?.gitStateUpdatedAt ?? null, + }; +} + +async function writeCachedGitState(c: any, gitState: { fileChanges: Array; diffs: Record; fileTree: Array }): Promise { + await ensureTaskRuntimeCacheColumns(c); + const now = Date.now(); + await c.db + .update(taskRuntime) + .set({ + gitStateJson: JSON.stringify(gitState), + gitStateUpdatedAt: now, + updatedAt: now, + }) + .where(eq(taskRuntime.id, 1)) + .run(); +} + +async function readSessionTranscript(c: any, record: any, sessionId: string) { + const sandboxId = record.activeSandboxId ?? record.sandboxes?.[0]?.sandboxId ?? 
null; + if (!sandboxId) { + return []; + } + + const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, sandboxId); + const page = await sandbox.listSessionEvents({ + sessionId, + limit: 100, + }); + return page.items.map((event: any) => ({ + id: event.id, + eventIndex: event.eventIndex, + sessionId: event.sessionId, + createdAt: event.createdAt, + connectionId: event.connectionId, + sender: event.sender, + payload: event.payload, + })); +} + +async function writeSessionTranscript(c: any, tabId: string, transcript: Array): Promise { + await updateSessionMeta(c, tabId, { + transcriptJson: JSON.stringify(transcript), + transcriptUpdatedAt: Date.now(), + }); +} + +async function enqueueWorkbenchRefresh( + c: any, + command: "task.command.workbench.refresh_derived" | "task.command.workbench.refresh_session_transcript", + body: Record, +): Promise { + const self = selfTask(c); + await self.send(command, body, { wait: false }); +} + +async function maybeScheduleWorkbenchRefreshes(c: any, record: any, sessions: Array): Promise { + const gitState = await readCachedGitState(c); + if (record.activeSandboxId && !gitState.updatedAt) { + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {}); + } + + for (const session of sessions) { + if (session.closed || session.status !== "ready" || !session.sandboxSessionId || session.transcriptUpdatedAt) { + continue; + } + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId: session.sandboxSessionId, + }); + } +} + +function activeSessionStatus(record: any, sessionId: string) { + if (record.activeSessionId !== sessionId) { + return "idle"; + } + + if (record.status === "running") { + return "running"; + } + if (record.status === "error") { + return "error"; + } + return "idle"; +} + +async function readPullRequestSummary(c: any, branchName: string | null) { + if (!branchName) { + return null; + } + + try { + const project = await 
getOrCreateProject(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote); + return await project.getPullRequestForBranch({ branchName }); + } catch { + return null; + } +} + +export async function ensureWorkbenchSeeded(c: any): Promise { + await ensureTaskRuntimeCacheColumns(c); + const record = await getCurrentRecord({ db: c.db, state: c.state }); + if (record.activeSessionId) { + await ensureSessionMeta(c, { + tabId: record.activeSessionId, + sandboxSessionId: record.activeSessionId, + model: defaultModelForAgent(record.agentType), + sessionName: "Session 1", + status: "ready", + }); + } + return record; +} + +function buildSessionSummary(record: any, meta: any): any { + const derivedSandboxSessionId = meta.sandboxSessionId ?? (meta.status === "pending_provision" && record.activeSessionId ? record.activeSessionId : null); + const sessionStatus = + meta.status === "ready" && derivedSandboxSessionId ? activeSessionStatus(record, derivedSandboxSessionId) : meta.status === "error" ? "error" : "idle"; + let thinkingSinceMs = meta.thinkingSinceMs ?? null; + let unread = Boolean(meta.unread); + if (thinkingSinceMs && sessionStatus !== "running") { + thinkingSinceMs = null; + unread = true; + } + + return { + id: meta.id, + sessionId: derivedSandboxSessionId, + sessionName: meta.sessionName, + agent: agentKindForModel(meta.model), + model: meta.model, + status: sessionStatus, + thinkingSinceMs: sessionStatus === "running" ? 
thinkingSinceMs : null, + unread, + created: Boolean(meta.created || derivedSandboxSessionId), + }; +} + +function buildSessionDetailFromMeta(record: any, meta: any): any { + const summary = buildSessionSummary(record, meta); + return { + sessionId: meta.tabId, + tabId: meta.tabId, + sandboxSessionId: summary.sessionId, + sessionName: summary.sessionName, + agent: summary.agent, + model: summary.model, + status: summary.status, + thinkingSinceMs: summary.thinkingSinceMs, + unread: summary.unread, + created: summary.created, + draft: { + text: meta.draftText ?? "", + attachments: Array.isArray(meta.draftAttachments) ? meta.draftAttachments : [], + updatedAtMs: meta.draftUpdatedAtMs ?? null, + }, + transcript: meta.transcript ?? [], + }; +} + +/** + * Builds a WorkbenchTaskSummary from local task actor state. Task actors push + * this to the parent workspace actor so workspace sidebar reads stay local. + */ +export async function buildTaskSummary(c: any): Promise { + const record = await ensureWorkbenchSeeded(c); + const sessions = await listSessionMetaRows(c); + await maybeScheduleWorkbenchRefreshes(c, record, sessions); + + return { + id: c.state.taskId, + repoId: c.state.repoId, + title: record.title ?? "New Task", + status: record.status === "archived" ? "archived" : record.status === "running" ? "running" : record.status === "idle" ? "idle" : "new", + repoName: repoLabelFromRemote(c.state.repoRemote), + updatedAtMs: record.updatedAt, + branch: record.branchName, + pullRequest: await readPullRequestSummary(c, record.branchName), + sessionsSummary: sessions.map((meta) => buildSessionSummary(record, meta)), + }; +} + +/** + * Builds a WorkbenchTaskDetail from local task actor state for direct task + * subscribers. This is a full replacement payload, not a patch. 
+ */
+export async function buildTaskDetail(c: any): Promise<any> {
+	const record = await ensureWorkbenchSeeded(c);
+	const gitState = await readCachedGitState(c);
+	const sessions = await listSessionMetaRows(c);
+	await maybeScheduleWorkbenchRefreshes(c, record, sessions);
+	const summary = await buildTaskSummary(c);
+
+	return {
+		...summary,
+		task: record.task,
+		agentType: record.agentType === "claude" || record.agentType === "codex" ? record.agentType : null,
+		runtimeStatus: record.status,
+		statusMessage: record.statusMessage ?? null,
+		activeSessionId: record.activeSessionId ?? null,
+		diffStat: record.diffStat ?? null,
+		prUrl: record.prUrl ?? null,
+		reviewStatus: record.reviewStatus ?? null,
+		fileChanges: gitState.fileChanges,
+		diffs: gitState.diffs,
+		fileTree: gitState.fileTree,
+		minutesUsed: 0,
+		sandboxes: (record.sandboxes ?? []).map((sandbox: any) => ({
+			providerId: sandbox.providerId,
+			sandboxId: sandbox.sandboxId,
+			cwd: sandbox.cwd ?? null,
+		})),
+		activeSandboxId: record.activeSandboxId ?? null,
+	};
+}
+
+/**
+ * Builds a WorkbenchSessionDetail for a specific session tab.
+ */
+export async function buildSessionDetail(c: any, tabId: string): Promise<any> {
+	const record = await ensureWorkbenchSeeded(c);
+	const meta = await readSessionMeta(c, tabId);
+	if (!meta || meta.closed) {
+		throw new Error(`Unknown workbench session tab: ${tabId}`);
+	}
+
+	return buildSessionDetailFromMeta(record, meta);
+}
+
+export async function getTaskSummary(c: any): Promise<any> {
+	return await buildTaskSummary(c);
+}
+
+export async function getTaskDetail(c: any): Promise<any> {
+	return await buildTaskDetail(c);
+}
+
+export async function getSessionDetail(c: any, tabId: string): Promise<any> {
+	return await buildSessionDetail(c, tabId);
+}
+
+/**
+ * Replaces the old notifyWorkbenchUpdated pattern.
+ *
+ * The task actor emits two kinds of updates:
+ * - Push summary state up to the parent workspace actor so the sidebar
+ *   materialized projection stays current.
+ * - Broadcast full detail/session payloads down to direct task subscribers. + */ +export async function broadcastTaskUpdate(c: any, options?: { sessionId?: string }): Promise { + const workspace = await getOrCreateWorkspace(c, c.state.workspaceId); + await workspace.applyTaskSummaryUpdate({ taskSummary: await buildTaskSummary(c) }); + c.broadcast("taskUpdated", { + type: "taskDetailUpdated", + detail: await buildTaskDetail(c), + }); + + if (options?.sessionId) { + c.broadcast("sessionUpdated", { + type: "sessionUpdated", + session: await buildSessionDetail(c, options.sessionId), + }); + } +} + +export async function refreshWorkbenchDerivedState(c: any): Promise { + const record = await ensureWorkbenchSeeded(c); + const gitState = await collectWorkbenchGitState(c, record); + await writeCachedGitState(c, gitState); + await broadcastTaskUpdate(c); +} + +export async function refreshWorkbenchSessionTranscript(c: any, sessionId: string): Promise { + const record = await ensureWorkbenchSeeded(c); + const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? 
(await readSessionMeta(c, sessionId)); + if (!meta?.sandboxSessionId) { + return; + } + + const transcript = await readSessionTranscript(c, record, meta.sandboxSessionId); + await writeSessionTranscript(c, meta.tabId, transcript); + await broadcastTaskUpdate(c, { sessionId: meta.tabId }); +} + +export async function renameWorkbenchTask(c: any, value: string): Promise { + const nextTitle = value.trim(); + if (!nextTitle) { + throw new Error("task title is required"); + } + + await c.db + .update(taskTable) + .set({ + title: nextTitle, + updatedAt: Date.now(), + }) + .where(eq(taskTable.id, 1)) + .run(); + c.state.title = nextTitle; + await broadcastTaskUpdate(c); +} + +export async function renameWorkbenchBranch(c: any, value: string): Promise { + const nextBranch = value.trim(); + if (!nextBranch) { + throw new Error("branch name is required"); + } + + const record = await ensureWorkbenchSeeded(c); + if (!record.branchName) { + throw new Error("cannot rename branch before task branch exists"); + } + if (!record.activeSandboxId) { + throw new Error("cannot rename branch without an active sandbox"); + } + const activeSandbox = (record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === record.activeSandboxId) ?? 
null; + if (!activeSandbox?.cwd) { + throw new Error("cannot rename branch without a sandbox cwd"); + } + + const renameResult = await executeInSandbox(c, { + sandboxId: record.activeSandboxId, + cwd: activeSandbox.cwd, + command: [ + `git branch -m ${JSON.stringify(record.branchName)} ${JSON.stringify(nextBranch)}`, + `if git ls-remote --exit-code --heads origin ${JSON.stringify(record.branchName)} >/dev/null 2>&1; then git push origin :${JSON.stringify(record.branchName)}; fi`, + `git push origin ${JSON.stringify(nextBranch)}`, + `git branch --set-upstream-to=${JSON.stringify(`origin/${nextBranch}`)} ${JSON.stringify(nextBranch)} || git push --set-upstream origin ${JSON.stringify(nextBranch)}`, + ].join(" && "), + label: `git branch -m ${record.branchName} ${nextBranch}`, + }); + if (renameResult.exitCode !== 0) { + throw new Error(`branch rename failed (${renameResult.exitCode}): ${renameResult.result}`); + } + + await c.db + .update(taskTable) + .set({ + branchName: nextBranch, + updatedAt: Date.now(), + }) + .where(eq(taskTable.id, 1)) + .run(); + c.state.branchName = nextBranch; + + const project = await getOrCreateProject(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote); + await project.registerTaskBranch({ + taskId: c.state.taskId, + branchName: nextBranch, + }); + await broadcastTaskUpdate(c); +} + +export async function createWorkbenchSession(c: any, model?: string): Promise<{ tabId: string }> { + let record = await ensureWorkbenchSeeded(c); + if (!record.activeSandboxId) { + // Fire-and-forget: enqueue provisioning without waiting to avoid self-deadlock + // (this handler already runs inside the task workflow loop, so wait:true would deadlock). + const providerId = record.providerId ?? c.state.providerId ?? 
getActorRuntimeContext().providers.defaultProviderId(); + await selfTask(c).send(taskWorkflowQueueName("task.command.provision"), { providerId }, { wait: false }); + throw new Error("sandbox is provisioning — retry shortly"); + } + + if (record.activeSessionId) { + const existingSessions = await listSessionMetaRows(c); + if (existingSessions.length === 0) { + await ensureSessionMeta(c, { + tabId: record.activeSessionId, + sandboxSessionId: record.activeSessionId, + model: model ?? defaultModelForAgent(record.agentType), + sessionName: "Session 1", + status: "ready", + }); + await broadcastTaskUpdate(c, { sessionId: record.activeSessionId }); + return { tabId: record.activeSessionId }; + } + } + + const tabId = `tab-${randomUUID()}`; + await ensureSessionMeta(c, { + tabId, + model: model ?? defaultModelForAgent(record.agentType), + status: record.activeSandboxId ? "pending_session_create" : "pending_provision", + created: false, + }); + + const providerId = record.providerId ?? c.state.providerId ?? getActorRuntimeContext().providers.defaultProviderId(); + const self = selfTask(c); + if (!record.activeSandboxId && !String(record.status ?? "").startsWith("init_")) { + await self.send("task.command.provision", { providerId }, { wait: false }); + } + await self.send( + "task.command.workbench.ensure_session", + { tabId, ...(model ? 
{ model } : {}) }, + { + wait: false, + }, + ); + await broadcastTaskUpdate(c, { sessionId: tabId }); + return { tabId }; +} + +export async function ensureWorkbenchSession(c: any, tabId: string, model?: string): Promise { + const meta = await readSessionMeta(c, tabId); + if (!meta || meta.closed) { + return; + } + + const record = await ensureWorkbenchSeeded(c); + if (!record.activeSandboxId) { + await updateSessionMeta(c, tabId, { + status: "pending_provision", + errorMessage: null, + }); + return; + } + + if (!meta.sandboxSessionId && record.activeSessionId && meta.status === "pending_provision") { + const existingTabForActiveSession = await readSessionMetaBySandboxSessionId(c, record.activeSessionId); + if (existingTabForActiveSession && existingTabForActiveSession.tabId !== tabId) { + await updateSessionMeta(c, existingTabForActiveSession.tabId, { + closed: 1, + }); + } + await updateSessionMeta(c, tabId, { + sandboxSessionId: record.activeSessionId, + status: "ready", + errorMessage: null, + created: 1, + }); + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId: record.activeSessionId, + }); + await broadcastTaskUpdate(c, { sessionId: tabId }); + return; + } + + if (meta.sandboxSessionId) { + await updateSessionMeta(c, tabId, { + status: "ready", + errorMessage: null, + }); + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId: meta.sandboxSessionId, + }); + await broadcastTaskUpdate(c, { sessionId: tabId }); + return; + } + + const activeSandbox = (record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === record.activeSandboxId) ?? null; + const cwd = activeSandbox?.cwd ?? record.sandboxes?.[0]?.cwd ?? 
null; + if (!cwd) { + await updateSessionMeta(c, tabId, { + status: "error", + errorMessage: "cannot create session without a sandbox cwd", + }); + await broadcastTaskUpdate(c, { sessionId: tabId }); + return; + } + + await updateSessionMeta(c, tabId, { + status: "pending_session_create", + errorMessage: null, + }); + + try { + const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, record.activeSandboxId); + const created = await sandbox.createSession({ + prompt: "", + cwd, + agent: agentTypeForModel(model ?? meta.model ?? defaultModelForAgent(record.agentType)), + }); + if (!created.id) { + throw new Error(created.error ?? "sandbox-agent session creation failed"); + } + + await updateSessionMeta(c, tabId, { + sandboxSessionId: created.id, + status: "ready", + errorMessage: null, + }); + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId: created.id, + }); + } catch (error) { + await updateSessionMeta(c, tabId, { + status: "error", + errorMessage: error instanceof Error ? 
error.message : String(error), + }); + } + + await broadcastTaskUpdate(c, { sessionId: tabId }); +} + +export async function enqueuePendingWorkbenchSessions(c: any): Promise { + const self = selfTask(c); + const pending = (await listSessionMetaRows(c, { includeClosed: true })).filter( + (row) => row.closed !== true && row.status !== "ready" && row.status !== "error", + ); + + for (const row of pending) { + await self.send( + "task.command.workbench.ensure_session", + { + tabId: row.tabId, + model: row.model, + }, + { + wait: false, + }, + ); + } +} + +export async function renameWorkbenchSession(c: any, sessionId: string, title: string): Promise { + const trimmed = title.trim(); + if (!trimmed) { + throw new Error("session title is required"); + } + await updateSessionMeta(c, sessionId, { + sessionName: trimmed, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function setWorkbenchSessionUnread(c: any, sessionId: string, unread: boolean): Promise { + await updateSessionMeta(c, sessionId, { + unread: unread ? 
1 : 0, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function updateWorkbenchDraft(c: any, sessionId: string, text: string, attachments: Array): Promise { + await updateSessionMeta(c, sessionId, { + draftText: text, + draftAttachmentsJson: JSON.stringify(attachments), + draftUpdatedAt: Date.now(), + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function changeWorkbenchModel(c: any, sessionId: string, model: string): Promise { + await updateSessionMeta(c, sessionId, { + model, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function sendWorkbenchMessage(c: any, sessionId: string, text: string, attachments: Array): Promise { + const record = await ensureWorkbenchSeeded(c); + if (!record.activeSandboxId) { + throw new Error("cannot send message without an active sandbox"); + } + + const meta = await requireReadySessionMeta(c, sessionId); + const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, record.activeSandboxId); + const prompt = [text.trim(), ...attachments.map((attachment: any) => `@ ${attachment.filePath}:${attachment.lineNumber}\n${attachment.lineContent}`)] + .filter(Boolean) + .join("\n\n"); + if (!prompt) { + throw new Error("message text is required"); + } + + await sandbox.sendPrompt({ + sessionId: meta.sandboxSessionId, + prompt, + notification: true, + }); + + await updateSessionMeta(c, sessionId, { + unread: 0, + created: 1, + draftText: "", + draftAttachmentsJson: "[]", + draftUpdatedAt: Date.now(), + thinkingSinceMs: Date.now(), + }); + + await c.db + .update(taskRuntime) + .set({ + activeSessionId: meta.sandboxSessionId, + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, 1)) + .run(); + + const sync = await getOrCreateTaskStatusSync(c, c.state.workspaceId, c.state.repoId, c.state.taskId, record.activeSandboxId, meta.sandboxSessionId, { + workspaceId: c.state.workspaceId, + repoId: c.state.repoId, + taskId: c.state.taskId, + providerId: 
c.state.providerId, + sandboxId: record.activeSandboxId, + sessionId: meta.sandboxSessionId, + intervalMs: STATUS_SYNC_INTERVAL_MS, + }); + await sync.setIntervalMs({ intervalMs: STATUS_SYNC_INTERVAL_MS }); + await sync.start(); + await sync.force(); + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId: meta.sandboxSessionId, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function stopWorkbenchSession(c: any, sessionId: string): Promise { + const record = await ensureWorkbenchSeeded(c); + if (!record.activeSandboxId) { + return; + } + const meta = await requireReadySessionMeta(c, sessionId); + const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, record.activeSandboxId); + await sandbox.cancelSession({ sessionId: meta.sandboxSessionId }); + await updateSessionMeta(c, sessionId, { + thinkingSinceMs: null, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function syncWorkbenchSessionStatus(c: any, sessionId: string, status: "running" | "idle" | "error", at: number): Promise { + const record = await ensureWorkbenchSeeded(c); + const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? (await ensureSessionMeta(c, { tabId: sessionId, sandboxSessionId: sessionId })); + let changed = false; + + if (record.activeSessionId === sessionId || record.activeSessionId === meta.sandboxSessionId) { + const mappedStatus = status === "running" ? "running" : status === "error" ? 
"error" : "idle"; + if (record.status !== mappedStatus) { + await c.db + .update(taskTable) + .set({ + status: mappedStatus, + updatedAt: at, + }) + .where(eq(taskTable.id, 1)) + .run(); + changed = true; + } + + const statusMessage = `session:${status}`; + if (record.statusMessage !== statusMessage) { + await c.db + .update(taskRuntime) + .set({ + statusMessage, + updatedAt: at, + }) + .where(eq(taskRuntime.id, 1)) + .run(); + changed = true; + } + } + + if (status === "running") { + if (!meta.thinkingSinceMs) { + await updateSessionMeta(c, sessionId, { + thinkingSinceMs: at, + }); + changed = true; + } + } else { + if (meta.thinkingSinceMs) { + await updateSessionMeta(c, sessionId, { + thinkingSinceMs: null, + }); + changed = true; + } + if (!meta.unread && shouldMarkSessionUnreadForStatus(meta, status)) { + await updateSessionMeta(c, sessionId, { + unread: 1, + }); + changed = true; + } + } + + if (changed) { + if (status !== "running") { + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { + sessionId, + }); + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {}); + } + await broadcastTaskUpdate(c, { sessionId: meta.tabId }); + } +} + +export async function closeWorkbenchSession(c: any, sessionId: string): Promise { + const record = await ensureWorkbenchSeeded(c); + const sessions = await listSessionMetaRows(c); + if (sessions.filter((candidate) => candidate.closed !== true).length <= 1) { + return; + } + + const meta = await readSessionMeta(c, sessionId); + if (!meta) { + return; + } + if (record.activeSandboxId && meta.sandboxSessionId) { + const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, record.activeSandboxId); + await sandbox.destroySession({ sessionId: meta.sandboxSessionId }); + } + await updateSessionMeta(c, sessionId, { + closed: 1, + thinkingSinceMs: null, + }); + if (record.activeSessionId === sessionId || record.activeSessionId === meta.sandboxSessionId) { + 
await c.db + .update(taskRuntime) + .set({ + activeSessionId: null, + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, 1)) + .run(); + } + await broadcastTaskUpdate(c); +} + +export async function markWorkbenchUnread(c: any): Promise { + const sessions = await listSessionMetaRows(c); + const latest = sessions[sessions.length - 1]; + if (!latest) { + return; + } + await updateSessionMeta(c, latest.tabId, { + unread: 1, + }); + await broadcastTaskUpdate(c, { sessionId: latest.tabId }); +} + +export async function publishWorkbenchPr(c: any): Promise { + const record = await ensureWorkbenchSeeded(c); + if (!record.branchName) { + throw new Error("cannot publish PR without a branch"); + } + const { driver } = getActorRuntimeContext(); + const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId); + const created = await driver.github.createPr(c.state.repoLocalPath, record.branchName, record.title ?? c.state.task, undefined, { + githubToken: auth?.githubToken ?? null, + }); + await c.db + .update(taskTable) + .set({ + prSubmitted: 1, + updatedAt: Date.now(), + }) + .where(eq(taskTable.id, 1)) + .run(); + await broadcastTaskUpdate(c); +} + +export async function revertWorkbenchFile(c: any, path: string): Promise { + const record = await ensureWorkbenchSeeded(c); + if (!record.activeSandboxId) { + throw new Error("cannot revert file without an active sandbox"); + } + const activeSandbox = (record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === record.activeSandboxId) ?? 
null; + if (!activeSandbox?.cwd) { + throw new Error("cannot revert file without a sandbox cwd"); + } + + const result = await executeInSandbox(c, { + sandboxId: record.activeSandboxId, + cwd: activeSandbox.cwd, + command: `if git ls-files --error-unmatch -- ${JSON.stringify(path)} >/dev/null 2>&1; then git restore --staged --worktree -- ${JSON.stringify(path)} || git checkout -- ${JSON.stringify(path)}; else rm -f ${JSON.stringify(path)}; fi`, + label: `git restore ${path}`, + }); + if (result.exitCode !== 0) { + throw new Error(`file revert failed (${result.exitCode}): ${result.result}`); + } + await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {}); + await broadcastTaskUpdate(c); +} diff --git a/foundry/packages/backend/src/actors/task/workflow/commands.ts b/foundry/packages/backend/src/actors/task/workflow/commands.ts index 7ba2d2b..cc72ebf 100644 --- a/foundry/packages/backend/src/actors/task/workflow/commands.ts +++ b/foundry/packages/backend/src/actors/task/workflow/commands.ts @@ -1,9 +1,10 @@ // @ts-nocheck import { eq } from "drizzle-orm"; -import { getTaskSandbox } from "../../handles.js"; +import { getActorRuntimeContext } from "../../context.js"; +import { getOrCreateTaskStatusSync } from "../../handles.js"; import { logActorWarning, resolveErrorMessage } from "../../logging.js"; -import { task as taskTable } from "../db/schema.js"; -import { TASK_ROW_ID, appendAuditLog, getCurrentRecord, setTaskState } from "./common.js"; +import { task as taskTable, taskRuntime } from "../db/schema.js"; +import { TASK_ROW_ID, appendHistory, getCurrentRecord, setTaskState } from "./common.js"; import { pushActiveBranchActivity } from "./push.js"; async function withTimeout(promise: Promise, timeoutMs: number, label: string): Promise { @@ -24,29 +25,22 @@ async function withTimeout(promise: Promise, timeoutMs: number, label: str export async function handleAttachActivity(loopCtx: any, msg: any): Promise { const record = await 
getCurrentRecord(loopCtx); - let target = record.sandboxes.find((sandbox: any) => sandbox.sandboxId === record.activeSandboxId)?.switchTarget ?? ""; - const sessionId = msg.body?.sessionId ?? null; + const { providers } = getActorRuntimeContext(); + const activeSandbox = record.activeSandboxId ? (record.sandboxes.find((sb: any) => sb.sandboxId === record.activeSandboxId) ?? null) : null; + const provider = providers.get(activeSandbox?.providerId ?? record.providerId); + const target = await provider.attachTarget({ + workspaceId: loopCtx.state.workspaceId, + sandboxId: record.activeSandboxId ?? "", + }); - if (record.activeSandboxId) { - try { - const sandbox = getTaskSandbox(loopCtx, loopCtx.state.organizationId, record.activeSandboxId); - const connection = await sandbox.sandboxAgentConnection(); - if (typeof connection?.endpoint === "string" && connection.endpoint.length > 0) { - target = connection.endpoint; - } - } catch { - // Best effort; keep the last known switch target if the sandbox actor is unavailable. - } - } - - await appendAuditLog(loopCtx, "task.attach", { - target, - sessionId, + await appendHistory(loopCtx, "task.attach", { + target: target.target, + sessionId: record.activeSessionId, }); await msg.complete({ - target, - sessionId, + target: target.target, + sessionId: record.activeSessionId, }); } @@ -65,52 +59,114 @@ export async function handlePushActivity(loopCtx: any, msg: any): Promise await msg.complete({ ok: true }); } -export async function handleSimpleCommandActivity(loopCtx: any, msg: any, historyKind: string): Promise { - await appendAuditLog(loopCtx, historyKind, { reason: msg.body?.reason ?? 
null }); +export async function handleSimpleCommandActivity(loopCtx: any, msg: any, statusMessage: string, historyKind: string): Promise { + const db = loopCtx.db; + await db.update(taskRuntime).set({ statusMessage, updatedAt: Date.now() }).where(eq(taskRuntime.id, TASK_ROW_ID)).run(); + + await appendHistory(loopCtx, historyKind, { reason: msg.body?.reason ?? null }); await msg.complete({ ok: true }); } export async function handleArchiveActivity(loopCtx: any, msg: any): Promise { - await setTaskState(loopCtx, "archive_stop_status_sync"); + await setTaskState(loopCtx, "archive_stop_status_sync", "stopping status sync"); const record = await getCurrentRecord(loopCtx); - if (record.activeSandboxId) { - await setTaskState(loopCtx, "archive_release_sandbox"); - void withTimeout(getTaskSandbox(loopCtx, loopCtx.state.organizationId, record.activeSandboxId).destroy(), 45_000, "sandbox destroy").catch((error) => { - logActorWarning("task.commands", "failed to release sandbox during archive", { - organizationId: loopCtx.state.organizationId, + if (record.activeSandboxId && record.activeSessionId) { + try { + const sync = await getOrCreateTaskStatusSync( + loopCtx, + loopCtx.state.workspaceId, + loopCtx.state.repoId, + loopCtx.state.taskId, + record.activeSandboxId, + record.activeSessionId, + { + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + taskId: loopCtx.state.taskId, + providerId: record.providerId, + sandboxId: record.activeSandboxId, + sessionId: record.activeSessionId, + intervalMs: 2_000, + }, + ); + await withTimeout(sync.stop(), 15_000, "task status sync stop"); + } catch (error) { + logActorWarning("task.commands", "failed to stop status sync during archive", { + workspaceId: loopCtx.state.workspaceId, repoId: loopCtx.state.repoId, taskId: loopCtx.state.taskId, sandboxId: record.activeSandboxId, + sessionId: record.activeSessionId, + error: resolveErrorMessage(error), + }); + } + } + + if (record.activeSandboxId) { + await 
setTaskState(loopCtx, "archive_release_sandbox", "releasing sandbox"); + const { providers } = getActorRuntimeContext(); + const activeSandbox = record.sandboxes.find((sb: any) => sb.sandboxId === record.activeSandboxId) ?? null; + const provider = providers.get(activeSandbox?.providerId ?? record.providerId); + const workspaceId = loopCtx.state.workspaceId; + const repoId = loopCtx.state.repoId; + const taskId = loopCtx.state.taskId; + const sandboxId = record.activeSandboxId; + + // Do not block archive finalization on provider stop. Some provider stop calls can + // run longer than the synchronous archive UX budget. + void withTimeout( + provider.releaseSandbox({ + workspaceId, + sandboxId, + }), + 45_000, + "provider releaseSandbox", + ).catch((error) => { + logActorWarning("task.commands", "failed to release sandbox during archive", { + workspaceId, + repoId, + taskId, + sandboxId, error: resolveErrorMessage(error), }); }); } const db = loopCtx.db; - await setTaskState(loopCtx, "archive_finalize"); + await setTaskState(loopCtx, "archive_finalize", "finalizing archive"); await db.update(taskTable).set({ status: "archived", updatedAt: Date.now() }).where(eq(taskTable.id, TASK_ROW_ID)).run(); - await appendAuditLog(loopCtx, "task.archive", { reason: msg.body?.reason ?? null }); + await db.update(taskRuntime).set({ activeSessionId: null, statusMessage: "archived", updatedAt: Date.now() }).where(eq(taskRuntime.id, TASK_ROW_ID)).run(); + + await appendHistory(loopCtx, "task.archive", { reason: msg.body?.reason ?? 
null }); await msg.complete({ ok: true }); } export async function killDestroySandboxActivity(loopCtx: any): Promise { - await setTaskState(loopCtx, "kill_destroy_sandbox"); + await setTaskState(loopCtx, "kill_destroy_sandbox", "destroying sandbox"); const record = await getCurrentRecord(loopCtx); if (!record.activeSandboxId) { return; } - await getTaskSandbox(loopCtx, loopCtx.state.organizationId, record.activeSandboxId).destroy(); + const { providers } = getActorRuntimeContext(); + const activeSandbox = record.sandboxes.find((sb: any) => sb.sandboxId === record.activeSandboxId) ?? null; + const provider = providers.get(activeSandbox?.providerId ?? record.providerId); + await provider.destroySandbox({ + workspaceId: loopCtx.state.workspaceId, + sandboxId: record.activeSandboxId, + }); } export async function killWriteDbActivity(loopCtx: any, msg: any): Promise { - await setTaskState(loopCtx, "kill_finalize"); + await setTaskState(loopCtx, "kill_finalize", "finalizing kill"); const db = loopCtx.db; await db.update(taskTable).set({ status: "killed", updatedAt: Date.now() }).where(eq(taskTable.id, TASK_ROW_ID)).run(); - await appendAuditLog(loopCtx, "task.kill", { reason: msg.body?.reason ?? null }); + await db.update(taskRuntime).set({ statusMessage: "killed", updatedAt: Date.now() }).where(eq(taskRuntime.id, TASK_ROW_ID)).run(); + + await appendHistory(loopCtx, "task.kill", { reason: msg.body?.reason ?? 
null }); await msg.complete({ ok: true }); } diff --git a/foundry/packages/backend/src/actors/task/workflow/common.ts b/foundry/packages/backend/src/actors/task/workflow/common.ts index cbe63e6..0dfc667 100644 --- a/foundry/packages/backend/src/actors/task/workflow/common.ts +++ b/foundry/packages/backend/src/actors/task/workflow/common.ts @@ -2,10 +2,8 @@ import { eq } from "drizzle-orm"; import type { TaskRecord, TaskStatus } from "@sandbox-agent/foundry-shared"; import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js"; -import { getOrCreateAuditLog, getOrCreateOrganization } from "../../handles.js"; -import { broadcastTaskUpdate } from "../workspace.js"; -import { getActorRuntimeContext } from "../../context.js"; -import { defaultSandboxProviderId } from "../../../sandbox-config.js"; +import { historyKey } from "../../keys.js"; +import { broadcastTaskUpdate } from "../workbench.js"; export const TASK_ROW_ID = 1; @@ -58,32 +56,50 @@ export function buildAgentPrompt(task: string): string { return task.trim(); } -export async function setTaskState(ctx: any, status: TaskStatus): Promise { +export async function setTaskState(ctx: any, status: TaskStatus, statusMessage?: string): Promise { const now = Date.now(); const db = ctx.db; await db.update(taskTable).set({ status, updatedAt: now }).where(eq(taskTable.id, TASK_ROW_ID)).run(); + if (statusMessage != null) { + await db + .insert(taskRuntime) + .values({ + id: TASK_ROW_ID, + activeSandboxId: null, + activeSessionId: null, + activeSwitchTarget: null, + activeCwd: null, + statusMessage, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskRuntime.id, + set: { + statusMessage, + updatedAt: now, + }, + }) + .run(); + } + await broadcastTaskUpdate(ctx); } -/** - * Read the task's current record from its local SQLite DB. 
- * If the task actor was lazily created (virtual task from PR sync) and has no
- * DB rows yet, auto-initializes by reading branch/title from the org actor's
- * getTaskIndexEntry. This is the self-initialization path for lazy task actors.
- */
 export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
 	const db = ctx.db;
-	const organization = await getOrCreateOrganization(ctx, ctx.state.organizationId);
-	let row = await db
+	const row = await db
 		.select({
 			branchName: taskTable.branchName,
 			title: taskTable.title,
 			task: taskTable.task,
-			sandboxProviderId: taskTable.sandboxProviderId,
+			providerId: taskTable.providerId,
 			status: taskTable.status,
-			pullRequestJson: taskTable.pullRequestJson,
+			statusMessage: taskRuntime.statusMessage,
 			activeSandboxId: taskRuntime.activeSandboxId,
+			activeSessionId: taskRuntime.activeSessionId,
+			agentType: taskTable.agentType,
+			prSubmitted: taskTable.prSubmitted,
 			createdAt: taskTable.createdAt,
 			updatedAt: taskTable.updatedAt,
 		})
@@ -93,64 +109,13 @@ export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
 		.get();

 	if (!row) {
-		// Virtual task — auto-initialize from org actor's task index data
-		let branchName: string | null = null;
-		let title = "Untitled";
-		try {
-			const entry = await organization.getTaskIndexEntry({ taskId: ctx.state.taskId });
-			branchName = entry?.branchName ?? null;
-			title = entry?.title ?? title;
-		} catch {}
-
-		const { config } = getActorRuntimeContext();
-		const { initBootstrapDbActivity, initCompleteActivity } = await import("./init.js");
-		await initBootstrapDbActivity(ctx, {
-			sandboxProviderId: defaultSandboxProviderId(config),
-			branchName,
-			title,
-			task: title,
-		});
-		await initCompleteActivity(ctx, { sandboxProviderId: defaultSandboxProviderId(config) });
-
-		// Re-read the row after initialization
-		const initialized = await db
-			.select({
-				branchName: taskTable.branchName,
-				title: taskTable.title,
-				task: taskTable.task,
-				sandboxProviderId: taskTable.sandboxProviderId,
-				status: taskTable.status,
-				pullRequestJson: taskTable.pullRequestJson,
-				activeSandboxId: taskRuntime.activeSandboxId,
-				createdAt: taskTable.createdAt,
-				updatedAt: taskTable.updatedAt,
-			})
-			.from(taskTable)
-			.leftJoin(taskRuntime, eq(taskTable.id, taskRuntime.id))
-			.where(eq(taskTable.id, TASK_ROW_ID))
-			.get();
-
-		if (!initialized) {
-			throw new Error(`Task not found after initialization: ${ctx.state.taskId}`);
-		}
-
-		row = initialized;
-	}
-
-	const repositoryMetadata = await organization.getRepositoryMetadata({ repoId: ctx.state.repoId });
-	let pullRequest = null;
-	if (row.pullRequestJson) {
-		try {
-			pullRequest = JSON.parse(row.pullRequestJson);
-		} catch {
-			pullRequest = null;
-		}
+		throw new Error(`Task not found: ${ctx.state.taskId}`);
 	}

 	const sandboxes = await db
 		.select({
 			sandboxId: taskSandboxes.sandboxId,
-			sandboxProviderId: taskSandboxes.sandboxProviderId,
+			providerId: taskSandboxes.providerId,
 			sandboxActorId: taskSandboxes.sandboxActorId,
 			switchTarget: taskSandboxes.switchTarget,
 			cwd: taskSandboxes.cwd,
@@ -161,39 +126,52 @@ export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
 		.all();

 	return {
-		organizationId: ctx.state.organizationId,
+		workspaceId: ctx.state.workspaceId,
 		repoId: ctx.state.repoId,
-		repoRemote: repositoryMetadata.remoteUrl,
+		repoRemote: ctx.state.repoRemote,
 		taskId: ctx.state.taskId,
 		branchName: row.branchName,
 		title: row.title,
 		task: row.task,
-		sandboxProviderId: row.sandboxProviderId,
+		providerId: row.providerId,
 		status: row.status,
+		statusMessage: row.statusMessage ?? null,
 		activeSandboxId: row.activeSandboxId ?? null,
-		pullRequest,
+		activeSessionId: row.activeSessionId ?? null,
 		sandboxes: sandboxes.map((sb) => ({
 			sandboxId: sb.sandboxId,
-			sandboxProviderId: sb.sandboxProviderId,
+			providerId: sb.providerId,
 			sandboxActorId: sb.sandboxActorId ?? null,
 			switchTarget: sb.switchTarget,
 			cwd: sb.cwd ?? null,
 			createdAt: sb.createdAt,
 			updatedAt: sb.updatedAt,
 		})),
+		agentType: row.agentType ?? null,
+		prSubmitted: Boolean(row.prSubmitted),
+		diffStat: null,
+		hasUnpushed: null,
+		conflictsWithMain: null,
+		parentBranch: null,
+		prUrl: null,
+		prAuthor: null,
+		ciStatus: null,
+		reviewStatus: null,
+		reviewer: null,
 		createdAt: row.createdAt,
 		updatedAt: row.updatedAt,
 	} as TaskRecord;
 }

-export async function appendAuditLog(ctx: any, kind: string, payload: Record<string, unknown>): Promise<void> {
-	const row = await ctx.db.select({ branchName: taskTable.branchName }).from(taskTable).where(eq(taskTable.id, TASK_ROW_ID)).get();
-	const auditLog = await getOrCreateAuditLog(ctx, ctx.state.organizationId);
-	void auditLog.append({
+export async function appendHistory(ctx: any, kind: string, payload: Record<string, unknown>): Promise<void> {
+	const client = ctx.client();
+	const history = await client.history.getOrCreate(historyKey(ctx.state.workspaceId, ctx.state.repoId), {
+		createWithInput: { workspaceId: ctx.state.workspaceId, repoId: ctx.state.repoId },
+	});
+	await history.append({
 		kind,
-		repoId: ctx.state.repoId,
 		taskId: ctx.state.taskId,
-		branchName: row?.branchName ?? null,
+		branchName: ctx.state.branchName,
 		payload,
 	});
diff --git a/foundry/packages/backend/src/actors/task/workflow/index.ts b/foundry/packages/backend/src/actors/task/workflow/index.ts
index 75b2da3..419d36d 100644
--- a/foundry/packages/backend/src/actors/task/workflow/index.ts
+++ b/foundry/packages/backend/src/actors/task/workflow/index.ts
@@ -1,185 +1,308 @@
-// @ts-nocheck
-/**
- * Task workflow — queue-based command loop.
- *
- * Mutations are dispatched through named queues and processed inside the
- * workflow command loop so that every command appears in the RivetKit
- * inspector's workflow history. Read actions remain direct (no queue).
- *
- * Callers send commands directly via `.send(taskWorkflowQueueName(...), ...)`.
- */
 import { Loop } from "rivetkit/workflow";
+import { getActorRuntimeContext } from "../../context.js";
 import { logActorWarning, resolveErrorMessage } from "../../logging.js";
-import { TASK_QUEUE_NAMES, type TaskQueueName, taskWorkflowQueueName } from "./queue.js";
 import { getCurrentRecord } from "./common.js";
-import { initBootstrapDbActivity, initCompleteActivity, initEnqueueProvisionActivity, initFailedActivity } from "./init.js";
+import {
+	initAssertNameActivity,
+	initBootstrapDbActivity,
+	initCompleteActivity,
+	initCreateSandboxActivity,
+	initCreateSessionActivity,
+	initEnqueueProvisionActivity,
+	initEnsureAgentActivity,
+	initEnsureNameActivity,
+	initExposeSandboxActivity,
+	initFailedActivity,
+	initStartSandboxInstanceActivity,
+	initStartStatusSyncActivity,
+	initWriteDbActivity,
+} from "./init.js";
 import {
 	handleArchiveActivity,
 	handleAttachActivity,
+	handleGetActivity,
 	handlePushActivity,
 	handleSimpleCommandActivity,
 	handleSwitchActivity,
 	killDestroySandboxActivity,
 	killWriteDbActivity,
 } from "./commands.js";
+import { idleNotifyActivity, idleSubmitPrActivity, statusUpdateActivity } from "./status-sync.js";
+import { TASK_QUEUE_NAMES } from "./queue.js";
 import {
-	changeTaskOwnerManually,
-	closeWorkspaceSession,
-	createWorkspaceSession,
-	ensureWorkspaceSession,
-	publishWorkspacePr,
-	revertWorkspaceFile,
-	sendWorkspaceMessage,
-	stopWorkspaceSession,
-} from "../workspace.js";
+	changeWorkbenchModel,
+	closeWorkbenchSession,
+	createWorkbenchSession,
+	ensureWorkbenchSession,
+	refreshWorkbenchDerivedState,
+	refreshWorkbenchSessionTranscript,
+	markWorkbenchUnread,
+	publishWorkbenchPr,
+	renameWorkbenchBranch,
+	renameWorkbenchTask,
+	renameWorkbenchSession,
+	revertWorkbenchFile,
+	sendWorkbenchMessage,
+	setWorkbenchSessionUnread,
+	stopWorkbenchSession,
+	syncWorkbenchSessionStatus,
+	updateWorkbenchDraft,
+} from "../workbench.js";

-export { taskWorkflowQueueName } from "./queue.js";
+export { TASK_QUEUE_NAMES, taskWorkflowQueueName } from "./queue.js";

-// ---------------------------------------------------------------------------
-// Workflow command loop — runs inside `run: workflow(runTaskWorkflow)`
-// ---------------------------------------------------------------------------
+type TaskQueueName = (typeof TASK_QUEUE_NAMES)[number];

-type WorkflowHandler = (loopCtx: any, msg: any) => Promise<void>;
+type WorkflowHandler = (loopCtx: any, msg: { name: TaskQueueName; body: any; complete: (response: unknown) => Promise<void> }) => Promise<void>;

-const COMMAND_HANDLERS: Record<TaskQueueName, WorkflowHandler> = {
+const commandHandlers: Record<TaskQueueName, WorkflowHandler> = {
 	"task.command.initialize": async (loopCtx, msg) => {
-		await initBootstrapDbActivity(loopCtx, msg.body);
-		await initEnqueueProvisionActivity(loopCtx, msg.body);
-		const record = await getCurrentRecord(loopCtx);
-		await msg.complete(record);
+		const body = msg.body;
+
+		await loopCtx.step("init-bootstrap-db", async () => initBootstrapDbActivity(loopCtx, body));
+		await loopCtx.step("init-enqueue-provision", async () => initEnqueueProvisionActivity(loopCtx, body));
+		await loopCtx.removed("init-dispatch-provision-v2", "step");
+		const currentRecord = await loopCtx.step("init-read-current-record", async () => getCurrentRecord(loopCtx));
+
+		try {
+			await msg.complete(currentRecord);
+		} catch (error) {
+			logActorWarning("task.workflow", "initialize completion failed", {
+				error: resolveErrorMessage(error),
+			});
+		}
 	},

 	"task.command.provision": async (loopCtx, msg) => {
+		const body = msg.body;
+		await loopCtx.removed("init-failed", "step");
 		try {
-			await initCompleteActivity(loopCtx, msg.body);
+			await loopCtx.step("init-ensure-name", async () => initEnsureNameActivity(loopCtx));
+			await loopCtx.step("init-assert-name", async () => initAssertNameActivity(loopCtx));
+
+			const sandbox = await loopCtx.step({
+				name: "init-create-sandbox",
+				timeout: 180_000,
+				run: async () => initCreateSandboxActivity(loopCtx, body),
+			});
+			const agent = await loopCtx.step({
+				name: "init-ensure-agent",
+				timeout: 180_000,
+				run: async () => initEnsureAgentActivity(loopCtx, body, sandbox),
+			});
+			const sandboxInstanceReady = await loopCtx.step({
+				name: "init-start-sandbox-instance",
+				timeout: 60_000,
+				run: async () => initStartSandboxInstanceActivity(loopCtx, body, sandbox, agent),
+			});
+			await loopCtx.step("init-expose-sandbox", async () => initExposeSandboxActivity(loopCtx, body, sandbox, sandboxInstanceReady));
+			const session = await loopCtx.step({
+				name: "init-create-session",
+				timeout: 180_000,
+				run: async () => initCreateSessionActivity(loopCtx, body, sandbox, sandboxInstanceReady),
+			});
+
+			await loopCtx.step("init-write-db", async () => initWriteDbActivity(loopCtx, body, sandbox, session, sandboxInstanceReady));
+			await loopCtx.step("init-start-status-sync", async () => initStartStatusSyncActivity(loopCtx, body, sandbox, session));
+			await loopCtx.step("init-complete", async () => initCompleteActivity(loopCtx, body, sandbox, session));
 			await msg.complete({ ok: true });
 		} catch (error) {
-			await initFailedActivity(loopCtx, error, msg.body);
-			await msg.complete({ ok: false, error: resolveErrorMessage(error) });
+			await loopCtx.step("init-failed-v2", async () => initFailedActivity(loopCtx, error));
+			await msg.complete({
+				ok: false,
+				error: resolveErrorMessage(error),
+			});
 		}
 	},

 	"task.command.attach": async (loopCtx, msg) => {
-		await handleAttachActivity(loopCtx, msg);
+		await loopCtx.step("handle-attach", async () => handleAttachActivity(loopCtx, msg));
 	},

 	"task.command.switch": async (loopCtx, msg) => {
-		await handleSwitchActivity(loopCtx, msg);
+		await loopCtx.step("handle-switch", async () => handleSwitchActivity(loopCtx, msg));
 	},

 	"task.command.push": async (loopCtx, msg) => {
-		await handlePushActivity(loopCtx, msg);
+		await loopCtx.step("handle-push", async () => handlePushActivity(loopCtx, msg));
 	},

 	"task.command.sync": async (loopCtx, msg) => {
-		await handleSimpleCommandActivity(loopCtx, msg, "task.sync");
+		await loopCtx.step("handle-sync", async () => handleSimpleCommandActivity(loopCtx, msg, "sync requested", "task.sync"));
 	},

 	"task.command.merge": async (loopCtx, msg) => {
-		await handleSimpleCommandActivity(loopCtx, msg, "task.merge");
+		await loopCtx.step("handle-merge", async () => handleSimpleCommandActivity(loopCtx, msg, "merge requested", "task.merge"));
 	},

 	"task.command.archive": async (loopCtx, msg) => {
-		await handleArchiveActivity(loopCtx, msg);
+		await loopCtx.step("handle-archive", async () => handleArchiveActivity(loopCtx, msg));
 	},

 	"task.command.kill": async (loopCtx, msg) => {
-		await killDestroySandboxActivity(loopCtx);
-		await killWriteDbActivity(loopCtx, msg);
+		await loopCtx.step("kill-destroy-sandbox", async () => killDestroySandboxActivity(loopCtx));
+		await loopCtx.step("kill-write-db", async () => killWriteDbActivity(loopCtx, msg));
 	},

-	"task.command.workspace.create_session": async (loopCtx, msg) => {
-		const result = await createWorkspaceSession(loopCtx, msg.body?.model, msg.body?.authSessionId);
-		await msg.complete(result);
+	"task.command.get": async (loopCtx, msg) => {
+		await loopCtx.step("handle-get", async () => handleGetActivity(loopCtx, msg));
 	},

-	"task.command.workspace.create_session_and_send": async (loopCtx, msg) => {
-		try {
-			const created = await createWorkspaceSession(loopCtx, msg.body?.model, msg.body?.authSessionId);
-			await sendWorkspaceMessage(loopCtx, created.sessionId, msg.body.text, [], msg.body?.authSessionId);
-		} catch (error) {
-			logActorWarning("task.workflow", "create_session_and_send failed", {
-				error: resolveErrorMessage(error),
-			});
-		}
+	"task.command.workbench.mark_unread": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-mark-unread", async () => markWorkbenchUnread(loopCtx));
 		await msg.complete({ ok: true });
 	},

-	"task.command.workspace.ensure_session": async (loopCtx, msg) => {
-		await ensureWorkspaceSession(loopCtx, msg.body.sessionId, msg.body?.model, msg.body?.authSessionId);
+	"task.command.workbench.rename_task": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-rename-task", async () => renameWorkbenchTask(loopCtx, msg.body.value));
 		await msg.complete({ ok: true });
 	},

-	"task.command.workspace.send_message": async (loopCtx, msg) => {
-		await sendWorkspaceMessage(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments, msg.body?.authSessionId);
-		await msg.complete({ ok: true });
-	},
-
-	"task.command.workspace.stop_session": async (loopCtx, msg) => {
-		await stopWorkspaceSession(loopCtx, msg.body.sessionId);
-		await msg.complete({ ok: true });
-	},
-
-	"task.command.workspace.close_session": async (loopCtx, msg) => {
-		await closeWorkspaceSession(loopCtx, msg.body.sessionId, msg.body?.authSessionId);
-		await msg.complete({ ok: true });
-	},
-
-	"task.command.workspace.publish_pr": async (loopCtx, msg) => {
-		await publishWorkspacePr(loopCtx);
-		await msg.complete({ ok: true });
-	},
-
-	"task.command.workspace.revert_file": async (loopCtx, msg) => {
-		await revertWorkspaceFile(loopCtx, msg.body.path);
-		await msg.complete({ ok: true });
-	},
-
-	"task.command.workspace.change_owner": async (loopCtx, msg) => {
-		await changeTaskOwnerManually(loopCtx, {
-			primaryUserId: msg.body.primaryUserId,
-			primaryGithubLogin: msg.body.primaryGithubLogin,
-			primaryGithubEmail: msg.body.primaryGithubEmail,
-			primaryGithubAvatarUrl: msg.body.primaryGithubAvatarUrl ?? null,
+	"task.command.workbench.rename_branch": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-rename-branch",
+			timeout: 5 * 60_000,
+			run: async () => renameWorkbenchBranch(loopCtx, msg.body.value),
 		});
 		await msg.complete({ ok: true });
 	},
+
+	"task.command.workbench.create_session": async (loopCtx, msg) => {
+		try {
+			const created = await loopCtx.step({
+				name: "workbench-create-session",
+				timeout: 30_000,
+				run: async () => createWorkbenchSession(loopCtx, msg.body?.model),
+			});
+			await msg.complete(created);
+		} catch (error) {
+			await msg.complete({ error: resolveErrorMessage(error) });
+		}
+	},
+
+	"task.command.workbench.ensure_session": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-ensure-session",
+			timeout: 5 * 60_000,
+			run: async () => ensureWorkbenchSession(loopCtx, msg.body.tabId, msg.body?.model),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.rename_session": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-rename-session", async () => renameWorkbenchSession(loopCtx, msg.body.sessionId, msg.body.title));
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.set_session_unread": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-set-session-unread", async () => setWorkbenchSessionUnread(loopCtx, msg.body.sessionId, msg.body.unread));
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.update_draft": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-update-draft", async () => updateWorkbenchDraft(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments));
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.change_model": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-change-model", async () => changeWorkbenchModel(loopCtx, msg.body.sessionId, msg.body.model));
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.send_message": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-send-message",
+			timeout: 10 * 60_000,
+			run: async () => sendWorkbenchMessage(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.stop_session": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-stop-session",
+			timeout: 5 * 60_000,
+			run: async () => stopWorkbenchSession(loopCtx, msg.body.sessionId),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.sync_session_status": async (loopCtx, msg) => {
+		await loopCtx.step("workbench-sync-session-status", async () => syncWorkbenchSessionStatus(loopCtx, msg.body.sessionId, msg.body.status, msg.body.at));
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.refresh_derived": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-refresh-derived",
+			timeout: 5 * 60_000,
+			run: async () => refreshWorkbenchDerivedState(loopCtx),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.refresh_session_transcript": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-refresh-session-transcript",
+			timeout: 60_000,
+			run: async () => refreshWorkbenchSessionTranscript(loopCtx, msg.body.sessionId),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.close_session": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-close-session",
+			timeout: 5 * 60_000,
+			run: async () => closeWorkbenchSession(loopCtx, msg.body.sessionId),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.publish_pr": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-publish-pr",
+			timeout: 10 * 60_000,
+			run: async () => publishWorkbenchPr(loopCtx),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.command.workbench.revert_file": async (loopCtx, msg) => {
+		await loopCtx.step({
+			name: "workbench-revert-file",
+			timeout: 5 * 60_000,
+			run: async () => revertWorkbenchFile(loopCtx, msg.body.path),
+		});
+		await msg.complete({ ok: true });
+	},
+
+	"task.status_sync.result": async (loopCtx, msg) => {
+		const transitionedToIdle = await loopCtx.step("status-update", async () => statusUpdateActivity(loopCtx, msg.body));
+
+		if (transitionedToIdle) {
+			const { config } = getActorRuntimeContext();
+			if (config.auto_submit) {
+				await loopCtx.step("idle-submit-pr", async () => idleSubmitPrActivity(loopCtx));
+			}
+			await loopCtx.step("idle-notify", async () => idleNotifyActivity(loopCtx));
+		}
+	},
 };

 export async function runTaskWorkflow(ctx: any): Promise<void> {
 	await ctx.loop("task-command-loop", async (loopCtx: any) => {
-		const msg = await loopCtx.queue.next("next-task-command", {
+		const msg = await loopCtx.queue.next("next-command", {
 			names: [...TASK_QUEUE_NAMES],
 			completable: true,
 		});
-
 		if (!msg) {
 			return Loop.continue(undefined);
 		}
-
-		const handler = COMMAND_HANDLERS[msg.name as TaskQueueName];
-		if (!handler) {
-			logActorWarning("task.workflow", "unknown task command", { command: msg.name });
-			await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
-			return Loop.continue(undefined);
+		const handler = commandHandlers[msg.name as TaskQueueName];
+		if (handler) {
+			await handler(loopCtx, msg);
 		}
-
-		try {
-			// Wrap in a step so c.state and c.db are accessible inside mutation functions.
-			await loopCtx.step({
-				name: msg.name,
-				timeout: 10 * 60_000,
-				run: async () => handler(loopCtx, msg),
-			});
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("task.workflow", "task workflow command failed", {
-				command: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch(() => {});
-		}
-
 		return Loop.continue(undefined);
 	});
 }
diff --git a/foundry/packages/backend/src/actors/task/workflow/init.ts b/foundry/packages/backend/src/actors/task/workflow/init.ts
index ffdf1d4..4e6fbb5 100644
--- a/foundry/packages/backend/src/actors/task/workflow/init.ts
+++ b/foundry/packages/backend/src/actors/task/workflow/init.ts
@@ -1,82 +1,156 @@
 // @ts-nocheck
-import { eq } from "drizzle-orm";
+import { desc, eq } from "drizzle-orm";
+import { resolveCreateFlowDecision } from "../../../services/create-flow.js";
+import { resolveWorkspaceGithubAuth } from "../../../services/github-auth.js";
 import { getActorRuntimeContext } from "../../context.js";
-import { selfTask } from "../../handles.js";
-import { resolveErrorMessage } from "../../logging.js";
+import { getOrCreateTaskStatusSync, getOrCreateHistory, getOrCreateProject, getOrCreateSandboxInstance, getSandboxInstance, selfTask } from "../../handles.js";
+import { logActorWarning, resolveErrorMessage } from "../../logging.js";
+import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js";
+import { TASK_ROW_ID, appendHistory, buildAgentPrompt, collectErrorMessages, resolveErrorDetail, setTaskState } from "./common.js";
 import { taskWorkflowQueueName } from "./queue.js";
-import { defaultSandboxProviderId } from "../../../sandbox-config.js";
-import { task as taskTable, taskRuntime } from "../db/schema.js";
-import { TASK_ROW_ID, appendAuditLog, collectErrorMessages, resolveErrorDetail, setTaskState } from "./common.js";
-// task actions called directly (no queue)
+import { enqueuePendingWorkbenchSessions } from "../workbench.js";
+
+const DEFAULT_INIT_CREATE_SANDBOX_ACTIVITY_TIMEOUT_MS = 180_000;
+
+function getInitCreateSandboxActivityTimeoutMs(): number {
+	const raw = process.env.HF_INIT_CREATE_SANDBOX_ACTIVITY_TIMEOUT_MS;
+	if (!raw) {
+		return DEFAULT_INIT_CREATE_SANDBOX_ACTIVITY_TIMEOUT_MS;
+	}
+	const parsed = Number(raw);
+	if (!Number.isFinite(parsed) || parsed <= 0) {
+		return DEFAULT_INIT_CREATE_SANDBOX_ACTIVITY_TIMEOUT_MS;
+	}
+	return Math.floor(parsed);
+}
+
+function debugInit(loopCtx: any, message: string, context?: Record<string, unknown>): void {
+	loopCtx.log.debug({
+		msg: message,
+		scope: "task.init",
+		workspaceId: loopCtx.state.workspaceId,
+		repoId: loopCtx.state.repoId,
+		taskId: loopCtx.state.taskId,
+		...(context ?? {}),
+	});
+}
+
+async function ensureTaskRuntimeCacheColumns(db: any): Promise<void> {
+	await db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_json text`).catch(() => {});
+	await db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_updated_at integer`).catch(() => {});
+	await db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage text`).catch(() => {});
+	await db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage_updated_at integer`).catch(() => {});
+}
+
+async function withActivityTimeout<T>(timeoutMs: number, label: string, run: () => Promise<T>): Promise<T> {
+	let timer: ReturnType<typeof setTimeout> | null = null;
+	try {
+		return await Promise.race([
+			run(),
+			new Promise((_, reject) => {
+				timer = setTimeout(() => {
+					reject(new Error(`${label} timed out after ${timeoutMs}ms`));
+				}, timeoutMs);
+			}),
+		]);
+	} finally {
+		if (timer) {
+			clearTimeout(timer);
+		}
+	}
+}

 export async function initBootstrapDbActivity(loopCtx: any, body: any): Promise<void> {
+	const providerId = body?.providerId ?? loopCtx.state.providerId;
 	const { config } = getActorRuntimeContext();
-	const sandboxProviderId = body?.sandboxProviderId ?? defaultSandboxProviderId(config);
-	const task = body?.task;
-	if (typeof task !== "string" || task.trim().length === 0) {
-		throw new Error("task initialize requires the task prompt");
-	}
 	const now = Date.now();
+	const db = loopCtx.db;
+	const initialStatusMessage = loopCtx.state.branchName && loopCtx.state.title ? "provisioning" : "naming";

-	await loopCtx.db
-		.insert(taskTable)
-		.values({
-			id: TASK_ROW_ID,
-			branchName: body?.branchName ?? null,
-			title: body?.title ?? null,
-			task,
-			sandboxProviderId,
-			status: "init_bootstrap_db",
-			pullRequestJson: null,
-			createdAt: now,
-			updatedAt: now,
-		})
-		.onConflictDoUpdate({
-			target: taskTable.id,
-			set: {
-				branchName: body?.branchName ?? null,
-				title: body?.title ?? null,
-				task,
-				sandboxProviderId,
+	try {
+		await ensureTaskRuntimeCacheColumns(db);
+
+		await db
+			.insert(taskTable)
+			.values({
+				id: TASK_ROW_ID,
+				branchName: loopCtx.state.branchName,
+				title: loopCtx.state.title,
+				task: loopCtx.state.task,
+				providerId,
 				status: "init_bootstrap_db",
-				pullRequestJson: null,
+				agentType: loopCtx.state.agentType ?? config.default_agent,
+				createdAt: now,
 				updatedAt: now,
-			},
-		})
-		.run();
+			})
+			.onConflictDoUpdate({
+				target: taskTable.id,
+				set: {
+					branchName: loopCtx.state.branchName,
+					title: loopCtx.state.title,
+					task: loopCtx.state.task,
+					providerId,
+					status: "init_bootstrap_db",
+					agentType: loopCtx.state.agentType ?? config.default_agent,
+					updatedAt: now,
+				},
+			})
+			.run();

-	await loopCtx.db
-		.insert(taskRuntime)
-		.values({
-			id: TASK_ROW_ID,
-			activeSandboxId: null,
-			activeSwitchTarget: null,
-			activeCwd: null,
-			gitStateJson: null,
-			gitStateUpdatedAt: null,
-			updatedAt: now,
-		})
-		.onConflictDoUpdate({
-			target: taskRuntime.id,
-			set: {
+		await db
+			.insert(taskRuntime)
+			.values({
+				id: TASK_ROW_ID,
 				activeSandboxId: null,
+				activeSessionId: null,
 				activeSwitchTarget: null,
 				activeCwd: null,
+				statusMessage: initialStatusMessage,
+				gitStateJson: null,
+				gitStateUpdatedAt: null,
+				provisionStage: "queued",
+				provisionStageUpdatedAt: now,
 				updatedAt: now,
-			},
-		})
-		.run();
+			})
+			.onConflictDoUpdate({
+				target: taskRuntime.id,
+				set: {
+					activeSandboxId: null,
+					activeSessionId: null,
+					activeSwitchTarget: null,
+					activeCwd: null,
+					statusMessage: initialStatusMessage,
+					provisionStage: "queued",
+					provisionStageUpdatedAt: now,
+					updatedAt: now,
+				},
+			})
+			.run();
+	} catch (error) {
+		const detail = resolveErrorMessage(error);
+		throw new Error(`task init bootstrap db failed: ${detail}`);
+	}
 }

 export async function initEnqueueProvisionActivity(loopCtx: any, body: any): Promise<void> {
-	await setTaskState(loopCtx, "init_enqueue_provision");
-
+	await setTaskState(loopCtx, "init_enqueue_provision", "provision queued");
+	await loopCtx.db
+		.update(taskRuntime)
+		.set({
+			provisionStage: "queued",
+			provisionStageUpdatedAt: Date.now(),
+			updatedAt: Date.now(),
+		})
+		.where(eq(taskRuntime.id, TASK_ROW_ID))
+		.run();
 	const self = selfTask(loopCtx);
 	try {
-		void self.send(taskWorkflowQueueName("task.command.provision"), body ??
{}, { wait: false }).catch(() => {}); - } catch (error) { + await self.send(taskWorkflowQueueName("task.command.provision"), body, { + wait: false, + }); + } catch (error: unknown) { logActorWarning("task.init", "background provision command failed", { - organizationId: loopCtx.state.organizationId, + workspaceId: loopCtx.state.workspaceId, repoId: loopCtx.state.repoId, taskId: loopCtx.state.taskId, error: resolveErrorMessage(error), @@ -85,81 +159,532 @@ export async function initEnqueueProvisionActivity(loopCtx: any, body: any): Pro } } -export async function initCompleteActivity(loopCtx: any, body: any): Promise { - const now = Date.now(); - const { config } = getActorRuntimeContext(); - const sandboxProviderId = body?.sandboxProviderId ?? defaultSandboxProviderId(config); +export async function initEnsureNameActivity(loopCtx: any): Promise { + await setTaskState(loopCtx, "init_ensure_name", "determining title and branch"); + const existing = await loopCtx.db + .select({ + branchName: taskTable.branchName, + title: taskTable.title, + }) + .from(taskTable) + .where(eq(taskTable.id, TASK_ROW_ID)) + .get(); + + if (existing?.branchName && existing?.title) { + loopCtx.state.branchName = existing.branchName; + loopCtx.state.title = existing.title; + return; + } + + const { driver } = getActorRuntimeContext(); + const auth = await resolveWorkspaceGithubAuth(loopCtx, loopCtx.state.workspaceId); + try { + await driver.git.fetch(loopCtx.state.repoLocalPath, { githubToken: auth?.githubToken ?? null }); + } catch (error) { + logActorWarning("task.init", "fetch before naming failed", { + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + taskId: loopCtx.state.taskId, + error: resolveErrorMessage(error), + }); + } + const remoteBranches = (await driver.git.listRemoteBranches(loopCtx.state.repoLocalPath, { githubToken: auth?.githubToken ?? 
null })).map( + (branch: any) => branch.branchName, + ); + + const project = await getOrCreateProject(loopCtx, loopCtx.state.workspaceId, loopCtx.state.repoId, loopCtx.state.repoRemote); + const reservedBranches = await project.listReservedBranches({}); + + const resolved = resolveCreateFlowDecision({ + task: loopCtx.state.task, + explicitTitle: loopCtx.state.explicitTitle ?? undefined, + explicitBranchName: loopCtx.state.explicitBranchName ?? undefined, + localBranches: remoteBranches, + taskBranches: reservedBranches, + }); + + const now = Date.now(); + await loopCtx.db + .update(taskTable) + .set({ + branchName: resolved.branchName, + title: resolved.title, + updatedAt: now, + }) + .where(eq(taskTable.id, TASK_ROW_ID)) + .run(); + + loopCtx.state.branchName = resolved.branchName; + loopCtx.state.title = resolved.title; + loopCtx.state.explicitTitle = null; + loopCtx.state.explicitBranchName = null; - await setTaskState(loopCtx, "init_complete"); await loopCtx.db .update(taskRuntime) .set({ + statusMessage: "provisioning", + provisionStage: "repo_prepared", + provisionStageUpdatedAt: now, updatedAt: now, }) .where(eq(taskRuntime.id, TASK_ROW_ID)) .run(); - await appendAuditLog(loopCtx, "task.initialized", { - payload: { sandboxProviderId }, + await project.registerTaskBranch({ + taskId: loopCtx.state.taskId, + branchName: resolved.branchName, + }); + + await appendHistory(loopCtx, "task.named", { + title: resolved.title, + branchName: resolved.branchName, }); } -export async function initFailedActivity(loopCtx: any, error: unknown, body?: any): Promise { +export async function initAssertNameActivity(loopCtx: any): Promise { + await setTaskState(loopCtx, "init_assert_name", "validating naming"); + if (!loopCtx.state.branchName) { + throw new Error("task branchName is not initialized"); + } +} + +export async function initCreateSandboxActivity(loopCtx: any, body: any): Promise { + await setTaskState(loopCtx, "init_create_sandbox", "creating sandbox"); + await 
loopCtx.db + .update(taskRuntime) + .set({ + provisionStage: "sandbox_allocated", + provisionStageUpdatedAt: Date.now(), + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + const { providers } = getActorRuntimeContext(); + const providerId = body?.providerId ?? loopCtx.state.providerId; + const provider = providers.get(providerId); + const timeoutMs = getInitCreateSandboxActivityTimeoutMs(); + const startedAt = Date.now(); + + debugInit(loopCtx, "init_create_sandbox started", { + providerId, + timeoutMs, + supportsSessionReuse: provider.capabilities().supportsSessionReuse, + }); + + if (provider.capabilities().supportsSessionReuse) { + const runtime = await loopCtx.db.select({ activeSandboxId: taskRuntime.activeSandboxId }).from(taskRuntime).where(eq(taskRuntime.id, TASK_ROW_ID)).get(); + + const existing = await loopCtx.db + .select({ sandboxId: taskSandboxes.sandboxId }) + .from(taskSandboxes) + .where(eq(taskSandboxes.providerId, providerId)) + .orderBy(desc(taskSandboxes.updatedAt)) + .limit(1) + .get(); + + const sandboxId = runtime?.activeSandboxId ?? existing?.sandboxId ?? 
null; + if (sandboxId) { + debugInit(loopCtx, "init_create_sandbox attempting resume", { sandboxId }); + try { + const resumed = await withActivityTimeout(timeoutMs, "resumeSandbox", async () => + provider.resumeSandbox({ + workspaceId: loopCtx.state.workspaceId, + sandboxId, + }), + ); + + debugInit(loopCtx, "init_create_sandbox resume succeeded", { + sandboxId: resumed.sandboxId, + durationMs: Date.now() - startedAt, + }); + return resumed; + } catch (error) { + logActorWarning("task.init", "resume sandbox failed; creating a new sandbox", { + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + taskId: loopCtx.state.taskId, + sandboxId, + error: resolveErrorMessage(error), + }); + } + } + } + + debugInit(loopCtx, "init_create_sandbox creating fresh sandbox", { + branchName: loopCtx.state.branchName, + }); + + try { + const auth = await resolveWorkspaceGithubAuth(loopCtx, loopCtx.state.workspaceId); + const sandbox = await withActivityTimeout(timeoutMs, "createSandbox", async () => + provider.createSandbox({ + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + repoRemote: loopCtx.state.repoRemote, + branchName: loopCtx.state.branchName, + taskId: loopCtx.state.taskId, + githubToken: auth?.githubToken ?? 
null, + debug: (message, context) => debugInit(loopCtx, message, context), + }), + ); + + debugInit(loopCtx, "init_create_sandbox create succeeded", { + sandboxId: sandbox.sandboxId, + durationMs: Date.now() - startedAt, + }); + return sandbox; + } catch (error) { + debugInit(loopCtx, "init_create_sandbox failed", { + durationMs: Date.now() - startedAt, + error: resolveErrorMessage(error), + }); + throw error; + } +} + +export async function initEnsureAgentActivity(loopCtx: any, body: any, sandbox: any): Promise { + await setTaskState(loopCtx, "init_ensure_agent", "ensuring sandbox agent"); + await loopCtx.db + .update(taskRuntime) + .set({ + provisionStage: "agent_installing", + provisionStageUpdatedAt: Date.now(), + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + const { providers } = getActorRuntimeContext(); + const providerId = body?.providerId ?? loopCtx.state.providerId; + const provider = providers.get(providerId); + return await provider.ensureSandboxAgent({ + workspaceId: loopCtx.state.workspaceId, + sandboxId: sandbox.sandboxId, + }); +} + +export async function initStartSandboxInstanceActivity(loopCtx: any, body: any, sandbox: any, agent: any): Promise { + await setTaskState(loopCtx, "init_start_sandbox_instance", "starting sandbox runtime"); + await loopCtx.db + .update(taskRuntime) + .set({ + provisionStage: "agent_starting", + provisionStageUpdatedAt: Date.now(), + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + try { + const providerId = body?.providerId ?? 
loopCtx.state.providerId; + const sandboxInstance = await getOrCreateSandboxInstance(loopCtx, loopCtx.state.workspaceId, providerId, sandbox.sandboxId, { + workspaceId: loopCtx.state.workspaceId, + providerId, + sandboxId: sandbox.sandboxId, + }); + + await sandboxInstance.ensure({ + metadata: sandbox.metadata, + status: "ready", + agentEndpoint: agent.endpoint, + agentToken: agent.token, + }); + + const actorId = typeof (sandboxInstance as any).resolve === "function" ? await (sandboxInstance as any).resolve() : null; + + return { + ok: true as const, + actorId: typeof actorId === "string" ? actorId : null, + }; + } catch (error) { + const detail = error instanceof Error ? error.message : String(error); + return { + ok: false as const, + error: `sandbox-instance ensure failed: ${detail}`, + }; + } +} + +export async function initCreateSessionActivity(loopCtx: any, body: any, sandbox: any, sandboxInstanceReady: any): Promise { + await setTaskState(loopCtx, "init_create_session", "creating agent session"); + await loopCtx.db + .update(taskRuntime) + .set({ + provisionStage: "session_creating", + provisionStageUpdatedAt: Date.now(), + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + if (!sandboxInstanceReady.ok) { + return { + id: null, + status: "error", + error: sandboxInstanceReady.error ?? "sandbox instance is not ready", + } as const; + } + + const { config } = getActorRuntimeContext(); + const providerId = body?.providerId ?? loopCtx.state.providerId; + const sandboxInstance = getSandboxInstance(loopCtx, loopCtx.state.workspaceId, providerId, sandbox.sandboxId); + + const cwd = sandbox.metadata && typeof (sandbox.metadata as any).cwd === "string" ? ((sandbox.metadata as any).cwd as string) : undefined; + + return await sandboxInstance.createSession({ + prompt: typeof loopCtx.state.initialPrompt === "string" ? loopCtx.state.initialPrompt : buildAgentPrompt(loopCtx.state.task), + cwd, + agent: (loopCtx.state.agentType ?? 
config.default_agent) as any, + }); +} + +export async function initExposeSandboxActivity(loopCtx: any, body: any, sandbox: any, sandboxInstanceReady?: { actorId?: string | null }): Promise { + const providerId = body?.providerId ?? loopCtx.state.providerId; + const now = Date.now(); + const db = loopCtx.db; + const activeCwd = sandbox.metadata && typeof (sandbox.metadata as any).cwd === "string" ? ((sandbox.metadata as any).cwd as string) : null; + const sandboxActorId = typeof sandboxInstanceReady?.actorId === "string" && sandboxInstanceReady.actorId.length > 0 ? sandboxInstanceReady.actorId : null; + + await db + .insert(taskSandboxes) + .values({ + sandboxId: sandbox.sandboxId, + providerId, + sandboxActorId, + switchTarget: sandbox.switchTarget, + cwd: activeCwd, + statusMessage: "sandbox ready", + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskSandboxes.sandboxId, + set: { + providerId, + sandboxActorId, + switchTarget: sandbox.switchTarget, + cwd: activeCwd, + statusMessage: "sandbox ready", + updatedAt: now, + }, + }) + .run(); + + await db + .update(taskRuntime) + .set({ + activeSandboxId: sandbox.sandboxId, + activeSwitchTarget: sandbox.switchTarget, + activeCwd, + statusMessage: "sandbox ready", + updatedAt: now, + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); +} + +export async function initWriteDbActivity( + loopCtx: any, + body: any, + sandbox: any, + session: any, + sandboxInstanceReady?: { actorId?: string | null }, +): Promise { + await setTaskState(loopCtx, "init_write_db", "persisting task runtime"); + const providerId = body?.providerId ?? loopCtx.state.providerId; + const { config } = getActorRuntimeContext(); + const now = Date.now(); + const db = loopCtx.db; + const sessionId = session?.id ?? null; + const sessionHealthy = Boolean(sessionId) && session?.status !== "error"; + const activeSessionId = sessionHealthy ? sessionId : null; + const statusMessage = sessionHealthy ? 
"session created" : session?.status === "error" ? (session.error ?? "session create failed") : "session unavailable"; + + const activeCwd = sandbox.metadata && typeof (sandbox.metadata as any).cwd === "string" ? ((sandbox.metadata as any).cwd as string) : null; + const sandboxActorId = typeof sandboxInstanceReady?.actorId === "string" && sandboxInstanceReady.actorId.length > 0 ? sandboxInstanceReady.actorId : null; + + await db + .update(taskTable) + .set({ + providerId, + status: sessionHealthy ? "running" : "error", + agentType: loopCtx.state.agentType ?? config.default_agent, + updatedAt: now, + }) + .where(eq(taskTable.id, TASK_ROW_ID)) + .run(); + + await db + .insert(taskSandboxes) + .values({ + sandboxId: sandbox.sandboxId, + providerId, + sandboxActorId, + switchTarget: sandbox.switchTarget, + cwd: activeCwd, + statusMessage, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskSandboxes.sandboxId, + set: { + providerId, + sandboxActorId, + switchTarget: sandbox.switchTarget, + cwd: activeCwd, + statusMessage, + updatedAt: now, + }, + }) + .run(); + + await db + .insert(taskRuntime) + .values({ + id: TASK_ROW_ID, + activeSandboxId: sandbox.sandboxId, + activeSessionId, + activeSwitchTarget: sandbox.switchTarget, + activeCwd, + statusMessage, + provisionStage: sessionHealthy ? "ready" : "error", + provisionStageUpdatedAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskRuntime.id, + set: { + activeSandboxId: sandbox.sandboxId, + activeSessionId, + activeSwitchTarget: sandbox.switchTarget, + activeCwd, + statusMessage, + provisionStage: sessionHealthy ? "ready" : "error", + provisionStageUpdatedAt: now, + updatedAt: now, + }, + }) + .run(); +} + +export async function initStartStatusSyncActivity(loopCtx: any, body: any, sandbox: any, session: any): Promise { + const sessionId = session?.id ?? 
null; + if (!sessionId || session?.status === "error") { + return; + } + + await setTaskState(loopCtx, "init_start_status_sync", "starting session status sync"); + const providerId = body?.providerId ?? loopCtx.state.providerId; + const sync = await getOrCreateTaskStatusSync(loopCtx, loopCtx.state.workspaceId, loopCtx.state.repoId, loopCtx.state.taskId, sandbox.sandboxId, sessionId, { + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + taskId: loopCtx.state.taskId, + providerId, + sandboxId: sandbox.sandboxId, + sessionId, + intervalMs: 2_000, + }); + + await sync.start(); + await sync.force(); +} + +export async function initCompleteActivity(loopCtx: any, body: any, sandbox: any, session: any): Promise { + const providerId = body?.providerId ?? loopCtx.state.providerId; + const sessionId = session?.id ?? null; + const sessionHealthy = Boolean(sessionId) && session?.status !== "error"; + if (sessionHealthy) { + await setTaskState(loopCtx, "init_complete", "task initialized"); + + const history = await getOrCreateHistory(loopCtx, loopCtx.state.workspaceId, loopCtx.state.repoId); + await history.append({ + kind: "task.initialized", + taskId: loopCtx.state.taskId, + branchName: loopCtx.state.branchName, + payload: { providerId, sandboxId: sandbox.sandboxId, sessionId }, + }); + + loopCtx.state.initialized = true; + await enqueuePendingWorkbenchSessions(loopCtx); + const self = selfTask(loopCtx); + await self.send(taskWorkflowQueueName("task.command.workbench.refresh_derived"), {}, { wait: false }); + if (sessionId) { + await self.send(taskWorkflowQueueName("task.command.workbench.refresh_session_transcript"), { sessionId }, { wait: false }); + } + return; + } + + const detail = session?.status === "error" ? (session.error ?? 
"session create failed") : "session unavailable"; + await setTaskState(loopCtx, "error", detail); + await appendHistory(loopCtx, "task.error", { + detail, + messages: [detail], + }); + loopCtx.state.initialized = false; +} + +export async function initFailedActivity(loopCtx: any, error: unknown): Promise { const now = Date.now(); const detail = resolveErrorDetail(error); const messages = collectErrorMessages(error); - const { config } = getActorRuntimeContext(); - const sandboxProviderId = defaultSandboxProviderId(config); - const task = typeof body?.task === "string" ? body.task : null; + const db = loopCtx.db; + const { config, providers } = getActorRuntimeContext(); + const providerId = loopCtx.state.providerId ?? providers.defaultProviderId(); - await loopCtx.db + await db .insert(taskTable) .values({ id: TASK_ROW_ID, - branchName: body?.branchName ?? null, - title: body?.title ?? null, - task: task ?? detail, - sandboxProviderId, + branchName: loopCtx.state.branchName ?? null, + title: loopCtx.state.title ?? null, + task: loopCtx.state.task, + providerId, status: "error", - pullRequestJson: null, + agentType: loopCtx.state.agentType ?? config.default_agent, createdAt: now, updatedAt: now, }) .onConflictDoUpdate({ target: taskTable.id, set: { - branchName: body?.branchName ?? null, - title: body?.title ?? null, - task: task ?? detail, - sandboxProviderId, + branchName: loopCtx.state.branchName ?? null, + title: loopCtx.state.title ?? null, + task: loopCtx.state.task, + providerId, status: "error", - pullRequestJson: null, + agentType: loopCtx.state.agentType ?? 
config.default_agent, updatedAt: now, }, }) .run(); - await loopCtx.db + await db .insert(taskRuntime) .values({ id: TASK_ROW_ID, activeSandboxId: null, + activeSessionId: null, activeSwitchTarget: null, activeCwd: null, + statusMessage: detail, + provisionStage: "error", + provisionStageUpdatedAt: now, updatedAt: now, }) .onConflictDoUpdate({ target: taskRuntime.id, set: { activeSandboxId: null, + activeSessionId: null, activeSwitchTarget: null, activeCwd: null, + statusMessage: detail, + provisionStage: "error", + provisionStageUpdatedAt: now, updatedAt: now, }, }) .run(); - await appendAuditLog(loopCtx, "task.error", { + await appendHistory(loopCtx, "task.error", { detail, messages, }); diff --git a/foundry/packages/backend/src/actors/task/workflow/push.ts b/foundry/packages/backend/src/actors/task/workflow/push.ts index f15ab0b..7ee929d 100644 --- a/foundry/packages/backend/src/actors/task/workflow/push.ts +++ b/foundry/packages/backend/src/actors/task/workflow/push.ts @@ -1,7 +1,8 @@ // @ts-nocheck -import { getTaskSandbox } from "../../handles.js"; -import { resolveOrganizationGithubAuth } from "../../../services/github-auth.js"; -import { appendAuditLog, getCurrentRecord } from "./common.js"; +import { eq } from "drizzle-orm"; +import { getActorRuntimeContext } from "../../context.js"; +import { taskRuntime, taskSandboxes } from "../db/schema.js"; +import { TASK_ROW_ID, appendHistory, getCurrentRecord } from "./common.js"; export interface PushActiveBranchOptions { reason?: string | null; @@ -11,7 +12,7 @@ export interface PushActiveBranchOptions { export async function pushActiveBranchActivity(loopCtx: any, options: PushActiveBranchOptions = {}): Promise { const record = await getCurrentRecord(loopCtx); const activeSandboxId = record.activeSandboxId; - const branchName = record.branchName; + const branchName = loopCtx.state.branchName ?? 
record.branchName; if (!activeSandboxId) { throw new Error("cannot push: no active sandbox"); @@ -21,11 +22,28 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive } const activeSandbox = record.sandboxes.find((sandbox: any) => sandbox.sandboxId === activeSandboxId) ?? null; + const providerId = activeSandbox?.providerId ?? record.providerId; const cwd = activeSandbox?.cwd ?? null; if (!cwd) { throw new Error("cannot push: active sandbox cwd is not set"); } + const { providers } = getActorRuntimeContext(); + const provider = providers.get(providerId); + + const now = Date.now(); + await loopCtx.db + .update(taskRuntime) + .set({ statusMessage: `pushing branch ${branchName}`, updatedAt: now }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + + await loopCtx.db + .update(taskSandboxes) + .set({ statusMessage: `pushing branch ${branchName}`, updatedAt: now }) + .where(eq(taskSandboxes.sandboxId, activeSandboxId)) + .run(); + const script = [ "set -euo pipefail", `cd ${JSON.stringify(cwd)}`, @@ -34,26 +52,31 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive `git push -u origin ${JSON.stringify(branchName)}`, ].join("; "); - const sandbox = getTaskSandbox(loopCtx, loopCtx.state.organizationId, activeSandboxId); - const auth = await resolveOrganizationGithubAuth(loopCtx, loopCtx.state.organizationId); - const result = await sandbox.runProcess({ - command: "bash", - args: ["-lc", script], - cwd: "/", - env: auth?.githubToken - ? { - GH_TOKEN: auth.githubToken, - GITHUB_TOKEN: auth.githubToken, - } - : undefined, - timeoutMs: 5 * 60_000, + const result = await provider.executeCommand({ + workspaceId: loopCtx.state.workspaceId, + sandboxId: activeSandboxId, + command: ["bash", "-lc", JSON.stringify(script)].join(" "), + label: `git push ${branchName}`, }); - if ((result.exitCode ?? 0) !== 0) { - throw new Error(`git push failed (${result.exitCode ?? 
1}): ${[result.stdout, result.stderr].filter(Boolean).join("")}`); + if (result.exitCode !== 0) { + throw new Error(`git push failed (${result.exitCode}): ${result.result}`); } - await appendAuditLog(loopCtx, options.historyKind ?? "task.push", { + const updatedAt = Date.now(); + await loopCtx.db + .update(taskRuntime) + .set({ statusMessage: `push complete for ${branchName}`, updatedAt }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + + await loopCtx.db + .update(taskSandboxes) + .set({ statusMessage: `push complete for ${branchName}`, updatedAt }) + .where(eq(taskSandboxes.sandboxId, activeSandboxId)) + .run(); + + await appendHistory(loopCtx, options.historyKind ?? "task.push", { reason: options.reason ?? null, branchName, sandboxId: activeSandboxId, diff --git a/foundry/packages/backend/src/actors/task/workflow/queue.ts b/foundry/packages/backend/src/actors/task/workflow/queue.ts index a49c39a..db5c0a3 100644 --- a/foundry/packages/backend/src/actors/task/workflow/queue.ts +++ b/foundry/packages/backend/src/actors/task/workflow/queue.ts @@ -8,19 +8,27 @@ export const TASK_QUEUE_NAMES = [ "task.command.merge", "task.command.archive", "task.command.kill", - "task.command.workspace.create_session", - "task.command.workspace.create_session_and_send", - "task.command.workspace.ensure_session", - "task.command.workspace.send_message", - "task.command.workspace.stop_session", - "task.command.workspace.close_session", - "task.command.workspace.publish_pr", - "task.command.workspace.revert_file", - "task.command.workspace.change_owner", + "task.command.get", + "task.command.workbench.mark_unread", + "task.command.workbench.rename_task", + "task.command.workbench.rename_branch", + "task.command.workbench.create_session", + "task.command.workbench.ensure_session", + "task.command.workbench.rename_session", + "task.command.workbench.set_session_unread", + "task.command.workbench.update_draft", + "task.command.workbench.change_model", + 
"task.command.workbench.send_message",
+	"task.command.workbench.stop_session",
+	"task.command.workbench.sync_session_status",
+	"task.command.workbench.refresh_derived",
+	"task.command.workbench.refresh_session_transcript",
+	"task.command.workbench.close_session",
+	"task.command.workbench.publish_pr",
+	"task.command.workbench.revert_file",
+	"task.status_sync.result",
 ] as const;
 
-export type TaskQueueName = (typeof TASK_QUEUE_NAMES)[number];
-
 export function taskWorkflowQueueName(name: string): string {
 	return name;
 }
diff --git a/foundry/packages/backend/src/actors/task/workflow/status-sync.ts b/foundry/packages/backend/src/actors/task/workflow/status-sync.ts
new file mode 100644
index 0000000..ea3b0c8
--- /dev/null
+++ b/foundry/packages/backend/src/actors/task/workflow/status-sync.ts
@@ -0,0 +1,148 @@
+// @ts-nocheck
+import { eq } from "drizzle-orm";
+import { getActorRuntimeContext } from "../../context.js";
+import { logActorWarning, resolveErrorMessage } from "../../logging.js";
+import { resolveWorkspaceGithubAuth } from "../../../services/github-auth.js";
+import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js";
+import { TASK_ROW_ID, appendHistory, resolveErrorDetail } from "./common.js";
+import { pushActiveBranchActivity } from "./push.js";
+
+function mapSessionStatus(status: "running" | "idle" | "error") {
+  if (status === "idle") return "idle";
+  if (status === "error") return "error";
+  return "running";
+}
+
+export async function statusUpdateActivity(loopCtx: any, body: any): Promise<boolean> {
+  const newStatus = mapSessionStatus(body.status);
+  const wasIdle = loopCtx.state.previousStatus === "idle";
+  const didTransition = newStatus === "idle" && !wasIdle;
+  const isDuplicateStatus = loopCtx.state.previousStatus === newStatus;
+
+  if (isDuplicateStatus) {
+    return false;
+  }
+
+  const db = loopCtx.db;
+  const runtime = await db
+    .select({
+      activeSandboxId: taskRuntime.activeSandboxId,
+      activeSessionId: 
taskRuntime.activeSessionId, + }) + .from(taskRuntime) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .get(); + + const isActive = runtime?.activeSandboxId === body.sandboxId && runtime?.activeSessionId === body.sessionId; + + if (isActive) { + await db.update(taskTable).set({ status: newStatus, updatedAt: body.at }).where(eq(taskTable.id, TASK_ROW_ID)).run(); + + await db + .update(taskRuntime) + .set({ statusMessage: `session:${body.status}`, updatedAt: body.at }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + } + + await db + .update(taskSandboxes) + .set({ statusMessage: `session:${body.status}`, updatedAt: body.at }) + .where(eq(taskSandboxes.sandboxId, body.sandboxId)) + .run(); + + await appendHistory(loopCtx, "task.status", { + status: body.status, + sessionId: body.sessionId, + sandboxId: body.sandboxId, + }); + + if (isActive) { + loopCtx.state.previousStatus = newStatus; + + const { driver } = getActorRuntimeContext(); + if (loopCtx.state.branchName) { + driver.tmux.setWindowStatus(loopCtx.state.branchName, newStatus); + } + return didTransition; + } + + return false; +} + +export async function idleSubmitPrActivity(loopCtx: any): Promise { + const { driver } = getActorRuntimeContext(); + const db = loopCtx.db; + + const self = await db.select({ prSubmitted: taskTable.prSubmitted }).from(taskTable).where(eq(taskTable.id, TASK_ROW_ID)).get(); + + if (self && self.prSubmitted) return; + + const auth = await resolveWorkspaceGithubAuth(loopCtx, loopCtx.state.workspaceId); + + try { + await driver.git.fetch(loopCtx.state.repoLocalPath, { githubToken: auth?.githubToken ?? 
null }); + } catch (error) { + logActorWarning("task.status-sync", "fetch before PR submit failed", { + workspaceId: loopCtx.state.workspaceId, + repoId: loopCtx.state.repoId, + taskId: loopCtx.state.taskId, + error: resolveErrorMessage(error), + }); + } + + if (!loopCtx.state.branchName || !loopCtx.state.title) { + throw new Error("cannot submit PR before task has a branch and title"); + } + + try { + await pushActiveBranchActivity(loopCtx, { + reason: "auto_submit_idle", + historyKind: "task.push.auto", + }); + + const pr = await driver.github.createPr(loopCtx.state.repoLocalPath, loopCtx.state.branchName, loopCtx.state.title, undefined, { + githubToken: auth?.githubToken ?? null, + }); + + await db.update(taskTable).set({ prSubmitted: 1, updatedAt: Date.now() }).where(eq(taskTable.id, TASK_ROW_ID)).run(); + + await appendHistory(loopCtx, "task.step", { + step: "pr_submit", + taskId: loopCtx.state.taskId, + branchName: loopCtx.state.branchName, + prUrl: pr.url, + prNumber: pr.number, + }); + + await appendHistory(loopCtx, "task.pr_created", { + taskId: loopCtx.state.taskId, + branchName: loopCtx.state.branchName, + prUrl: pr.url, + prNumber: pr.number, + }); + } catch (error) { + const detail = resolveErrorDetail(error); + await db + .update(taskRuntime) + .set({ + statusMessage: `pr submit failed: ${detail}`, + updatedAt: Date.now(), + }) + .where(eq(taskRuntime.id, TASK_ROW_ID)) + .run(); + + await appendHistory(loopCtx, "task.pr_create_failed", { + taskId: loopCtx.state.taskId, + branchName: loopCtx.state.branchName, + error: detail, + }); + } +} + +export async function idleNotifyActivity(loopCtx: any): Promise { + const { notifications } = getActorRuntimeContext(); + if (notifications && loopCtx.state.branchName) { + await notifications.agentIdle(loopCtx.state.branchName); + } +} diff --git a/foundry/packages/backend/src/actors/task/workspace.ts b/foundry/packages/backend/src/actors/task/workspace.ts deleted file mode 100644 index 0856947..0000000 --- 
a/foundry/packages/backend/src/actors/task/workspace.ts +++ /dev/null @@ -1,1651 +0,0 @@ -// @ts-nocheck -import { randomUUID } from "node:crypto"; -import { basename } from "node:path"; -import { asc, eq } from "drizzle-orm"; -import { - DEFAULT_WORKSPACE_MODEL_GROUPS, - DEFAULT_WORKSPACE_MODEL_ID, - workspaceAgentForModel, - workspaceSandboxAgentIdForModel, -} from "@sandbox-agent/foundry-shared"; -import { getActorRuntimeContext } from "../context.js"; -import { getOrCreateOrganization, getOrCreateTaskSandbox, getOrCreateUser, getTaskSandbox, selfTask } from "../handles.js"; -import { logActorInfo, logActorWarning, resolveErrorMessage } from "../logging.js"; -import { resolveSandboxProviderId } from "../../sandbox-config.js"; -import { getBetterAuthService } from "../../services/better-auth.js"; -import { resolveOrganizationGithubAuth } from "../../services/github-auth.js"; -import { githubRepoFullNameFromRemote } from "../../services/repo.js"; -import { taskWorkflowQueueName } from "./workflow/queue.js"; -import { organizationWorkflowQueueName } from "../organization/queues.js"; - -import { task as taskTable, taskOwner, taskRuntime, taskSandboxes, taskWorkspaceSessions } from "./db/schema.js"; -import { getCurrentRecord } from "./workflow/common.js"; - -function emptyGitState() { - return { - fileChanges: [], - diffs: {}, - fileTree: [], - updatedAt: null as number | null, - }; -} - -const FALLBACK_MODEL = DEFAULT_WORKSPACE_MODEL_ID; - -function agentKindForModel(model: string) { - return workspaceAgentForModel(model); -} - -export function sandboxAgentIdForModel(model: string) { - return workspaceSandboxAgentIdForModel(model); -} - -async function resolveWorkspaceModelGroups(c: any): Promise { - try { - const sandbox = await getOrCreateTaskSandbox(c, c.state.organizationId, stableSandboxId(c)); - const groups = await sandbox.listWorkspaceModelGroups(); - return Array.isArray(groups) && groups.length > 0 ? 
groups : DEFAULT_WORKSPACE_MODEL_GROUPS; - } catch { - return DEFAULT_WORKSPACE_MODEL_GROUPS; - } -} - -async function resolveSandboxAgentForModel(c: any, model: string): Promise { - const groups = await resolveWorkspaceModelGroups(c); - return workspaceSandboxAgentIdForModel(model, groups); -} - -function repoLabelFromRemote(remoteUrl: string): string { - const trimmed = remoteUrl.trim(); - try { - const url = new URL(trimmed.startsWith("http") ? trimmed : `https://${trimmed}`); - const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); - if (parts.length >= 2) { - return `${parts[0]}/${(parts[1] ?? "").replace(/\.git$/, "")}`; - } - } catch { - // ignore - } - - return basename(trimmed.replace(/\.git$/, "")); -} - -async function getRepositoryMetadata(c: any): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> { - const organization = await getOrCreateOrganization(c, c.state.organizationId); - return await organization.getRepositoryMetadata({ repoId: c.state.repoId }); -} - -function parseDraftAttachments(value: string | null | undefined): Array { - if (!value) { - return []; - } - - try { - const parsed = JSON.parse(value) as unknown; - return Array.isArray(parsed) ? parsed : []; - } catch { - return []; - } -} - -function parseTranscript(value: string | null | undefined): Array { - if (!value) { - return []; - } - - try { - const parsed = JSON.parse(value) as unknown; - return Array.isArray(parsed) ? parsed : []; - } catch { - return []; - } -} - -function parseGitState(value: string | null | undefined): { fileChanges: Array; diffs: Record; fileTree: Array } { - if (!value) { - return emptyGitState(); - } - - try { - const parsed = JSON.parse(value) as { - fileChanges?: unknown; - diffs?: unknown; - fileTree?: unknown; - }; - return { - fileChanges: Array.isArray(parsed.fileChanges) ? parsed.fileChanges : [], - diffs: parsed.diffs && typeof parsed.diffs === "object" ? 
(parsed.diffs as Record) : {}, - fileTree: Array.isArray(parsed.fileTree) ? parsed.fileTree : [], - }; - } catch { - return emptyGitState(); - } -} - -async function readTaskOwner(c: any): Promise<{ - primaryUserId: string | null; - primaryGithubLogin: string | null; - primaryGithubEmail: string | null; - primaryGithubAvatarUrl: string | null; -} | null> { - const row = await c.db.select().from(taskOwner).where(eq(taskOwner.id, 1)).get(); - if (!row) { - return null; - } - return { - primaryUserId: row.primaryUserId ?? null, - primaryGithubLogin: row.primaryGithubLogin ?? null, - primaryGithubEmail: row.primaryGithubEmail ?? null, - primaryGithubAvatarUrl: row.primaryGithubAvatarUrl ?? null, - }; -} - -async function upsertTaskOwner( - c: any, - owner: { primaryUserId: string; primaryGithubLogin: string; primaryGithubEmail: string; primaryGithubAvatarUrl: string | null }, -): Promise<void> { - const now = Date.now(); - await c.db - .insert(taskOwner) - .values({ - id: 1, - primaryUserId: owner.primaryUserId, - primaryGithubLogin: owner.primaryGithubLogin, - primaryGithubEmail: owner.primaryGithubEmail, - primaryGithubAvatarUrl: owner.primaryGithubAvatarUrl, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: taskOwner.id, - set: { - primaryUserId: owner.primaryUserId, - primaryGithubLogin: owner.primaryGithubLogin, - primaryGithubEmail: owner.primaryGithubEmail, - primaryGithubAvatarUrl: owner.primaryGithubAvatarUrl, - updatedAt: now, - }, - }) - .run(); -} - -/** - * Inject the user's GitHub OAuth token into the sandbox as a git credential store file. - * Also configures git user.name and user.email so commits are attributed correctly. - * The credential file is overwritten on each owner swap. - * - * Race condition note: If User A sends a message and the agent starts a long git operation, - * then User B triggers an owner swap, the in-flight git process still has User A's credentials - * (already read from the credential store). 
The next git operation uses User B's credentials. - */ -async function injectGitCredentials(sandbox: any, login: string, email: string, token: string): Promise<void> { - const script = [ - "set -euo pipefail", - `git config --global user.name ${JSON.stringify(login)}`, - `git config --global user.email ${JSON.stringify(email)}`, - `git config --global credential.helper 'store --file=$HOME/.git-token'`, - `printf '%s\\n' ${JSON.stringify(`https://${login}:${token}@github.com`)} > $HOME/.git-token`, - `chmod 600 $HOME/.git-token`, - ]; - const result = await sandbox.runProcess({ - command: "bash", - args: ["-lc", script.join("; ")], - cwd: "/", - timeoutMs: 30_000, - }); - if ((result.exitCode ?? 0) !== 0) { - logActorWarning("task", "git credential injection failed", { - exitCode: result.exitCode, - output: [result.stdout, result.stderr].filter(Boolean).join(""), - }); - } -} - -/** - * Resolves the current user's GitHub identity from their auth session. - * Returns null if the session is invalid or the user has no GitHub account. - */ -async function resolveGithubIdentity(authSessionId: string): Promise<{ - userId: string; - login: string; - email: string; - avatarUrl: string | null; - accessToken: string; -} | null> { - const authService = getBetterAuthService(); - const authState = await authService.getAuthState(authSessionId); - if (!authState?.user?.id) { - return null; - } - - const tokenResult = await authService.getAccessTokenForSession(authSessionId); - if (!tokenResult?.accessToken) { - return null; - } - - const githubAccount = authState.accounts?.find((account: any) => account.providerId === "github"); - if (!githubAccount) { - return null; - } - - // Resolve the GitHub login from the API since Better Auth only stores the - // numeric account ID, not the login username. - let login = authState.user.name ?? "unknown"; - let avatarUrl = authState.user.image ?? 
null; - try { - const resp = await fetch("https://api.github.com/user", { - headers: { - Authorization: `Bearer ${tokenResult.accessToken}`, - Accept: "application/vnd.github+json", - }, - }); - if (resp.ok) { - const ghUser = (await resp.json()) as { login?: string; avatar_url?: string }; - if (ghUser.login) { - login = ghUser.login; - } - if (ghUser.avatar_url) { - avatarUrl = ghUser.avatar_url; - } - } - } catch (error) { - console.warn("resolveGithubIdentity: failed to fetch GitHub user", error); - } - - return { - userId: authState.user.id, - login, - email: authState.user.email ?? `${githubAccount.accountId}@users.noreply.github.com`, - avatarUrl, - accessToken: tokenResult.accessToken, - }; -} - -/** - * Check if the task owner needs to swap, and if so, update the owner record - * and inject new git credentials into the sandbox. - * Returns true if an owner swap occurred. - */ -async function maybeSwapTaskOwner(c: any, authSessionId: string | null | undefined, sandbox: any | null): Promise { - if (!authSessionId) { - return false; - } - - const identity = await resolveGithubIdentity(authSessionId); - if (!identity) { - return false; - } - - const currentOwner = await readTaskOwner(c); - if (currentOwner?.primaryUserId === identity.userId) { - return false; - } - - await upsertTaskOwner(c, { - primaryUserId: identity.userId, - primaryGithubLogin: identity.login, - primaryGithubEmail: identity.email, - primaryGithubAvatarUrl: identity.avatarUrl, - }); - - if (sandbox) { - await injectGitCredentials(sandbox, identity.login, identity.email, identity.accessToken); - } - - return true; -} - -/** - * Manually change the task owner. Updates the owner record and broadcasts the - * change to subscribers. Git credentials are NOT injected here — they will be - * injected the next time the target user sends a message (auto-swap path). 
- */ -export async function changeTaskOwnerManually( - c: any, - input: { primaryUserId: string; primaryGithubLogin: string; primaryGithubEmail: string; primaryGithubAvatarUrl: string | null }, -): Promise<void> { - await upsertTaskOwner(c, input); - await broadcastTaskUpdate(c); -} - -export function shouldMarkSessionUnreadForStatus(meta: { thinkingSinceMs?: number | null }, status: "running" | "idle" | "error"): boolean { - if (status === "running") { - return false; - } - - // Only mark unread when we observe the transition out of an active thinking state. - // Repeated idle polls for an already-finished session must not flip unread back on. - return Boolean(meta.thinkingSinceMs); -} - -export function shouldRecreateSessionForModelChange(meta: { - status: "pending_provision" | "pending_session_create" | "ready" | "error"; - sandboxSessionId?: string | null; - created?: boolean; - transcript?: Array; -}): boolean { - if (meta.status !== "ready" || !meta.sandboxSessionId) { - return false; - } - - if (meta.created) { - return false; - } - - return !Array.isArray(meta.transcript) || meta.transcript.length === 0; -} - -async function listSessionMetaRows(c: any, options?: { includeClosed?: boolean }): Promise> { - const rows = await c.db.select().from(taskWorkspaceSessions).orderBy(asc(taskWorkspaceSessions.createdAt)).all(); - const mapped = rows.map((row: any) => ({ - ...row, - id: row.sessionId, - sessionId: row.sessionId, - sandboxSessionId: row.sandboxSessionId ?? null, - status: row.status ?? "ready", - errorMessage: row.errorMessage ?? null, - transcript: parseTranscript(row.transcriptJson), - transcriptUpdatedAt: row.transcriptUpdatedAt ?? 
null, - created: row.created === 1, - closed: row.closed === 1, - })); - - if (options?.includeClosed === true) { - return mapped; - } - - return mapped.filter((row: any) => row.closed !== true); -} - -async function nextSessionName(c: any): Promise { - const rows = await listSessionMetaRows(c, { includeClosed: true }); - return `Session ${rows.length + 1}`; -} - -async function readSessionMeta(c: any, sessionId: string): Promise { - const row = await c.db.select().from(taskWorkspaceSessions).where(eq(taskWorkspaceSessions.sessionId, sessionId)).get(); - - if (!row) { - return null; - } - - return { - ...row, - id: row.sessionId, - sessionId: row.sessionId, - sandboxSessionId: row.sandboxSessionId ?? null, - status: row.status ?? "ready", - errorMessage: row.errorMessage ?? null, - transcript: parseTranscript(row.transcriptJson), - transcriptUpdatedAt: row.transcriptUpdatedAt ?? null, - created: row.created === 1, - closed: row.closed === 1, - }; -} - -async function getUserTaskState(c: any, authSessionId?: string | null): Promise<{ activeSessionId: string | null; bySessionId: Map }> { - if (!authSessionId) { - return { activeSessionId: null, bySessionId: new Map() }; - } - - const authState = await getBetterAuthService().getAuthState(authSessionId); - const userId = authState?.user?.id; - if (typeof userId !== "string" || userId.length === 0) { - return { activeSessionId: null, bySessionId: new Map() }; - } - - const user = await getOrCreateUser(c, userId); - const state = await user.getTaskState({ taskId: c.state.taskId }); - const bySessionId = new Map( - (state?.sessions ?? []).map((row: any) => [ - row.sessionId, - { - unread: Boolean(row.unread), - draftText: row.draftText ?? "", - draftAttachments: parseDraftAttachments(row.draftAttachmentsJson), - draftUpdatedAtMs: row.draftUpdatedAt ?? null, - }, - ]), - ); - return { - activeSessionId: state?.activeSessionId ?? 
null, - bySessionId, - }; -} - -async function upsertUserTaskState(c: any, authSessionId: string | null | undefined, sessionId: string, patch: Record): Promise<void> { - if (!authSessionId) { - return; - } - - const authState = await getBetterAuthService().getAuthState(authSessionId); - const userId = authState?.user?.id; - if (typeof userId !== "string" || userId.length === 0) { - return; - } - - const user = await getOrCreateUser(c, userId); - await user.upsertTaskState({ - taskId: c.state.taskId, - sessionId, - patch, - }); -} - -async function deleteUserTaskState(c: any, authSessionId: string | null | undefined, sessionId: string): Promise<void> { - if (!authSessionId) { - return; - } - - const authState = await getBetterAuthService().getAuthState(authSessionId); - const userId = authState?.user?.id; - if (typeof userId !== "string" || userId.length === 0) { - return; - } - - const user = await getOrCreateUser(c, userId); - await user.deleteTaskState({ - taskId: c.state.taskId, - sessionId, - }); -} - -async function resolveDefaultModel(c: any, authSessionId?: string | null): Promise<string> { - if (!authSessionId) { - return FALLBACK_MODEL; - } - - const authState = await getBetterAuthService().getAuthState(authSessionId); - const userId = authState?.user?.id; - if (typeof userId !== "string" || userId.length === 0) { - return FALLBACK_MODEL; - } - - const user = await getOrCreateUser(c, userId); - const userState = await user.getAppAuthState({ sessionId: authSessionId }); - return userState?.profile?.defaultModel ?? 
FALLBACK_MODEL; -} - -async function ensureSessionMeta( - c: any, - params: { - sessionId: string; - sandboxSessionId?: string | null; - model?: string; - authSessionId?: string | null; - sessionName?: string; - created?: boolean; - status?: "pending_provision" | "pending_session_create" | "ready" | "error"; - errorMessage?: string | null; - }, -): Promise { - const existing = await readSessionMeta(c, params.sessionId); - if (existing) { - return existing; - } - - const now = Date.now(); - const sessionName = params.sessionName ?? (await nextSessionName(c)); - const model = params.model ?? (await resolveDefaultModel(c, params.authSessionId)); - - await c.db - .insert(taskWorkspaceSessions) - .values({ - sessionId: params.sessionId, - sandboxSessionId: params.sandboxSessionId ?? null, - sessionName, - model, - status: params.status ?? "ready", - errorMessage: params.errorMessage ?? null, - transcriptJson: "[]", - transcriptUpdatedAt: null, - created: params.created === false ? 0 : 1, - closed: 0, - thinkingSinceMs: null, - createdAt: now, - updatedAt: now, - }) - .run(); - - return await readSessionMeta(c, params.sessionId); -} - -async function updateSessionMeta(c: any, sessionId: string, values: Record): Promise { - await ensureSessionMeta(c, { sessionId }); - await c.db - .update(taskWorkspaceSessions) - .set({ - ...values, - updatedAt: Date.now(), - }) - .where(eq(taskWorkspaceSessions.sessionId, sessionId)) - .run(); - return await readSessionMeta(c, sessionId); -} - -async function readSessionMetaBySandboxSessionId(c: any, sandboxSessionId: string): Promise { - const row = await c.db.select().from(taskWorkspaceSessions).where(eq(taskWorkspaceSessions.sandboxSessionId, sandboxSessionId)).get(); - if (!row) { - return null; - } - return await readSessionMeta(c, row.sessionId); -} - -async function requireReadySessionMeta(c: any, sessionId: string): Promise { - const meta = await readSessionMeta(c, sessionId); - if (!meta) { - throw new Error(`Unknown workspace 
session: ${sessionId}`); - } - if (meta.status !== "ready" || !meta.sandboxSessionId) { - throw new Error(meta.errorMessage ?? "This workspace session is still preparing"); - } - return meta; -} - -export function requireSendableSessionMeta(meta: any, sessionId: string): any { - if (!meta) { - throw new Error(`Unknown workspace session: ${sessionId}`); - } - if (meta.status !== "ready" || !meta.sandboxSessionId) { - throw new Error(`Session is not ready (status: ${meta.status}). Wait for session provisioning to complete.`); - } - return meta; -} - -function shellFragment(parts: string[]): string { - return parts.join(" && "); -} - -function stableSandboxId(c: any): string { - return c.state.taskId; -} - -async function getTaskSandboxRuntime( - c: any, - record: any, -): Promise<{ - sandbox: any; - sandboxId: string; - sandboxProviderId: string; - switchTarget: string; - cwd: string; -}> { - const { config } = getActorRuntimeContext(); - const sandboxId = stableSandboxId(c); - const sandboxProviderId = resolveSandboxProviderId(config, record.sandboxProviderId ?? null); - const sandbox = await getOrCreateTaskSandbox(c, c.state.organizationId, sandboxId, {}); - const actorId = typeof sandbox.resolve === "function" ? await sandbox.resolve().catch(() => null) : null; - const switchTarget = sandboxProviderId === "local" ? `sandbox://local/${sandboxId}` : `sandbox://e2b/${sandboxId}`; - - // Resolve the actual repo CWD from the sandbox's $HOME (differs by provider). - const repoCwdResult = await sandbox.repoCwd(); - const cwd = repoCwdResult?.cwd ?? "$HOME/repo"; - const now = Date.now(); - - await c.db - .insert(taskSandboxes) - .values({ - sandboxId, - sandboxProviderId, - sandboxActorId: typeof actorId === "string" ? actorId : null, - switchTarget, - cwd, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: taskSandboxes.sandboxId, - set: { - sandboxProviderId, - sandboxActorId: typeof actorId === "string" ? 
actorId : null, - switchTarget, - cwd, - updatedAt: now, - }, - }) - .run(); - - await c.db - .update(taskRuntime) - .set({ - activeSandboxId: sandboxId, - activeSwitchTarget: switchTarget, - activeCwd: cwd, - updatedAt: now, - }) - .where(eq(taskRuntime.id, 1)) - .run(); - - return { - sandbox, - sandboxId, - sandboxProviderId, - switchTarget, - cwd, - }; -} - -/** - * Track whether the sandbox repo has been fully prepared (cloned + fetched + checked out) - * for the current actor lifecycle. Subsequent calls can skip the expensive `git fetch` - * when `skipFetch` is true (used by sendWorkspaceMessage to avoid blocking on every prompt). - */ -let sandboxRepoPrepared = false; - -async function ensureSandboxRepo(c: any, sandbox: any, record: any, opts?: { skipFetchIfPrepared?: boolean; authSessionId?: string | null }): Promise<void> { - if (!record.branchName) { - throw new Error("cannot prepare a sandbox repo before the task branch exists"); - } - - // If the repo was already prepared and the caller allows skipping fetch, just return. - // The clone, fetch, and checkout already happened on a prior call. - if (opts?.skipFetchIfPrepared && sandboxRepoPrepared) { - logActorInfo("task.sandbox", "ensureSandboxRepo skipped (already prepared)"); - return; - } - - const repoStart = performance.now(); - - const t0 = performance.now(); - const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId); - const metadata = await getRepositoryMetadata(c); - logActorInfo("task.sandbox", "resolveAuth+metadata", { durationMs: Math.round(performance.now() - t0) }); - - const baseRef = metadata.defaultBranch ?? "main"; - // Use $HOME inside the shell script so the path resolves correctly regardless - // of which user the sandbox runs as (E2B: "user", local Docker: "sandbox"). 
- const script = [ - "set -euo pipefail", - 'REPO_DIR="$HOME/repo"', - 'mkdir -p "$HOME"', - "git config --global credential.helper '!f() { echo username=x-access-token; echo password=${GH_TOKEN:-$GITHUB_TOKEN}; }; f'", - `if [ ! -d "$REPO_DIR/.git" ]; then rm -rf "$REPO_DIR" && git clone ${JSON.stringify(metadata.remoteUrl)} "$REPO_DIR"; fi`, - 'cd "$REPO_DIR"', - "git fetch origin --prune", - `if git show-ref --verify --quiet refs/remotes/origin/${JSON.stringify(record.branchName).slice(1, -1)}; then target_ref=${JSON.stringify( - `origin/${record.branchName}`, - )}; else target_ref=${JSON.stringify(baseRef)}; fi`, - `git checkout -B ${JSON.stringify(record.branchName)} \"$target_ref\"`, - ]; - - const t1 = performance.now(); - const result = await sandbox.runProcess({ - command: "bash", - args: ["-lc", script.join("; ")], - cwd: "/", - env: auth?.githubToken - ? { - GH_TOKEN: auth.githubToken, - GITHUB_TOKEN: auth.githubToken, - } - : undefined, - timeoutMs: 5 * 60_000, - }); - logActorInfo("task.sandbox", "git clone/fetch/checkout", { - branch: record.branchName, - repo: metadata.remoteUrl, - durationMs: Math.round(performance.now() - t1), - }); - - if ((result.exitCode ?? 0) !== 0) { - throw new Error(`sandbox repo preparation failed (${result.exitCode ?? 1}): ${[result.stdout, result.stderr].filter(Boolean).join("")}`); - } - - // On first repo preparation, inject the task owner's git credentials into the sandbox - // so that push/commit operations are authenticated and attributed to the correct user. 
- if (!sandboxRepoPrepared && opts?.authSessionId) { - const t2 = performance.now(); - await maybeSwapTaskOwner(c, opts.authSessionId, sandbox); - logActorInfo("task.sandbox", "maybeSwapTaskOwner", { durationMs: Math.round(performance.now() - t2) }); - } - - sandboxRepoPrepared = true; - logActorInfo("task.sandbox", "ensureSandboxRepo complete", { totalDurationMs: Math.round(performance.now() - repoStart) }); -} - -async function executeInSandbox( - c: any, - params: { - sandboxId: string; - cwd: string; - command: string; - label: string; - }, -): Promise<{ exitCode: number; result: string }> { - const record = await ensureWorkspaceSeeded(c); - const runtime = await getTaskSandboxRuntime(c, record); - await ensureSandboxRepo(c, runtime.sandbox, record); - const response = await runtime.sandbox.runProcess({ - command: "bash", - args: ["-lc", shellFragment([`cd ${JSON.stringify(params.cwd)}`, params.command])], - cwd: "/", - timeoutMs: 5 * 60_000, - }); - - return { - exitCode: response.exitCode ?? 0, - result: [response.stdout, response.stderr].filter(Boolean).join(""), - }; -} - -function parseGitStatus(output: string): Array<{ path: string; type: "M" | "A" | "D" }> { - return output - .split("\n") - .map((line) => line.trimEnd()) - .filter(Boolean) - .map((line) => { - const status = line.slice(0, 2).trim(); - const rawPath = line.slice(3).trim(); - const path = rawPath.includes(" -> ") ? (rawPath.split(" -> ").pop() ?? rawPath) : rawPath; - const type = status.includes("D") ? "D" : status.includes("A") || status === "??" ? "A" : "M"; - return { path, type }; - }); -} - -function parseNumstat(output: string): Map { - const map = new Map(); - for (const line of output.split("\n")) { - const trimmed = line.trim(); - if (!trimmed) continue; - const [addedRaw, removedRaw, ...pathParts] = trimmed.split("\t"); - const path = pathParts.join("\t").trim(); - if (!path) continue; - map.set(path, { - added: Number.parseInt(addedRaw ?? 
"0", 10) || 0, - removed: Number.parseInt(removedRaw ?? "0", 10) || 0, - }); - } - return map; -} - -function buildFileTree(paths: string[]): Array { - const root = { - children: new Map(), - }; - - for (const path of paths) { - const parts = path.split("/").filter(Boolean); - let current = root; - let currentPath = ""; - - for (let index = 0; index < parts.length; index += 1) { - const part = parts[index]!; - currentPath = currentPath ? `${currentPath}/${part}` : part; - const isDir = index < parts.length - 1; - let node = current.children.get(part); - if (!node) { - node = { - name: part, - path: currentPath, - isDir, - children: isDir ? new Map() : undefined, - }; - current.children.set(part, node); - } else if (isDir && !(node.children instanceof Map)) { - node.children = new Map(); - } - current = node; - } - } - - function sortNodes(nodes: Iterable): Array { - return [...nodes] - .map((node) => - node.isDir - ? { - name: node.name, - path: node.path, - isDir: true, - children: sortNodes(node.children?.values?.() ?? []), - } - : { - name: node.name, - path: node.path, - isDir: false, - }, - ) - .sort((left, right) => { - if (left.isDir !== right.isDir) { - return left.isDir ? -1 : 1; - } - return left.path.localeCompare(right.path); - }); - } - - return sortNodes(root.children.values()); -} - -async function collectWorkspaceGitState(c: any, record: any) { - const activeSandboxId = record.activeSandboxId; - const activeSandbox = activeSandboxId != null ? ((record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === activeSandboxId) ?? null) : null; - const cwd = activeSandbox?.cwd ?? record.sandboxes?.[0]?.cwd ?? 
null; - if (!activeSandboxId || !cwd) { - return { - fileChanges: [], - diffs: {}, - fileTree: [], - }; - } - - const statusResult = await executeInSandbox(c, { - sandboxId: activeSandboxId, - cwd, - command: "git status --porcelain=v1 -uall", - label: "git status", - }); - if (statusResult.exitCode !== 0) { - return { - fileChanges: [], - diffs: {}, - fileTree: [], - }; - } - - const statusRows = parseGitStatus(statusResult.result); - const numstatResult = await executeInSandbox(c, { - sandboxId: activeSandboxId, - cwd, - command: "git diff --numstat", - label: "git diff numstat", - }); - const numstat = parseNumstat(numstatResult.result); - - const filesResult = await executeInSandbox(c, { - sandboxId: activeSandboxId, - cwd, - command: "git ls-files --cached --others --exclude-standard", - label: "git ls-files", - }); - const allPaths = filesResult.result - .split("\n") - .map((line) => line.trim()) - .filter(Boolean); - - const diffs: Record = {}; - for (const row of statusRows) { - const diffResult = await executeInSandbox(c, { - sandboxId: activeSandboxId, - cwd, - command: `git diff -- ${JSON.stringify(row.path)}`, - label: `git diff ${row.path}`, - }); - diffs[row.path] = diffResult.exitCode === 0 ? diffResult.result : ""; - } - - return { - fileChanges: statusRows.map((row) => { - const counts = numstat.get(row.path) ?? { added: 0, removed: 0 }; - return { - path: row.path, - added: counts.added, - removed: counts.removed, - type: row.type, - }; - }), - diffs, - fileTree: buildFileTree(allPaths), - }; -} - -async function readCachedGitState(c: any): Promise<{ fileChanges: Array; diffs: Record; fileTree: Array; updatedAt: number | null }> { - const row = await c.db - .select({ - gitStateJson: taskRuntime.gitStateJson, - gitStateUpdatedAt: taskRuntime.gitStateUpdatedAt, - }) - .from(taskRuntime) - .where(eq(taskRuntime.id, 1)) - .get(); - const parsed = parseGitState(row?.gitStateJson); - return { - ...parsed, - updatedAt: row?.gitStateUpdatedAt ?? 
null, - }; -} - -async function writeCachedGitState(c: any, gitState: { fileChanges: Array; diffs: Record; fileTree: Array }): Promise<void> { - const now = Date.now(); - await c.db - .update(taskRuntime) - .set({ - gitStateJson: JSON.stringify(gitState), - gitStateUpdatedAt: now, - updatedAt: now, - }) - .where(eq(taskRuntime.id, 1)) - .run(); -} - -async function readSessionTranscript(c: any, record: any, sessionId: string) { - const sandboxId = record.activeSandboxId ?? stableSandboxId(c); - if (!sandboxId) { - return []; - } - - const sandbox = getTaskSandbox(c, c.state.organizationId, sandboxId); - const page = await sandbox.getEvents({ - sessionId, - limit: 100, - }); - return page.items.map((event: any) => ({ - id: event.id, - eventIndex: event.eventIndex, - sessionId: event.sessionId, - createdAt: event.createdAt, - connectionId: event.connectionId, - sender: event.sender, - payload: event.payload, - })); -} - -async function writeSessionTranscript(c: any, sessionId: string, transcript: Array): Promise<void> { - await updateSessionMeta(c, sessionId, { - transcriptJson: JSON.stringify(transcript), - transcriptUpdatedAt: Date.now(), - }); -} - -function fireRefreshDerived(c: any): void { - const self = selfTask(c); - void self.refreshDerived({}).catch(() => {}); -} - -function fireRefreshSessionTranscript(c: any, sessionId: string): void { - const self = selfTask(c); - void self.refreshSessionTranscript({ sessionId }).catch(() => {}); -} - -async function enqueueWorkspaceEnsureSession(c: any, sessionId: string): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workspace.ensure_session" as any), { sessionId }, { wait: false }); -} - -function pendingWorkspaceSessionStatus(record: any): "pending_provision" | "pending_session_create" { - return record.activeSandboxId ? 
"pending_session_create" : "pending_provision"; -} - -async function maybeScheduleWorkspaceRefreshes(c: any, record: any, sessions: Array): Promise { - const gitState = await readCachedGitState(c); - if (record.activeSandboxId && !gitState.updatedAt) { - fireRefreshDerived(c); - } - - for (const session of sessions) { - if (session.closed || session.status !== "ready" || !session.sandboxSessionId || session.transcriptUpdatedAt) { - continue; - } - fireRefreshSessionTranscript(c, session.sandboxSessionId); - } -} - -function computeWorkspaceTaskStatus(record: any, sessions: Array) { - if (record.status && String(record.status).startsWith("init_")) { - return record.status; - } - if (record.status === "archived" || record.status === "killed") { - return record.status; - } - if (sessions.some((session) => session.closed !== true && session.thinkingSinceMs)) { - return "running"; - } - if (sessions.some((session) => session.closed !== true && session.status === "error")) { - return "error"; - } - return "idle"; -} - -export async function ensureWorkspaceSeeded(c: any): Promise { - return await getCurrentRecord(c); -} - -function buildSessionSummary(meta: any, userState?: any): any { - const derivedSandboxSessionId = meta.status === "ready" ? (meta.sandboxSessionId ?? null) : null; - const sessionStatus = - meta.status === "pending_provision" || meta.status === "pending_session_create" - ? meta.status - : meta.thinkingSinceMs - ? "running" - : meta.status === "error" - ? "error" - : meta.status === "ready" && derivedSandboxSessionId - ? "idle" - : "ready"; - let thinkingSinceMs = meta.thinkingSinceMs ?? 
null; - let unread = Boolean(userState?.unread); - if (thinkingSinceMs && sessionStatus !== "running") { - thinkingSinceMs = null; - unread = true; - } - - return { - id: meta.id, - sessionId: meta.sessionId, - sandboxSessionId: derivedSandboxSessionId, - sessionName: meta.sessionName, - agent: agentKindForModel(meta.model), - model: meta.model, - status: sessionStatus, - thinkingSinceMs: sessionStatus === "running" ? thinkingSinceMs : null, - unread, - created: Boolean(meta.created || derivedSandboxSessionId), - errorMessage: meta.errorMessage ?? null, - }; -} - -function buildSessionDetailFromMeta(meta: any, userState?: any): any { - const summary = buildSessionSummary(meta, userState); - return { - sessionId: meta.sessionId, - sandboxSessionId: summary.sandboxSessionId ?? null, - sessionName: summary.sessionName, - agent: summary.agent, - model: summary.model, - status: summary.status, - thinkingSinceMs: summary.thinkingSinceMs, - unread: summary.unread, - created: summary.created, - errorMessage: summary.errorMessage, - draft: { - text: userState?.draftText ?? "", - attachments: Array.isArray(userState?.draftAttachments) ? userState.draftAttachments : [], - updatedAtMs: userState?.draftUpdatedAtMs ?? null, - }, - transcript: meta.transcript ?? [], - }; -} - -/** - * Builds a WorkspaceTaskSummary from local task actor state. Task actors push - * this to the parent organization actor so organization sidebar reads stay local. 
- */
-export async function buildTaskSummary(c: any, authSessionId?: string | null): Promise<WorkspaceTaskSummary> {
-	const record = await ensureWorkspaceSeeded(c);
-	const repositoryMetadata = await getRepositoryMetadata(c);
-	const sessions = await listSessionMetaRows(c);
-	await maybeScheduleWorkspaceRefreshes(c, record, sessions);
-	const userTaskState = await getUserTaskState(c, authSessionId);
-	const taskStatus = computeWorkspaceTaskStatus(record, sessions);
-	const activeSessionId =
-		userTaskState.activeSessionId && sessions.some((meta) => meta.sessionId === userTaskState.activeSessionId) ? userTaskState.activeSessionId : null;
-
-	const owner = await readTaskOwner(c);
-
-	return {
-		id: c.state.taskId,
-		repoId: c.state.repoId,
-		title: record.title ?? "New Task",
-		status: taskStatus,
-		repoName: repoLabelFromRemote(repositoryMetadata.remoteUrl),
-		updatedAtMs: record.updatedAt,
-		branch: record.branchName,
-		pullRequest: record.pullRequest ?? null,
-		activeSessionId,
-		sessionsSummary: sessions.map((meta) => buildSessionSummary(meta, userTaskState.bySessionId.get(meta.sessionId))),
-		primaryUserLogin: owner?.primaryGithubLogin ?? null,
-		primaryUserAvatarUrl: owner?.primaryGithubAvatarUrl ?? null,
-	};
-}
-
-/**
- * Builds a WorkspaceTaskDetail from local task actor state for direct task
- * subscribers. This is a full replacement payload, not a patch.
- */
-export async function buildTaskDetail(c: any, authSessionId?: string | null): Promise<WorkspaceTaskDetail> {
-	const record = await ensureWorkspaceSeeded(c);
-	const gitState = await readCachedGitState(c);
-	const sessions = await listSessionMetaRows(c);
-	await maybeScheduleWorkspaceRefreshes(c, record, sessions);
-	const summary = await buildTaskSummary(c, authSessionId);
-
-	return {
-		...summary,
-		task: record.task,
-		fileChanges: gitState.fileChanges,
-		diffs: gitState.diffs,
-		fileTree: gitState.fileTree,
-		minutesUsed: 0,
-		sandboxes: await Promise.all(
-			(record.sandboxes ?? []).map(async (sandbox: any) => {
-				let url: string | null = null;
-				if (sandbox.sandboxId) {
-					try {
-						const handle = getTaskSandbox(c, c.state.organizationId, sandbox.sandboxId);
-						const conn = await handle.sandboxAgentConnection();
-						if (conn?.endpoint && !conn.endpoint.startsWith("mock://")) {
-							url = conn.endpoint;
-						}
-					} catch {
-						// Sandbox may not be running
-					}
-				}
-				return {
-					sandboxProviderId: sandbox.sandboxProviderId,
-					sandboxId: sandbox.sandboxId,
-					cwd: sandbox.cwd ?? null,
-					url,
-				};
-			}),
-		),
-		activeSandboxId: record.activeSandboxId ?? null,
-	};
-}
-
-/**
- * Builds a WorkspaceSessionDetail for a specific session.
- */
-export async function buildSessionDetail(c: any, sessionId: string, authSessionId?: string | null): Promise<WorkspaceSessionDetail> {
-	const record = await ensureWorkspaceSeeded(c);
-	const meta = await readSessionMeta(c, sessionId);
-	if (!meta || meta.closed) {
-		throw new Error(`Unknown workspace session: ${sessionId}`);
-	}
-	const userTaskState = await getUserTaskState(c, authSessionId);
-	const userSessionState = userTaskState.bySessionId.get(sessionId);
-
-	// Skip live transcript fetch if the sandbox session doesn't exist yet or
-	// the session is still provisioning — the sandbox API will block/timeout.
-	const isPending = meta.status === "pending_provision" || meta.status === "pending_session_create";
-	if (!meta.sandboxSessionId || isPending) {
-		return buildSessionDetailFromMeta(meta, userSessionState);
-	}
-
-	try {
-		const transcript = await readSessionTranscript(c, record, meta.sandboxSessionId);
-		if (JSON.stringify(meta.transcript ?? []) !== JSON.stringify(transcript)) {
-			await writeSessionTranscript(c, meta.sessionId, transcript);
-			return buildSessionDetailFromMeta(
-				{
-					...meta,
-					transcript,
-					transcriptUpdatedAt: Date.now(),
-				},
-				userSessionState,
-			);
-		}
-	} catch (error) {
-		// Session detail reads degrade to cached transcript when sandbox is unavailable.
-		logActorWarning("task", "readSessionTranscript failed, using cached transcript", {
-			taskId: c.state.taskId,
-			sessionId,
-			error: resolveErrorMessage(error),
-		});
-	}
-
-	return buildSessionDetailFromMeta(meta, userSessionState);
-}
-
-export async function getTaskSummary(c: any): Promise<WorkspaceTaskSummary> {
-	return await buildTaskSummary(c);
-}
-
-export async function getTaskDetail(c: any, authSessionId?: string): Promise<WorkspaceTaskDetail> {
-	return await buildTaskDetail(c, authSessionId);
-}
-
-export async function getSessionDetail(c: any, sessionId: string, authSessionId?: string): Promise<WorkspaceSessionDetail> {
-	return await buildSessionDetail(c, sessionId, authSessionId);
-}
-
-/**
- * Replaces the old notifyWorkspaceUpdated pattern.
- *
- * The task actor emits two kinds of updates:
- * - Push summary state up to the parent organization actor so the sidebar
- *   materialized projection stays current.
- * - Broadcast full detail/session payloads down to direct task subscribers.
- */
-export async function broadcastTaskUpdate(c: any, options?: { sessionId?: string }): Promise<void> {
-	const organization = await getOrCreateOrganization(c, c.state.organizationId);
-	await organization.send(
-		organizationWorkflowQueueName("organization.command.applyTaskSummaryUpdate"),
-		{ taskSummary: await buildTaskSummary(c) },
-		{ wait: false },
-	);
-	c.broadcast("taskUpdated", {
-		type: "taskUpdated",
-		detail: await buildTaskDetail(c),
-	});
-
-	if (options?.sessionId) {
-		c.broadcast("sessionUpdated", {
-			type: "sessionUpdated",
-			session: await buildSessionDetail(c, options.sessionId),
-		});
-	}
-}
-
-export async function refreshWorkspaceDerivedState(c: any): Promise<void> {
-	const record = await ensureWorkspaceSeeded(c);
-	const gitState = await collectWorkspaceGitState(c, record);
-	await writeCachedGitState(c, gitState);
-	await broadcastTaskUpdate(c);
-}
-
-export async function refreshWorkspaceSessionTranscript(c: any, sessionId: string): Promise<void> {
-	const record = await ensureWorkspaceSeeded(c);
-	const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? (await readSessionMeta(c, sessionId));
-	if (!meta?.sandboxSessionId) {
-		return;
-	}
-
-	const transcript = await readSessionTranscript(c, record, meta.sandboxSessionId);
-	await writeSessionTranscript(c, meta.sessionId, transcript);
-	await broadcastTaskUpdate(c, { sessionId: meta.sessionId });
-}
-
-export async function renameWorkspaceTask(c: any, value: string): Promise<void> {
-	const nextTitle = value.trim();
-	if (!nextTitle) {
-		throw new Error("task title is required");
-	}
-
-	await c.db
-		.update(taskTable)
-		.set({
-			title: nextTitle,
-			updatedAt: Date.now(),
-		})
-		.where(eq(taskTable.id, 1))
-		.run();
-	await broadcastTaskUpdate(c);
-}
-
-export async function syncTaskPullRequest(c: any, pullRequest: any): Promise<void> {
-	const now = pullRequest?.updatedAtMs ?? Date.now();
-	await c.db
-		.update(taskTable)
-		.set({
-			pullRequestJson: pullRequest ? JSON.stringify(pullRequest) : null,
-			updatedAt: now,
-		})
-		.where(eq(taskTable.id, 1))
-		.run();
-	await broadcastTaskUpdate(c);
-}
-
-export async function createWorkspaceSession(c: any, model?: string, authSessionId?: string): Promise<{ sessionId: string }> {
-	const sessionId = `session-${randomUUID()}`;
-	const record = await ensureWorkspaceSeeded(c);
-	await ensureSessionMeta(c, {
-		sessionId,
-		model: model ?? (await resolveDefaultModel(c, authSessionId)),
-		authSessionId,
-		sandboxSessionId: null,
-		status: pendingWorkspaceSessionStatus(record),
-		created: false,
-	});
-	await upsertUserTaskState(c, authSessionId, sessionId, {
-		activeSessionId: sessionId,
-		unread: false,
-	});
-	await broadcastTaskUpdate(c, { sessionId: sessionId });
-	await enqueueWorkspaceEnsureSession(c, sessionId);
-	return { sessionId };
-}
-
-export async function ensureWorkspaceSession(c: any, sessionId: string, model?: string, authSessionId?: string): Promise<void> {
-	const ensureStart = performance.now();
-	const meta = await readSessionMeta(c, sessionId);
-	if (!meta || meta.closed) {
-		return;
-	}
-
-	const record = await ensureWorkspaceSeeded(c);
-	if (meta.sandboxSessionId && meta.status === "ready") {
-		fireRefreshSessionTranscript(c, meta.sandboxSessionId);
-		await broadcastTaskUpdate(c, { sessionId: sessionId });
-		return;
-	}
-
-	await updateSessionMeta(c, sessionId, {
-		sandboxSessionId: meta.sandboxSessionId ?? sessionId,
-		status: "pending_session_create",
-		errorMessage: null,
-	});
-
-	try {
-		const t0 = performance.now();
-		const runtime = await getTaskSandboxRuntime(c, record);
-		logActorInfo("task.session", "getTaskSandboxRuntime", { sessionId, durationMs: Math.round(performance.now() - t0) });
-
-		const t1 = performance.now();
-		await ensureSandboxRepo(c, runtime.sandbox, record);
-		logActorInfo("task.session", "ensureSandboxRepo", { sessionId, durationMs: Math.round(performance.now() - t1) });
-
-		const resolvedModel = model ?? meta.model ?? (await resolveDefaultModel(c, authSessionId));
-		const resolvedAgent = await resolveSandboxAgentForModel(c, resolvedModel);
-
-		const t2 = performance.now();
-		await runtime.sandbox.createSession({
-			id: meta.sandboxSessionId ?? sessionId,
-			agent: resolvedAgent,
-			model: resolvedModel,
-			sessionInit: {
-				cwd: runtime.cwd,
-			},
-		});
-		logActorInfo("task.session", "createSession", { sessionId, agent: resolvedAgent, model: resolvedModel, durationMs: Math.round(performance.now() - t2) });
-
-		await updateSessionMeta(c, sessionId, {
-			sandboxSessionId: meta.sandboxSessionId ?? sessionId,
-			status: "ready",
-			errorMessage: null,
-		});
-		logActorInfo("task.session", "ensureWorkspaceSession complete", { sessionId, totalDurationMs: Math.round(performance.now() - ensureStart) });
-		fireRefreshSessionTranscript(c, meta.sandboxSessionId ?? sessionId);
-	} catch (error) {
-		await updateSessionMeta(c, sessionId, {
-			status: "error",
-			errorMessage: error instanceof Error ? error.message : String(error),
-		});
-	}
-
-	await broadcastTaskUpdate(c, { sessionId: sessionId });
-}
-
-export async function enqueuePendingWorkspaceSessions(c: any): Promise<void> {
-	const pending = (await listSessionMetaRows(c, { includeClosed: true })).filter(
-		(row) => row.closed !== true && row.status !== "ready" && row.status !== "error",
-	);
-
-	const self = selfTask(c);
-	for (const row of pending) {
-		await self.send(taskWorkflowQueueName("task.command.workspace.ensure_session" as any), { sessionId: row.sessionId, model: row.model }, { wait: false });
-	}
-}
-
-export async function renameWorkspaceSession(c: any, sessionId: string, title: string): Promise<void> {
-	const trimmed = title.trim();
-	if (!trimmed) {
-		throw new Error("session title is required");
-	}
-	await updateSessionMeta(c, sessionId, {
-		sessionName: trimmed,
-	});
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function selectWorkspaceSession(c: any, sessionId: string, authSessionId?: string): Promise<void> {
-	const meta = await readSessionMeta(c, sessionId);
-	if (!meta || meta.closed) {
-		return;
-	}
-	await upsertUserTaskState(c, authSessionId, sessionId, {
-		activeSessionId: sessionId,
-	});
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function setWorkspaceSessionUnread(c: any, sessionId: string, unread: boolean, authSessionId?: string): Promise<void> {
-	await upsertUserTaskState(c, authSessionId, sessionId, {
-		unread,
-	});
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function updateWorkspaceDraft(c: any, sessionId: string, text: string, attachments: Array<any>, authSessionId?: string): Promise<void> {
-	await upsertUserTaskState(c, authSessionId, sessionId, {
-		draftText: text,
-		draftAttachmentsJson: JSON.stringify(attachments),
-		draftUpdatedAt: Date.now(),
-	});
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function changeWorkspaceModel(c: any, sessionId: string, model: string, _authSessionId?: string): Promise<void> {
-	const meta = await readSessionMeta(c, sessionId);
-	if (!meta || meta.closed) {
-		return;
-	}
-
-	if (meta.model === model) {
-		return;
-	}
-
-	const record = await ensureWorkspaceSeeded(c);
-	let nextMeta = await updateSessionMeta(c, sessionId, {
-		model,
-	});
-	let shouldEnsure = nextMeta.status === "pending_provision" || nextMeta.status === "pending_session_create" || nextMeta.status === "error";
-
-	if (shouldRecreateSessionForModelChange(nextMeta)) {
-		const sandbox = getTaskSandbox(c, c.state.organizationId, stableSandboxId(c));
-		await sandbox.destroySession(nextMeta.sandboxSessionId);
-		nextMeta = await updateSessionMeta(c, sessionId, {
-			sandboxSessionId: null,
-			status: pendingWorkspaceSessionStatus(record),
-			errorMessage: null,
-			transcriptJson: "[]",
-			transcriptUpdatedAt: null,
-			thinkingSinceMs: null,
-		});
-		shouldEnsure = true;
-	} else if (nextMeta.status === "ready" && nextMeta.sandboxSessionId) {
-		const sandbox = getTaskSandbox(c, c.state.organizationId, stableSandboxId(c));
-		if (typeof sandbox.rawSendSessionMethod === "function") {
-			try {
-				await sandbox.rawSendSessionMethod(nextMeta.sandboxSessionId, "session/set_config_option", {
-					configId: "model",
-					value: model,
-				});
-			} catch {
-				// Some agents do not allow live model updates. Preserve the new preference in metadata.
-			}
-		}
-	} else if (nextMeta.status !== "ready") {
-		nextMeta = await updateSessionMeta(c, sessionId, {
-			status: pendingWorkspaceSessionStatus(record),
-			errorMessage: null,
-		});
-	}
-
-	if (shouldEnsure) {
-		await enqueueWorkspaceEnsureSession(c, sessionId);
-	}
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function sendWorkspaceMessage(c: any, sessionId: string, text: string, attachments: Array<any>, authSessionId?: string): Promise<void> {
-	const sendStart = performance.now();
-	const meta = requireSendableSessionMeta(await readSessionMeta(c, sessionId), sessionId);
-	const record = await ensureWorkspaceSeeded(c);
-
-	const t0 = performance.now();
-	const runtime = await getTaskSandboxRuntime(c, record);
-	logActorInfo("task.message", "getTaskSandboxRuntime", { sessionId, durationMs: Math.round(performance.now() - t0) });
-
-	const t1 = performance.now();
-	// Skip git fetch on subsequent messages — the repo was already prepared during session
-	// creation. This avoids a 5-30s network round-trip to GitHub on every prompt.
-	await ensureSandboxRepo(c, runtime.sandbox, record, { skipFetchIfPrepared: true, authSessionId });
-	logActorInfo("task.message", "ensureSandboxRepo", { sessionId, durationMs: Math.round(performance.now() - t1) });
-
-	// Check if the task owner needs to swap. If a different user is sending this message,
-	// update the owner record and inject their git credentials into the sandbox.
-	const ownerSwapped = await maybeSwapTaskOwner(c, authSessionId, runtime.sandbox);
-	if (ownerSwapped) {
-		await broadcastTaskUpdate(c);
-	}
-	const prompt = [text.trim(), ...attachments.map((attachment: any) => `@ ${attachment.filePath}:${attachment.lineNumber}\n${attachment.lineContent}`)].filter(
-		Boolean,
-	);
-	if (prompt.length === 0) {
-		throw new Error("message text is required");
-	}
-
-	await updateSessionMeta(c, sessionId, {
-		created: 1,
-		thinkingSinceMs: Date.now(),
-	});
-	await upsertUserTaskState(c, authSessionId, sessionId, {
-		unread: false,
-		draftText: "",
-		draftAttachmentsJson: "[]",
-		draftUpdatedAt: Date.now(),
-		activeSessionId: sessionId,
-	});
-
-	await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "running", Date.now());
-
-	try {
-		const t2 = performance.now();
-		await runtime.sandbox.sendPrompt({
-			sessionId: meta.sandboxSessionId,
-			prompt: prompt.join("\n\n"),
-		});
-		logActorInfo("task.message", "sendPrompt", { sessionId, durationMs: Math.round(performance.now() - t2) });
-		await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "idle", Date.now());
-	} catch (error) {
-		await updateSessionMeta(c, sessionId, {
-			status: "error",
-			errorMessage: error instanceof Error ? error.message : String(error),
-		});
-		await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "error", Date.now());
-		throw error;
-	}
-	logActorInfo("task.message", "sendWorkspaceMessage complete", { sessionId, totalDurationMs: Math.round(performance.now() - sendStart) });
-}
-
-export async function stopWorkspaceSession(c: any, sessionId: string): Promise<void> {
-	const meta = await requireReadySessionMeta(c, sessionId);
-	const sandbox = getTaskSandbox(c, c.state.organizationId, stableSandboxId(c));
-	await sandbox.destroySession(meta.sandboxSessionId);
-	await updateSessionMeta(c, sessionId, {
-		thinkingSinceMs: null,
-	});
-	await broadcastTaskUpdate(c, { sessionId });
-}
-
-export async function syncWorkspaceSessionStatus(c: any, sessionId: string, status: "running" | "idle" | "error", at: number): Promise<void> {
-	const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? (await ensureSessionMeta(c, { sessionId: sessionId, sandboxSessionId: sessionId }));
-	let changed = false;
-
-	if (status === "running") {
-		if (!meta.thinkingSinceMs) {
-			await updateSessionMeta(c, sessionId, {
-				thinkingSinceMs: at,
-			});
-			changed = true;
-		}
-	} else {
-		if (meta.thinkingSinceMs) {
-			await updateSessionMeta(c, sessionId, {
-				thinkingSinceMs: null,
-			});
-			changed = true;
-		}
-	}
-
-	if (changed) {
-		const sessions = await listSessionMetaRows(c, { includeClosed: true });
-		const nextStatus = computeWorkspaceTaskStatus(await ensureWorkspaceSeeded(c), sessions);
-		await c.db
-			.update(taskTable)
-			.set({
-				status: nextStatus,
-				updatedAt: at,
-			})
-			.where(eq(taskTable.id, 1))
-			.run();
-		fireRefreshSessionTranscript(c, sessionId);
-		if (status !== "running") {
-			fireRefreshDerived(c);
-		}
-		await broadcastTaskUpdate(c, { sessionId: meta.sessionId });
-	}
-}
-
-export async function closeWorkspaceSession(c: any, sessionId: string, authSessionId?: string): Promise<void> {
-	const sessions = await listSessionMetaRows(c);
-	if (sessions.filter((candidate) => candidate.closed !== true).length <= 1) {
-		return;
-	}
-
-	const meta = await readSessionMeta(c, sessionId);
-	if (!meta) {
-		return;
-	}
-	if (meta.sandboxSessionId) {
-		const sandbox = getTaskSandbox(c, c.state.organizationId, stableSandboxId(c));
-		await sandbox.destroySession(meta.sandboxSessionId);
-	}
-	await updateSessionMeta(c, sessionId, {
-		closed: 1,
-		thinkingSinceMs: null,
-	});
-	const remainingSessions = sessions.filter((candidate) => candidate.sessionId !== sessionId && candidate.closed !== true);
-	const userTaskState = await getUserTaskState(c, authSessionId);
-	if (userTaskState.activeSessionId === sessionId && remainingSessions[0]) {
-		await upsertUserTaskState(c, authSessionId, remainingSessions[0].sessionId, {
-			activeSessionId: remainingSessions[0].sessionId,
-		});
-	}
-	await deleteUserTaskState(c, authSessionId, sessionId);
-	await broadcastTaskUpdate(c);
-}
-
-export async function markWorkspaceUnread(c: any, authSessionId?: string): Promise<void> {
-	const sessions = await listSessionMetaRows(c);
-	const latest = sessions[sessions.length - 1];
-	if (!latest) {
-		return;
-	}
-	await upsertUserTaskState(c, authSessionId, latest.sessionId, {
-		unread: true,
-	});
-	await broadcastTaskUpdate(c, { sessionId: latest.sessionId });
-}
-
-export async function publishWorkspacePr(c: any): Promise<void> {
-	const record = await ensureWorkspaceSeeded(c);
-	if (!record.branchName) {
-		throw new Error("cannot publish PR without a branch");
-	}
-	const metadata = await getRepositoryMetadata(c);
-	const repoFullName = metadata.fullName ?? githubRepoFullNameFromRemote(metadata.remoteUrl);
-	if (!repoFullName) {
-		throw new Error(`Unable to resolve GitHub repository for ${metadata.remoteUrl}`);
-	}
-	const { driver } = getActorRuntimeContext();
-	const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId);
-	const created = await driver.github.createPr(repoFullName, record.branchName, record.title ?? record.task, undefined, {
-		githubToken: auth?.githubToken ?? null,
-		baseBranch: metadata.defaultBranch ?? undefined,
-	});
-	await syncTaskPullRequest(c, {
-		number: created.number,
-		status: "ready",
-		title: record.title ?? record.task,
-		body: null,
-		state: "open",
-		url: created.url,
-		headRefName: record.branchName,
-		baseRefName: metadata.defaultBranch ?? "main",
-		authorLogin: null,
-		isDraft: false,
-		merged: false,
-		updatedAtMs: Date.now(),
-	});
-}
-
-export async function revertWorkspaceFile(c: any, path: string): Promise<void> {
-	const record = await ensureWorkspaceSeeded(c);
-	if (!record.activeSandboxId) {
-		throw new Error("cannot revert file without an active sandbox");
-	}
-	const activeSandbox = (record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === record.activeSandboxId) ?? null;
-	if (!activeSandbox?.cwd) {
-		throw new Error("cannot revert file without a sandbox cwd");
-	}
-
-	const result = await executeInSandbox(c, {
-		sandboxId: record.activeSandboxId,
-		cwd: activeSandbox.cwd,
-		command: `if git ls-files --error-unmatch -- ${JSON.stringify(path)} >/dev/null 2>&1; then git restore --staged --worktree -- ${JSON.stringify(path)} || git checkout -- ${JSON.stringify(path)}; else rm -f ${JSON.stringify(path)}; fi`,
-		label: `git restore ${path}`,
-	});
-	if (result.exitCode !== 0) {
-		throw new Error(`file revert failed (${result.exitCode}): ${result.result}`);
-	}
-	fireRefreshDerived(c);
-	await broadcastTaskUpdate(c);
-}
diff --git a/foundry/packages/backend/src/actors/user/actions/better-auth.ts b/foundry/packages/backend/src/actors/user/actions/better-auth.ts
deleted file mode 100644
index 3ef8656..0000000
--- a/foundry/packages/backend/src/actors/user/actions/better-auth.ts
+++ /dev/null
@@ -1,105 +0,0 @@
-import { asc, count as sqlCount, desc } from "drizzle-orm";
-import { applyJoinToRow, applyJoinToRows, buildWhere, columnFor, materializeRow, persistInput, persistPatch, tableFor } from "../query-helpers.js";
-
-// Exception to the CLAUDE.md queue-for-mutations rule: Better Auth adapter operations
-// use direct actions even for mutations. Better Auth runs during OAuth callbacks on the
-// HTTP request path, not through the normal organization lifecycle. Routing through the
-// queue adds multiple sequential round-trips (each with actor wake-up + step overhead)
-// that cause 30-second OAuth callbacks and proxy retry storms. These mutations are simple
-// SQLite upserts/deletes with no cross-actor coordination or broadcast side effects.
-export const betterAuthActions = {
-	// --- Mutation actions ---
-	async betterAuthCreateRecord(c, input: { model: string; data: Record<string, unknown> }) {
-		const table = tableFor(input.model);
-		const persisted = persistInput(input.model, input.data);
-		await c.db
-			.insert(table)
-			.values(persisted as any)
-			.run();
-		const row = await c.db
-			.select()
-			.from(table)
-			.where(buildWhere(table, [{ field: "id", value: input.data.id }])!)
-			.get();
-		return materializeRow(input.model, row);
-	},
-
-	async betterAuthUpdateRecord(c, input: { model: string; where: any[]; update: Record<string, unknown> }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		if (!predicate) throw new Error("betterAuthUpdateRecord requires a where clause");
-		await c.db
-			.update(table)
-			.set(persistPatch(input.model, input.update) as any)
-			.where(predicate)
-			.run();
-		return materializeRow(input.model, await c.db.select().from(table).where(predicate).get());
-	},
-
-	async betterAuthUpdateManyRecords(c, input: { model: string; where: any[]; update: Record<string, unknown> }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		if (!predicate) throw new Error("betterAuthUpdateManyRecords requires a where clause");
-		await c.db
-			.update(table)
-			.set(persistPatch(input.model, input.update) as any)
-			.where(predicate)
-			.run();
-		const row = await c.db.select({ value: sqlCount() }).from(table).where(predicate).get();
-		return row?.value ?? 0;
-	},
-
-	async betterAuthDeleteRecord(c, input: { model: string; where: any[] }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		if (!predicate) throw new Error("betterAuthDeleteRecord requires a where clause");
-		await c.db.delete(table).where(predicate).run();
-	},
-
-	async betterAuthDeleteManyRecords(c, input: { model: string; where: any[] }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		if (!predicate) throw new Error("betterAuthDeleteManyRecords requires a where clause");
-		const rows = await c.db.select().from(table).where(predicate).all();
-		await c.db.delete(table).where(predicate).run();
-		return rows.length;
-	},
-
-	// --- Read actions ---
-	async betterAuthFindOneRecord(c, input: { model: string; where: any[]; join?: any }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		const row = predicate ? await c.db.select().from(table).where(predicate).get() : await c.db.select().from(table).get();
-		return await applyJoinToRow(c, input.model, row ?? null, input.join);
-	},
-
-	async betterAuthFindManyRecords(c, input: { model: string; where?: any[]; limit?: number; offset?: number; sortBy?: any; join?: any }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		let query: any = c.db.select().from(table);
-		if (predicate) {
-			query = query.where(predicate);
-		}
-		if (input.sortBy?.field) {
-			const column = columnFor(input.model, table, input.sortBy.field);
-			query = query.orderBy(input.sortBy.direction === "asc" ? asc(column) : desc(column));
-		}
-		if (typeof input.limit === "number") {
-			query = query.limit(input.limit);
-		}
-		if (typeof input.offset === "number") {
-			query = query.offset(input.offset);
-		}
-		const rows = await query.all();
-		return await applyJoinToRows(c, input.model, rows, input.join);
-	},
-
-	async betterAuthCountRecords(c, input: { model: string; where?: any[] }) {
-		const table = tableFor(input.model);
-		const predicate = buildWhere(table, input.where);
-		const row = predicate
-			? await c.db.select({ value: sqlCount() }).from(table).where(predicate).get()
-			: await c.db.select({ value: sqlCount() }).from(table).get();
-		return row?.value ?? 0;
-	},
-};
diff --git a/foundry/packages/backend/src/actors/user/actions/user.ts b/foundry/packages/backend/src/actors/user/actions/user.ts
deleted file mode 100644
index f251c95..0000000
--- a/foundry/packages/backend/src/actors/user/actions/user.ts
+++ /dev/null
@@ -1,188 +0,0 @@
-import { eq, and } from "drizzle-orm";
-import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared";
-import { authAccounts, authSessions, authUsers, sessionState, userProfiles, userTaskState } from "../db/schema.js";
-import { materializeRow } from "../query-helpers.js";
-
-export const userActions = {
-	// Custom Foundry action — not part of Better Auth.
-	async getAppAuthState(c, input: { sessionId: string }) {
-		const session = await c.db.select().from(authSessions).where(eq(authSessions.id, input.sessionId)).get();
-		if (!session) {
-			return null;
-		}
-		const [user, profile, currentSessionState, accounts] = await Promise.all([
-			c.db.select().from(authUsers).where(eq(authUsers.authUserId, session.userId)).get(),
-			c.db.select().from(userProfiles).where(eq(userProfiles.userId, session.userId)).get(),
-			c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(),
-			c.db.select().from(authAccounts).where(eq(authAccounts.userId, session.userId)).all(),
-		]);
-		return {
-			session,
-			user: materializeRow("user", user),
-			profile: profile ?? null,
-			sessionState: currentSessionState ?? null,
-			accounts,
-		};
-	},
-
-	// Custom Foundry action — not part of Better Auth.
-	async getTaskState(c, input: { taskId: string }) {
-		const rows = await c.db.select().from(userTaskState).where(eq(userTaskState.taskId, input.taskId)).all();
-		const activeSessionId = rows.find((row) => typeof row.activeSessionId === "string" && row.activeSessionId.length > 0)?.activeSessionId ?? null;
-		return {
-			taskId: input.taskId,
-			activeSessionId,
-			sessions: rows.map((row) => ({
-				sessionId: row.sessionId,
-				unread: row.unread === 1,
-				draftText: row.draftText,
-				draftAttachmentsJson: row.draftAttachmentsJson,
-				draftUpdatedAt: row.draftUpdatedAt ?? null,
-				updatedAt: row.updatedAt,
-			})),
-		};
-	},
-
-	// --- Mutation actions (migrated from queue) ---
-
-	async upsertProfile(
-		c,
-		input: {
-			userId: string;
-			patch: {
-				githubAccountId?: string | null;
-				githubLogin?: string | null;
-				roleLabel?: string;
-				defaultModel?: string;
-				eligibleOrganizationIdsJson?: string;
-				starterRepoStatus?: string;
-				starterRepoStarredAt?: number | null;
-				starterRepoSkippedAt?: number | null;
-			};
-		},
-	) {
-		const now = Date.now();
-		await c.db
-			.insert(userProfiles)
-			.values({
-				id: 1,
-				userId: input.userId,
-				githubAccountId: input.patch.githubAccountId ?? null,
-				githubLogin: input.patch.githubLogin ?? null,
-				roleLabel: input.patch.roleLabel ?? "GitHub user",
-				defaultModel: input.patch.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID,
-				eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson ?? "[]",
-				starterRepoStatus: input.patch.starterRepoStatus ?? "pending",
-				starterRepoStarredAt: input.patch.starterRepoStarredAt ?? null,
-				starterRepoSkippedAt: input.patch.starterRepoSkippedAt ?? null,
-				createdAt: now,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: userProfiles.userId,
-				set: {
-					...(input.patch.githubAccountId !== undefined ? { githubAccountId: input.patch.githubAccountId } : {}),
-					...(input.patch.githubLogin !== undefined ? { githubLogin: input.patch.githubLogin } : {}),
-					...(input.patch.roleLabel !== undefined ? { roleLabel: input.patch.roleLabel } : {}),
-					...(input.patch.defaultModel !== undefined ? { defaultModel: input.patch.defaultModel } : {}),
-					...(input.patch.eligibleOrganizationIdsJson !== undefined ? { eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson } : {}),
-					...(input.patch.starterRepoStatus !== undefined ? { starterRepoStatus: input.patch.starterRepoStatus } : {}),
-					...(input.patch.starterRepoStarredAt !== undefined ? { starterRepoStarredAt: input.patch.starterRepoStarredAt } : {}),
-					...(input.patch.starterRepoSkippedAt !== undefined ? { starterRepoSkippedAt: input.patch.starterRepoSkippedAt } : {}),
-					updatedAt: now,
-				},
-			})
-			.run();
-		return await c.db.select().from(userProfiles).where(eq(userProfiles.userId, input.userId)).get();
-	},
-
-	async upsertSessionState(c, input: { sessionId: string; activeOrganizationId: string | null }) {
-		const now = Date.now();
-		await c.db
-			.insert(sessionState)
-			.values({
-				sessionId: input.sessionId,
-				activeOrganizationId: input.activeOrganizationId,
-				createdAt: now,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: sessionState.sessionId,
-				set: { activeOrganizationId: input.activeOrganizationId, updatedAt: now },
-			})
-			.run();
-		return await c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get();
-	},
-
-	async upsertTaskState(
-		c,
-		input: {
-			taskId: string;
-			sessionId: string;
-			patch: {
-				activeSessionId?: string | null;
-				unread?: boolean;
-				draftText?: string;
-				draftAttachmentsJson?: string;
-				draftUpdatedAt?: number | null;
-			};
-		},
-	) {
-		const now = Date.now();
-		const existing = await c.db
-			.select()
-			.from(userTaskState)
-			.where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId)))
-			.get();
-
-		if (input.patch.activeSessionId !== undefined) {
-			await c.db
-				.update(userTaskState)
-				.set({ activeSessionId: input.patch.activeSessionId, updatedAt: now })
-				.where(eq(userTaskState.taskId, input.taskId))
-				.run();
-		}
-
-		await c.db
-			.insert(userTaskState)
-			.values({
-				taskId: input.taskId,
-				sessionId: input.sessionId,
-				activeSessionId: input.patch.activeSessionId ?? existing?.activeSessionId ?? null,
-				unread: input.patch.unread !== undefined ? (input.patch.unread ? 1 : 0) : (existing?.unread ?? 0),
-				draftText: input.patch.draftText ?? existing?.draftText ?? "",
-				draftAttachmentsJson: input.patch.draftAttachmentsJson ?? existing?.draftAttachmentsJson ?? "[]",
-				draftUpdatedAt: input.patch.draftUpdatedAt === undefined ? (existing?.draftUpdatedAt ?? null) : input.patch.draftUpdatedAt,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: [userTaskState.taskId, userTaskState.sessionId],
-				set: {
-					...(input.patch.activeSessionId !== undefined ? { activeSessionId: input.patch.activeSessionId } : {}),
-					...(input.patch.unread !== undefined ? { unread: input.patch.unread ? 1 : 0 } : {}),
-					...(input.patch.draftText !== undefined ? { draftText: input.patch.draftText } : {}),
-					...(input.patch.draftAttachmentsJson !== undefined ? { draftAttachmentsJson: input.patch.draftAttachmentsJson } : {}),
-					...(input.patch.draftUpdatedAt !== undefined ? { draftUpdatedAt: input.patch.draftUpdatedAt } : {}),
-					updatedAt: now,
-				},
-			})
-			.run();
-
-		return await c.db
-			.select()
-			.from(userTaskState)
-			.where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId)))
-			.get();
-	},
-
-	async deleteTaskState(c, input: { taskId: string; sessionId?: string }) {
-		if (input.sessionId) {
-			await c.db
-				.delete(userTaskState)
-				.where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId)))
-				.run();
-			return;
-		}
-		await c.db.delete(userTaskState).where(eq(userTaskState.taskId, input.taskId)).run();
-	},
-};
diff --git a/foundry/packages/backend/src/actors/user/db/schema.ts b/foundry/packages/backend/src/actors/user/db/schema.ts
deleted file mode 100644
index 6a87a11..0000000
--- a/foundry/packages/backend/src/actors/user/db/schema.ts
+++ /dev/null
@@ -1,112 +0,0 @@
-import { check, integer, primaryKey, sqliteTable, text, uniqueIndex } from "drizzle-orm/sqlite-core";
-import { sql } from "drizzle-orm";
-import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared";
-
-/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */
-export const authUsers = sqliteTable(
-	"user",
-	{
-		id: integer("id").primaryKey(),
-		authUserId: text("auth_user_id").notNull(),
-		name: text("name").notNull(),
-		email: text("email").notNull(),
-		emailVerified: integer("email_verified").notNull(),
-		image: text("image"),
-		createdAt: integer("created_at").notNull(),
-		updatedAt: integer("updated_at").notNull(),
-	},
-	(table) => ({
-		authUserIdIdx: uniqueIndex("user_auth_user_id_idx").on(table.authUserId),
-		singletonCheck: check("user_singleton_id_check", sql`${table.id} = 1`),
-	}),
-);
-
-/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */
-export const authSessions = sqliteTable(
-	"session",
-	{
-		id: text("id").notNull().primaryKey(),
-		token: text("token").notNull(),
-		userId: text("user_id").notNull(),
-		expiresAt: integer("expires_at").notNull(),
-		ipAddress: text("ip_address"),
-		userAgent: text("user_agent"),
-		createdAt: integer("created_at").notNull(),
-		updatedAt: integer("updated_at").notNull(),
-	},
-	(table) => ({
-		tokenIdx: uniqueIndex("session_token_idx").on(table.token),
-	}),
-);
-
-/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */
-export const authAccounts = sqliteTable(
-	"account",
-	{
-		id: text("id").notNull().primaryKey(),
-		accountId: text("account_id").notNull(),
-		providerId: text("provider_id").notNull(),
-		userId: text("user_id").notNull(),
-		accessToken: text("access_token"),
-		refreshToken: text("refresh_token"),
-		idToken: text("id_token"),
-		accessTokenExpiresAt: integer("access_token_expires_at"),
-		refreshTokenExpiresAt: integer("refresh_token_expires_at"),
-		scope: text("scope"),
-		password: text("password"),
-		createdAt: integer("created_at").notNull(),
-		updatedAt: integer("updated_at").notNull(),
-	},
-	(table) => ({
-		providerAccountIdx: uniqueIndex("account_provider_account_idx").on(table.providerId, table.accountId),
-	}),
-);
-
-/** Custom Foundry table — not part of Better Auth. */
-export const userProfiles = sqliteTable(
-	"user_profiles",
-	{
-		id: integer("id").primaryKey(),
-		userId: text("user_id").notNull(),
-		githubAccountId: text("github_account_id"),
-		githubLogin: text("github_login"),
-		roleLabel: text("role_label").notNull(),
-		defaultModel: text("default_model").notNull().default(DEFAULT_WORKSPACE_MODEL_ID),
-		eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(),
-		starterRepoStatus: text("starter_repo_status").notNull(),
-		starterRepoStarredAt: integer("starter_repo_starred_at"),
-		starterRepoSkippedAt: integer("starter_repo_skipped_at"),
-		createdAt: integer("created_at").notNull(),
-		updatedAt: integer("updated_at").notNull(),
-	},
-	(table) => ({
-		userIdIdx: uniqueIndex("user_profiles_user_id_idx").on(table.userId),
-		singletonCheck: check("user_profiles_singleton_id_check", sql`${table.id} = 1`),
-	}),
-);
-
-/** Custom Foundry table — not part of Better Auth. */
-export const sessionState = sqliteTable("session_state", {
-	sessionId: text("session_id").notNull().primaryKey(),
-	activeOrganizationId: text("active_organization_id"),
-	createdAt: integer("created_at").notNull(),
-	updatedAt: integer("updated_at").notNull(),
-});
-
-/** Custom Foundry table — not part of Better Auth. Stores per-user task/session UI state.
*/ -export const userTaskState = sqliteTable( - "user_task_state", - { - taskId: text("task_id").notNull(), - sessionId: text("session_id").notNull(), - activeSessionId: text("active_session_id"), - unread: integer("unread").notNull().default(0), - draftText: text("draft_text").notNull().default(""), - draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"), - draftUpdatedAt: integer("draft_updated_at"), - updatedAt: integer("updated_at").notNull(), - }, - (table) => ({ - pk: primaryKey({ columns: [table.taskId, table.sessionId] }), - }), -); diff --git a/foundry/packages/backend/src/actors/user/index.ts b/foundry/packages/backend/src/actors/user/index.ts deleted file mode 100644 index 0deb1cb..0000000 --- a/foundry/packages/backend/src/actors/user/index.ts +++ /dev/null @@ -1,20 +0,0 @@ -import { actor } from "rivetkit"; -import { userDb } from "./db/db.js"; -import { betterAuthActions } from "./actions/better-auth.js"; -import { userActions } from "./actions/user.js"; - -export const user = actor({ - db: userDb, - options: { - name: "User", - icon: "shield", - actionTimeout: 60_000, - }, - createState: (_c, input: { userId: string }) => ({ - userId: input.userId, - }), - actions: { - ...betterAuthActions, - ...userActions, - }, -}); diff --git a/foundry/packages/backend/src/actors/user/query-helpers.ts b/foundry/packages/backend/src/actors/user/query-helpers.ts deleted file mode 100644 index 5bdee10..0000000 --- a/foundry/packages/backend/src/actors/user/query-helpers.ts +++ /dev/null @@ -1,197 +0,0 @@ -import { and, eq, inArray, isNotNull, isNull, like, lt, lte, gt, gte, ne, notInArray, or } from "drizzle-orm"; -import { authAccounts, authSessions, authUsers, sessionState, userProfiles, userTaskState } from "./db/schema.js"; - -export const userTables = { - user: authUsers, - session: authSessions, - account: authAccounts, - userProfiles, - sessionState, - userTaskState, -} as const; - -export function tableFor(model: string) { - const table 
= userTables[model as keyof typeof userTables];
-  if (!table) {
-    throw new Error(`Unsupported user model: ${model}`);
-  }
-  return table as any;
-}
-
-function dbFieldFor(model: string, field: string): string {
-  if (model === "user" && field === "id") {
-    return "authUserId";
-  }
-  return field;
-}
-
-export function materializeRow(model: string, row: any) {
-  if (!row || model !== "user") {
-    return row;
-  }
-
-  const { id: _singletonId, authUserId, ...rest } = row;
-  return {
-    id: authUserId,
-    ...rest,
-  };
-}
-
-export function persistInput(model: string, data: Record<string, unknown>) {
-  if (model !== "user") {
-    return data;
-  }
-
-  const { id, ...rest } = data;
-  return {
-    id: 1,
-    authUserId: id,
-    ...rest,
-  };
-}
-
-export function persistPatch(model: string, data: Record<string, unknown>) {
-  if (model !== "user") {
-    return data;
-  }
-
-  const { id, ...rest } = data;
-  return {
-    ...(id !== undefined ? { authUserId: id } : {}),
-    ...rest,
-  };
-}
-
-export function columnFor(model: string, table: any, field: string) {
-  const column = table[dbFieldFor(model, field)];
-  if (!column) {
-    throw new Error(`Unsupported user field: ${model}.${field}`);
-  }
-  return column;
-}
-
-export function normalizeValue(value: unknown): unknown {
-  if (value instanceof Date) {
-    return value.getTime();
-  }
-  if (Array.isArray(value)) {
-    return value.map((entry) => normalizeValue(entry));
-  }
-  return value;
-}
-
-export function clauseToExpr(table: any, clause: any) {
-  const model = table === authUsers ? "user" : table === authSessions ? "session" : table === authAccounts ? "account" : "";
-  const column = columnFor(model, table, clause.field);
-  const value = normalizeValue(clause.value);
-
-  switch (clause.operator) {
-    case "ne":
-      return value === null ?
isNotNull(column) : ne(column, value as any); - case "lt": - return lt(column, value as any); - case "lte": - return lte(column, value as any); - case "gt": - return gt(column, value as any); - case "gte": - return gte(column, value as any); - case "in": - return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "not_in": - return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "contains": - return like(column, `%${String(value ?? "")}%`); - case "starts_with": - return like(column, `${String(value ?? "")}%`); - case "ends_with": - return like(column, `%${String(value ?? "")}`); - case "eq": - default: - return value === null ? isNull(column) : eq(column, value as any); - } -} - -export function buildWhere(table: any, where: any[] | undefined) { - if (!where || where.length === 0) { - return undefined; - } - - let expr = clauseToExpr(table, where[0]); - for (const clause of where.slice(1)) { - const next = clauseToExpr(table, clause); - expr = clause.connector === "OR" ? or(expr, next) : and(expr, next); - } - return expr; -} - -export function applyJoinToRow(c: any, model: string, row: any, join: any) { - const materialized = materializeRow(model, row); - if (!materialized || !join) { - return materialized; - } - - if (model === "session" && join.user) { - return c.db - .select() - .from(authUsers) - .where(eq(authUsers.authUserId, materialized.userId)) - .get() - .then((user: any) => ({ ...materialized, user: materializeRow("user", user) ?? null })); - } - - if (model === "account" && join.user) { - return c.db - .select() - .from(authUsers) - .where(eq(authUsers.authUserId, materialized.userId)) - .get() - .then((user: any) => ({ ...materialized, user: materializeRow("user", user) ?? 
null })); - } - - if (model === "user" && join.account) { - return c.db - .select() - .from(authAccounts) - .where(eq(authAccounts.userId, materialized.id)) - .all() - .then((accounts: any[]) => ({ ...materialized, account: accounts })); - } - - return Promise.resolve(materialized); -} - -export async function applyJoinToRows(c: any, model: string, rows: any[], join: any) { - if (!join || rows.length === 0) { - return rows.map((row) => materializeRow(model, row)); - } - - if (model === "session" && join.user) { - const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; - const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.authUserId, userIds)).all() : []; - const userMap = new Map(users.map((user: any) => [user.authUserId, materializeRow("user", user)])); - return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); - } - - if (model === "account" && join.user) { - const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; - const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.authUserId, userIds)).all() : []; - const userMap = new Map(users.map((user: any) => [user.authUserId, materializeRow("user", user)])); - return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); - } - - if (model === "user" && join.account) { - const materializedRows = rows.map((row) => materializeRow("user", row)); - const userIds = materializedRows.map((row) => row.id); - const accounts = userIds.length > 0 ? await c.db.select().from(authAccounts).where(inArray(authAccounts.userId, userIds)).all() : []; - const accountsByUserId = new Map(); - for (const account of accounts) { - const entries = accountsByUserId.get(account.userId) ?? []; - entries.push(account); - accountsByUserId.set(account.userId, entries); - } - return materializedRows.map((row) => ({ ...row, account: accountsByUserId.get(row.id) ?? 
[] })); - } - - return rows.map((row) => materializeRow(model, row)); -} diff --git a/foundry/packages/backend/src/actors/workspace/actions.ts b/foundry/packages/backend/src/actors/workspace/actions.ts new file mode 100644 index 0000000..0ba55f8 --- /dev/null +++ b/foundry/packages/backend/src/actors/workspace/actions.ts @@ -0,0 +1,830 @@ +// @ts-nocheck +import { desc, eq } from "drizzle-orm"; +import { Loop } from "rivetkit/workflow"; +import type { + AddRepoInput, + CreateTaskInput, + HistoryEvent, + HistoryQueryInput, + ListTasksInput, + ProviderId, + RepoOverview, + RepoRecord, + RepoStackActionInput, + RepoStackActionResult, + StarSandboxAgentRepoInput, + StarSandboxAgentRepoResult, + SwitchResult, + TaskRecord, + TaskSummary, + TaskWorkbenchChangeModelInput, + TaskWorkbenchCreateTaskInput, + TaskWorkbenchDiffInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + TaskWorkbenchSelectInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchTabInput, + TaskWorkbenchUpdateDraftInput, + WorkbenchRepoSummary, + WorkbenchSessionSummary, + WorkbenchTaskSummary, + WorkspaceEvent, + WorkspaceSummarySnapshot, + WorkspaceUseInput, +} from "@sandbox-agent/foundry-shared"; +import { getActorRuntimeContext } from "../context.js"; +import { getTask, getOrCreateHistory, getOrCreateProject, selfWorkspace } from "../handles.js"; +import { logActorWarning, resolveErrorMessage } from "../logging.js"; +import { normalizeRemoteUrl, repoIdFromRemote } from "../../services/repo.js"; +import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js"; +import { taskLookup, repos, providerProfiles, taskSummaries } from "./db/schema.js"; +import { agentTypeForModel } from "../task/workbench.js"; +import { expectQueueResponse } from "../../services/queue.js"; +import { workspaceAppActions } from "./app-shell.js"; + +interface WorkspaceState { + workspaceId: string; +} + +interface RefreshProviderProfilesCommand { + providerId?: 
ProviderId;
+}
+
+interface GetTaskInput {
+  workspaceId: string;
+  taskId: string;
+}
+
+interface TaskProxyActionInput extends GetTaskInput {
+  reason?: string;
+}
+
+interface RepoOverviewInput {
+  workspaceId: string;
+  repoId: string;
+}
+
+const WORKSPACE_QUEUE_NAMES = [
+  "workspace.command.addRepo",
+  "workspace.command.createTask",
+  "workspace.command.refreshProviderProfiles",
+  "workspace.command.syncGithubOrganizationRepos",
+  "workspace.command.syncGithubSession",
+] as const;
+const SANDBOX_AGENT_REPO = "rivet-dev/sandbox-agent";
+
+type WorkspaceQueueName = (typeof WORKSPACE_QUEUE_NAMES)[number];
+
+export { WORKSPACE_QUEUE_NAMES };
+
+export function workspaceWorkflowQueueName(name: WorkspaceQueueName): WorkspaceQueueName {
+  return name;
+}
+
+function assertWorkspace(c: { state: WorkspaceState }, workspaceId: string): void {
+  if (workspaceId !== c.state.workspaceId) {
+    throw new Error(`Workspace actor mismatch: actor=${c.state.workspaceId} command=${workspaceId}`);
+  }
+}
+
+async function resolveRepoId(c: any, taskId: string): Promise<string> {
+  const row = await c.db.select({ repoId: taskLookup.repoId }).from(taskLookup).where(eq(taskLookup.taskId, taskId)).get();
+
+  if (!row) {
+    throw new Error(`Unknown task: ${taskId} (not in lookup)`);
+  }
+
+  return row.repoId;
+}
+
+async function upsertTaskLookupRow(c: any, taskId: string, repoId: string): Promise<void> {
+  await c.db
+    .insert(taskLookup)
+    .values({
+      taskId,
+      repoId,
+    })
+    .onConflictDoUpdate({
+      target: taskLookup.taskId,
+      set: { repoId },
+    })
+    .run();
+}
+
+function parseJsonValue<T>(value: string | null | undefined, fallback: T): T {
+  if (!value) {
+    return fallback;
+  }
+
+  try {
+    return JSON.parse(value) as T;
+  } catch {
+    return fallback;
+  }
+}
+
+async function collectAllTaskSummaries(c: any): Promise<TaskSummary[]> {
+  const repoRows = await c.db.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all();
+
+  const all: TaskSummary[] = [];
+  for
(const row of repoRows) { + try { + const project = await getOrCreateProject(c, c.state.workspaceId, row.repoId, row.remoteUrl); + const snapshot = await project.listTaskSummaries({ includeArchived: true }); + all.push(...snapshot); + } catch (error) { + logActorWarning("workspace", "failed collecting tasks for repo", { + workspaceId: c.state.workspaceId, + repoId: row.repoId, + error: resolveErrorMessage(error), + }); + } + } + + all.sort((a, b) => b.updatedAt - a.updatedAt); + return all; +} + +function repoLabelFromRemote(remoteUrl: string): string { + try { + const url = new URL(remoteUrl.startsWith("http") ? remoteUrl : `https://${remoteUrl}`); + const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); + if (parts.length >= 2) { + return `${parts[0]}/${(parts[1] ?? "").replace(/\.git$/, "")}`; + } + } catch { + // ignore + } + + return remoteUrl; +} + +function buildRepoSummary(repoRow: { repoId: string; remoteUrl: string; updatedAt: number }, taskRows: WorkbenchTaskSummary[]): WorkbenchRepoSummary { + const repoTasks = taskRows.filter((task) => task.repoId === repoRow.repoId); + const latestActivityMs = repoTasks.reduce((latest, task) => Math.max(latest, task.updatedAtMs), repoRow.updatedAt); + + return { + id: repoRow.repoId, + label: repoLabelFromRemote(repoRow.remoteUrl), + taskCount: repoTasks.length, + latestActivityMs, + }; +} + +function taskSummaryRowFromSummary(taskSummary: WorkbenchTaskSummary) { + return { + taskId: taskSummary.id, + repoId: taskSummary.repoId, + title: taskSummary.title, + status: taskSummary.status, + repoName: taskSummary.repoName, + updatedAtMs: taskSummary.updatedAtMs, + branch: taskSummary.branch, + pullRequestJson: JSON.stringify(taskSummary.pullRequest), + sessionsSummaryJson: JSON.stringify(taskSummary.sessionsSummary), + }; +} + +function taskSummaryFromRow(row: any): WorkbenchTaskSummary { + return { + id: row.taskId, + repoId: row.repoId, + title: row.title, + status: row.status, + repoName: 
row.repoName, + updatedAtMs: row.updatedAtMs, + branch: row.branch ?? null, + pullRequest: parseJsonValue(row.pullRequestJson, null), + sessionsSummary: parseJsonValue(row.sessionsSummaryJson, []), + }; +} + +async function reconcileWorkbenchProjection(c: any): Promise { + const repoRows = await c.db + .select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl, updatedAt: repos.updatedAt }) + .from(repos) + .orderBy(desc(repos.updatedAt)) + .all(); + + const taskRows: WorkbenchTaskSummary[] = []; + for (const row of repoRows) { + try { + const project = await getOrCreateProject(c, c.state.workspaceId, row.repoId, row.remoteUrl); + const summaries = await project.listTaskSummaries({ includeArchived: true }); + for (const summary of summaries) { + try { + await upsertTaskLookupRow(c, summary.taskId, row.repoId); + const task = getTask(c, c.state.workspaceId, row.repoId, summary.taskId); + const taskSummary = await task.getTaskSummary({}); + taskRows.push(taskSummary); + await c.db + .insert(taskSummaries) + .values(taskSummaryRowFromSummary(taskSummary)) + .onConflictDoUpdate({ + target: taskSummaries.taskId, + set: taskSummaryRowFromSummary(taskSummary), + }) + .run(); + } catch (error) { + logActorWarning("workspace", "failed collecting task summary during reconciliation", { + workspaceId: c.state.workspaceId, + repoId: row.repoId, + taskId: summary.taskId, + error: resolveErrorMessage(error), + }); + } + } + } catch (error) { + logActorWarning("workspace", "failed collecting repo during workbench reconciliation", { + workspaceId: c.state.workspaceId, + repoId: row.repoId, + error: resolveErrorMessage(error), + }); + } + } + + taskRows.sort((left, right) => right.updatedAtMs - left.updatedAtMs); + return { + workspaceId: c.state.workspaceId, + repos: repoRows.map((row) => buildRepoSummary(row, taskRows)).sort((left, right) => right.latestActivityMs - left.latestActivityMs), + taskSummaries: taskRows, + }; +} + +async function requireWorkbenchTask(c: any, taskId: 
string) { + const repoId = await resolveRepoId(c, taskId); + return getTask(c, c.state.workspaceId, repoId, taskId); +} + +/** + * Reads the workspace sidebar snapshot from the workspace actor's local SQLite + * only. Task actors push summary updates into `task_summaries`, so clients do + * not need this action to fan out to every child actor on the hot read path. + */ +async function getWorkspaceSummarySnapshot(c: any): Promise { + const repoRows = await c.db + .select({ + repoId: repos.repoId, + remoteUrl: repos.remoteUrl, + updatedAt: repos.updatedAt, + }) + .from(repos) + .orderBy(desc(repos.updatedAt)) + .all(); + const taskRows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); + const summaries = taskRows.map(taskSummaryFromRow); + + return { + workspaceId: c.state.workspaceId, + repos: repoRows.map((row) => buildRepoSummary(row, summaries)).sort((left, right) => right.latestActivityMs - left.latestActivityMs), + taskSummaries: summaries, + }; +} + +async function broadcastRepoSummary( + c: any, + type: "repoAdded" | "repoUpdated", + repoRow: { repoId: string; remoteUrl: string; updatedAt: number }, +): Promise { + const matchingTaskRows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoRow.repoId)).all(); + const repo = buildRepoSummary(repoRow, matchingTaskRows.map(taskSummaryFromRow)); + c.broadcast("workspaceUpdated", { type, repo } satisfies WorkspaceEvent); +} + +async function addRepoMutation(c: any, input: AddRepoInput): Promise { + assertWorkspace(c, input.workspaceId); + + const remoteUrl = normalizeRemoteUrl(input.remoteUrl); + if (!remoteUrl) { + throw new Error("remoteUrl is required"); + } + + const { driver } = getActorRuntimeContext(); + const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId); + await driver.git.validateRemote(remoteUrl, { githubToken: auth?.githubToken ?? 
null }); + + const repoId = repoIdFromRemote(remoteUrl); + const now = Date.now(); + const existing = await c.db.select({ repoId: repos.repoId }).from(repos).where(eq(repos.repoId, repoId)).get(); + + await c.db + .insert(repos) + .values({ + repoId, + remoteUrl, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: repos.repoId, + set: { + remoteUrl, + updatedAt: now, + }, + }) + .run(); + + await broadcastRepoSummary(c, existing ? "repoUpdated" : "repoAdded", { + repoId, + remoteUrl, + updatedAt: now, + }); + return { + workspaceId: c.state.workspaceId, + repoId, + remoteUrl, + createdAt: now, + updatedAt: now, + }; +} + +async function createTaskMutation(c: any, input: CreateTaskInput): Promise { + assertWorkspace(c, input.workspaceId); + + const { providers } = getActorRuntimeContext(); + const providerId = input.providerId ?? providers.defaultProviderId(); + + const repoId = input.repoId; + const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, repoId)).get(); + if (!repoRow) { + throw new Error(`Unknown repo: ${repoId}`); + } + const remoteUrl = repoRow.remoteUrl; + + await c.db + .insert(providerProfiles) + .values({ + providerId, + profileJson: JSON.stringify({ providerId }), + updatedAt: Date.now(), + }) + .onConflictDoUpdate({ + target: providerProfiles.providerId, + set: { + profileJson: JSON.stringify({ providerId }), + updatedAt: Date.now(), + }, + }) + .run(); + + const project = await getOrCreateProject(c, c.state.workspaceId, repoId, remoteUrl); + await project.ensure({ remoteUrl }); + + const created = await project.createTask({ + task: input.task, + providerId, + agentType: input.agentType ?? null, + explicitTitle: input.explicitTitle ?? null, + explicitBranchName: input.explicitBranchName ?? null, + onBranch: input.onBranch ?? 
null, + }); + + await c.db + .insert(taskLookup) + .values({ + taskId: created.taskId, + repoId, + }) + .onConflictDoUpdate({ + target: taskLookup.taskId, + set: { repoId }, + }) + .run(); + + try { + const task = getTask(c, c.state.workspaceId, repoId, created.taskId); + await workspaceActions.applyTaskSummaryUpdate(c, { + taskSummary: await task.getTaskSummary({}), + }); + } catch (error) { + logActorWarning("workspace", "failed seeding task summary after task creation", { + workspaceId: c.state.workspaceId, + repoId, + taskId: created.taskId, + error: resolveErrorMessage(error), + }); + } + + return created; +} + +async function refreshProviderProfilesMutation(c: any, command?: RefreshProviderProfilesCommand): Promise { + const body = command ?? {}; + const { providers } = getActorRuntimeContext(); + const providerIds: ProviderId[] = body.providerId ? [body.providerId] : providers.availableProviderIds(); + + for (const providerId of providerIds) { + await c.db + .insert(providerProfiles) + .values({ + providerId, + profileJson: JSON.stringify({ providerId }), + updatedAt: Date.now(), + }) + .onConflictDoUpdate({ + target: providerProfiles.providerId, + set: { + profileJson: JSON.stringify({ providerId }), + updatedAt: Date.now(), + }, + }) + .run(); + } +} + +export async function runWorkspaceWorkflow(ctx: any): Promise { + await ctx.loop("workspace-command-loop", async (loopCtx: any) => { + const msg = await loopCtx.queue.next("next-workspace-command", { + names: [...WORKSPACE_QUEUE_NAMES], + completable: true, + }); + if (!msg) { + return Loop.continue(undefined); + } + + if (msg.name === "workspace.command.addRepo") { + const result = await loopCtx.step({ + name: "workspace-add-repo", + timeout: 60_000, + run: async () => addRepoMutation(loopCtx, msg.body as AddRepoInput), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "workspace.command.createTask") { + const result = await loopCtx.step({ + name: 
"workspace-create-task", + timeout: 12 * 60_000, + run: async () => createTaskMutation(loopCtx, msg.body as CreateTaskInput), + }); + await msg.complete(result); + return Loop.continue(undefined); + } + + if (msg.name === "workspace.command.refreshProviderProfiles") { + await loopCtx.step("workspace-refresh-provider-profiles", async () => + refreshProviderProfilesMutation(loopCtx, msg.body as RefreshProviderProfilesCommand), + ); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "workspace.command.syncGithubSession") { + await loopCtx.step({ + name: "workspace-sync-github-session", + timeout: 60_000, + run: async () => { + const { syncGithubOrganizations } = await import("./app-shell.js"); + await syncGithubOrganizations(loopCtx, msg.body as { sessionId: string; accessToken: string }); + }, + }); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + if (msg.name === "workspace.command.syncGithubOrganizationRepos") { + await loopCtx.step({ + name: "workspace-sync-github-organization-repos", + timeout: 60_000, + run: async () => { + const { syncGithubOrganizationRepos } = await import("./app-shell.js"); + await syncGithubOrganizationRepos(loopCtx, msg.body as { sessionId: string; organizationId: string }); + }, + }); + await msg.complete({ ok: true }); + return Loop.continue(undefined); + } + + return Loop.continue(undefined); + }); +} + +export const workspaceActions = { + ...workspaceAppActions, + async useWorkspace(c: any, input: WorkspaceUseInput): Promise<{ workspaceId: string }> { + assertWorkspace(c, input.workspaceId); + return { workspaceId: c.state.workspaceId }; + }, + + async addRepo(c: any, input: AddRepoInput): Promise { + const self = selfWorkspace(c); + return expectQueueResponse( + await self.send(workspaceWorkflowQueueName("workspace.command.addRepo"), input, { + wait: true, + timeout: 60_000, + }), + ); + }, + + async listRepos(c: any, input: WorkspaceUseInput): Promise { + 
assertWorkspace(c, input.workspaceId); + + const rows = await c.db + .select({ + repoId: repos.repoId, + remoteUrl: repos.remoteUrl, + createdAt: repos.createdAt, + updatedAt: repos.updatedAt, + }) + .from(repos) + .orderBy(desc(repos.updatedAt)) + .all(); + + return rows.map((row) => ({ + workspaceId: c.state.workspaceId, + repoId: row.repoId, + remoteUrl: row.remoteUrl, + createdAt: row.createdAt, + updatedAt: row.updatedAt, + })); + }, + + async createTask(c: any, input: CreateTaskInput): Promise { + const self = selfWorkspace(c); + return expectQueueResponse( + await self.send(workspaceWorkflowQueueName("workspace.command.createTask"), input, { + wait: true, + timeout: 12 * 60_000, + }), + ); + }, + + async starSandboxAgentRepo(c: any, input: StarSandboxAgentRepoInput): Promise { + assertWorkspace(c, input.workspaceId); + const { driver } = getActorRuntimeContext(); + await driver.github.starRepository(SANDBOX_AGENT_REPO); + return { + repo: SANDBOX_AGENT_REPO, + starredAt: Date.now(), + }; + }, + + /** + * Called by task actors when their summary-level state changes. + * This is the write path for the local materialized projection; clients read + * the projection via `getWorkspaceSummary`, but only task actors should push + * rows into it. 
+ */ + async applyTaskSummaryUpdate(c: any, input: { taskSummary: WorkbenchTaskSummary }): Promise { + await c.db + .insert(taskSummaries) + .values(taskSummaryRowFromSummary(input.taskSummary)) + .onConflictDoUpdate({ + target: taskSummaries.taskId, + set: taskSummaryRowFromSummary(input.taskSummary), + }) + .run(); + c.broadcast("workspaceUpdated", { type: "taskSummaryUpdated", taskSummary: input.taskSummary } satisfies WorkspaceEvent); + }, + + async removeTaskSummary(c: any, input: { taskId: string }): Promise { + await c.db.delete(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).run(); + c.broadcast("workspaceUpdated", { type: "taskRemoved", taskId: input.taskId } satisfies WorkspaceEvent); + }, + + async getWorkspaceSummary(c: any, input: WorkspaceUseInput): Promise { + assertWorkspace(c, input.workspaceId); + return await getWorkspaceSummarySnapshot(c); + }, + + async reconcileWorkbenchState(c: any, input: WorkspaceUseInput): Promise { + assertWorkspace(c, input.workspaceId); + return await reconcileWorkbenchProjection(c); + }, + + async createWorkbenchTask(c: any, input: TaskWorkbenchCreateTaskInput): Promise<{ taskId: string; tabId?: string }> { + const created = await workspaceActions.createTask(c, { + workspaceId: c.state.workspaceId, + repoId: input.repoId, + task: input.task, + ...(input.title ? { explicitTitle: input.title } : {}), + ...(input.branch ? { explicitBranchName: input.branch } : {}), + ...(input.model ? 
{ agentType: agentTypeForModel(input.model) } : {}), + }); + return { + taskId: created.taskId, + }; + }, + + async markWorkbenchUnread(c: any, input: TaskWorkbenchSelectInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.markWorkbenchUnread({}); + }, + + async renameWorkbenchTask(c: any, input: TaskWorkbenchRenameInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.renameWorkbenchTask(input); + }, + + async renameWorkbenchBranch(c: any, input: TaskWorkbenchRenameInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.renameWorkbenchBranch(input); + }, + + async createWorkbenchSession(c: any, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ tabId: string }> { + const task = await requireWorkbenchTask(c, input.taskId); + return await task.createWorkbenchSession({ ...(input.model ? { model: input.model } : {}) }); + }, + + async renameWorkbenchSession(c: any, input: TaskWorkbenchRenameSessionInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.renameWorkbenchSession(input); + }, + + async setWorkbenchSessionUnread(c: any, input: TaskWorkbenchSetSessionUnreadInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.setWorkbenchSessionUnread(input); + }, + + async updateWorkbenchDraft(c: any, input: TaskWorkbenchUpdateDraftInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.updateWorkbenchDraft(input); + }, + + async changeWorkbenchModel(c: any, input: TaskWorkbenchChangeModelInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.changeWorkbenchModel(input); + }, + + async sendWorkbenchMessage(c: any, input: TaskWorkbenchSendMessageInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.sendWorkbenchMessage(input); + }, + + async stopWorkbenchSession(c: any, input: 
TaskWorkbenchTabInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.stopWorkbenchSession(input); + }, + + async closeWorkbenchSession(c: any, input: TaskWorkbenchTabInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.closeWorkbenchSession(input); + }, + + async publishWorkbenchPr(c: any, input: TaskWorkbenchSelectInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.publishWorkbenchPr({}); + }, + + async revertWorkbenchFile(c: any, input: TaskWorkbenchDiffInput): Promise { + const task = await requireWorkbenchTask(c, input.taskId); + await task.revertWorkbenchFile(input); + }, + + async listTasks(c: any, input: ListTasksInput): Promise { + assertWorkspace(c, input.workspaceId); + + if (input.repoId) { + const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, input.repoId)).get(); + if (!repoRow) { + throw new Error(`Unknown repo: ${input.repoId}`); + } + + const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl); + return await project.listTaskSummaries({ includeArchived: true }); + } + + return await collectAllTaskSummaries(c); + }, + + async getRepoOverview(c: any, input: RepoOverviewInput): Promise { + assertWorkspace(c, input.workspaceId); + + const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, input.repoId)).get(); + if (!repoRow) { + throw new Error(`Unknown repo: ${input.repoId}`); + } + + const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl); + await project.ensure({ remoteUrl: repoRow.remoteUrl }); + return await project.getRepoOverview({}); + }, + + async runRepoStackAction(c: any, input: RepoStackActionInput): Promise { + assertWorkspace(c, input.workspaceId); + + const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, 
input.repoId)).get();
+		if (!repoRow) {
+			throw new Error(`Unknown repo: ${input.repoId}`);
+		}
+
+		const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
+		await project.ensure({ remoteUrl: repoRow.remoteUrl });
+		return await project.runRepoStackAction({
+			action: input.action,
+			branchName: input.branchName,
+			parentBranch: input.parentBranch,
+		});
+	},
+
+	async switchTask(c: any, taskId: string) {
+		const repoId = await resolveRepoId(c, taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, taskId);
+		const record = await h.get();
+		const switched = await h.switch();
+
+		return {
+			workspaceId: c.state.workspaceId,
+			taskId,
+			providerId: record.providerId,
+			switchTarget: switched.switchTarget,
+		};
+	},
+
+	async refreshProviderProfiles(c: any, command?: RefreshProviderProfilesCommand): Promise<void> {
+		const self = selfWorkspace(c);
+		await self.send(workspaceWorkflowQueueName("workspace.command.refreshProviderProfiles"), command ?? {}, {
+			wait: true,
+			timeout: 60_000,
+		});
+	},
+
+	async history(c: any, input: HistoryQueryInput): Promise<HistoryEvent[]> {
+		assertWorkspace(c, input.workspaceId);
+
+		const limit = input.limit ??
20;
+		const repoRows = await c.db.select({ repoId: repos.repoId }).from(repos).all();
+
+		const allEvents: HistoryEvent[] = [];
+
+		for (const row of repoRows) {
+			try {
+				const hist = await getOrCreateHistory(c, c.state.workspaceId, row.repoId);
+				const items = await hist.list({
+					branch: input.branch,
+					taskId: input.taskId,
+					limit,
+				});
+				allEvents.push(...items);
+			} catch (error) {
+				logActorWarning("workspace", "history lookup failed for repo", {
+					workspaceId: c.state.workspaceId,
+					repoId: row.repoId,
+					error: resolveErrorMessage(error),
+				});
+			}
+		}
+
+		allEvents.sort((a, b) => b.createdAt - a.createdAt);
+		return allEvents.slice(0, limit);
+	},
+
+	async getTask(c: any, input: GetTaskInput) {
+		assertWorkspace(c, input.workspaceId);
+
+		const repoId = await resolveRepoId(c, input.taskId);
+
+		const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, repoId)).get();
+		if (!repoRow) {
+			throw new Error(`Unknown repo: ${repoId}`);
+		}
+
+		const project = await getOrCreateProject(c, c.state.workspaceId, repoId, repoRow.remoteUrl);
+		return await project.getTaskEnriched({ taskId: input.taskId });
+	},
+
+	async attachTask(c: any, input: TaskProxyActionInput): Promise<{ target: string; sessionId: string | null }> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		return await h.attach({ reason: input.reason });
+	},
+
+	async pushTask(c: any, input: TaskProxyActionInput): Promise<void> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		await h.push({ reason: input.reason });
+	},
+
+	async syncTask(c: any, input: TaskProxyActionInput): Promise<void> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		await h.sync({ reason: input.reason });
+	},
+
+	async mergeTask(c: any, input: TaskProxyActionInput): Promise<void> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		await h.merge({ reason: input.reason });
+	},
+
+	async archiveTask(c: any, input: TaskProxyActionInput): Promise<void> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		await h.archive({ reason: input.reason });
+	},
+
+	async killTask(c: any, input: TaskProxyActionInput): Promise<void> {
+		assertWorkspace(c, input.workspaceId);
+		const repoId = await resolveRepoId(c, input.taskId);
+		const h = getTask(c, c.state.workspaceId, repoId, input.taskId);
+		await h.kill({ reason: input.reason });
+	},
+};
diff --git a/foundry/packages/backend/src/actors/workspace/app-shell.ts b/foundry/packages/backend/src/actors/workspace/app-shell.ts
new file mode 100644
index 0000000..7f6e73f
--- /dev/null
+++ b/foundry/packages/backend/src/actors/workspace/app-shell.ts
@@ -0,0 +1,1955 @@
+import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm";
+import { randomUUID } from "node:crypto";
+import type {
+	FoundryAppSnapshot,
+	FoundryBillingPlanId,
+	FoundryBillingState,
+	FoundryOrganization,
+	FoundryOrganizationMember,
+	FoundryUser,
+	UpdateFoundryOrganizationProfileInput,
+} from "@sandbox-agent/foundry-shared";
+import { getActorRuntimeContext } from "../context.js";
+import { getOrCreateWorkspace, selfWorkspace } from "../handles.js";
+import { GitHubAppError } from "../../services/app-github.js";
+import { getBetterAuthService } from "../../services/better-auth.js";
+import { repoIdFromRemote, repoLabelFromRemote } from "../../services/repo.js";
+import { logger } from "../../logging.js";
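The `workspaceAuthClause` / `workspaceAuthWhere` helpers in this file fold Better Auth's `{ field, value, operator, connector }` clause arrays into a single Drizzle predicate. A minimal, dependency-free sketch of the same fold follows; it models predicates as plain row filters instead of Drizzle expressions so the AND/OR connector semantics are easy to exercise in isolation. The `Row` shape and the operator subset here are illustrative assumptions, not part of the codebase.

```typescript
// Illustrative sketch only: mirrors the clause-folding strategy of
// workspaceAuthClause/workspaceAuthWhere, with row filters standing in
// for Drizzle predicates.

type Row = Record<string, unknown>;
type Clause = { field: string; value: unknown; operator?: string; connector?: "AND" | "OR" };
type Predicate = (row: Row) => boolean;

// Map a single clause to a row filter (subset of the real operator set).
function clauseToPredicate(clause: Clause): Predicate {
  const { field, value } = clause;
  switch (clause.operator) {
    case "ne":
      return (row) => row[field] !== value;
    case "in":
      return (row) => Array.isArray(value) && value.includes(row[field]);
    case "starts_with":
      return (row) => String(row[field]).startsWith(String(value ?? ""));
    case "eq":
    default:
      return (row) => row[field] === value;
  }
}

// Left-fold over the clause list: each clause combines with the accumulated
// predicate via its own connector ("AND" unless it says "OR"), with no
// precedence grouping -- the same shape as workspaceAuthWhere.
function clausesToPredicate(clauses: Clause[]): Predicate | undefined {
  if (clauses.length === 0) return undefined;
  let acc = clauseToPredicate(clauses[0]);
  for (const clause of clauses.slice(1)) {
    const next = clauseToPredicate(clause);
    const prev = acc;
    acc = clause.connector === "OR" ? (row) => prev(row) || next(row) : (row) => prev(row) && next(row);
  }
  return acc;
}
```

Because the fold is strictly left-to-right, a clause list mixing `AND` and `OR` has no implicit parenthesization; callers that need grouping must pre-group clauses themselves, which matches how the helper below behaves with `and(...)`/`or(...)`.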
+import {
+	authAccountIndex,
+	authEmailIndex,
+	authSessionIndex,
+	authVerification,
+	invoices,
+	organizationMembers,
+	organizationProfile,
+	repos,
+	seatAssignments,
+	stripeLookup,
+} from "./db/schema.js";
+
+export const APP_SHELL_WORKSPACE_ID = "app";
+
+// ── Better Auth adapter where-clause helpers ──
+// These convert the adapter's `{ field, value, operator }` clause arrays into
+// Drizzle predicates for workspace-level auth index / verification tables.
+
+function workspaceAuthColumn(table: any, field: string): any {
+	const column = table[field];
+	if (!column) {
+		throw new Error(`Unknown auth table field: ${field}`);
+	}
+	return column;
+}
+
+function normalizeAuthValue(value: unknown): unknown {
+	if (value instanceof Date) {
+		return value.getTime();
+	}
+	if (Array.isArray(value)) {
+		return value.map((entry) => normalizeAuthValue(entry));
+	}
+	return value;
+}
+
+function workspaceAuthClause(table: any, clause: { field: string; value: unknown; operator?: string }): any {
+	const column = workspaceAuthColumn(table, clause.field);
+	const value = normalizeAuthValue(clause.value);
+	switch (clause.operator) {
+		case "ne":
+			return value === null ? isNotNull(column) : ne(column, value as any);
+		case "lt":
+			return lt(column, value as any);
+		case "lte":
+			return lte(column, value as any);
+		case "gt":
+			return gt(column, value as any);
+		case "gte":
+			return gte(column, value as any);
+		case "in":
+			return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]);
+		case "not_in":
+			return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]);
+		case "contains":
+			return like(column, `%${String(value ?? "")}%`);
+		case "starts_with":
+			return like(column, `${String(value ?? "")}%`);
+		case "ends_with":
+			return like(column, `%${String(value ?? "")}`);
+		case "eq":
+		default:
+			return value === null ?
isNull(column) : eq(column, value as any);
+	}
+}
+
+function workspaceAuthWhere(table: any, clauses: any[] | undefined): any {
+	if (!clauses || clauses.length === 0) {
+		return undefined;
+	}
+	let expr = workspaceAuthClause(table, clauses[0]);
+	for (const clause of clauses.slice(1)) {
+		const next = workspaceAuthClause(table, clause);
+		expr = clause.connector === "OR" ? or(expr, next) : and(expr, next);
+	}
+	return expr;
+}
+
+const githubWebhookLogger = logger.child({
+	scope: "github-webhook",
+});
+
+const PROFILE_ROW_ID = "profile";
+
+function roundDurationMs(start: number): number {
+	return Math.round((performance.now() - start) * 100) / 100;
+}
+
+function assertAppWorkspace(c: any): void {
+	if (c.state.workspaceId !== APP_SHELL_WORKSPACE_ID) {
+		throw new Error(`App shell action requires workspace ${APP_SHELL_WORKSPACE_ID}, got ${c.state.workspaceId}`);
+	}
+}
+
+function assertOrganizationWorkspace(c: any): void {
+	if (c.state.workspaceId === APP_SHELL_WORKSPACE_ID) {
+		throw new Error("Organization action cannot run on the reserved app workspace");
+	}
+}
+
+function slugify(value: string): string {
+	return value
+		.trim()
+		.toLowerCase()
+		.replace(/[^a-z0-9]+/g, "-")
+		.replace(/^-+|-+$/g, "");
+}
+
+function personalWorkspaceId(login: string): string {
+	return `personal-${slugify(login)}`;
+}
+
+function organizationWorkspaceId(kind: FoundryOrganization["kind"], login: string): string {
+	return kind === "personal" ?
personalWorkspaceId(login) : slugify(login);
+}
+
+function hasRepoScope(scopes: string[]): boolean {
+	return scopes.some((scope) => scope === "repo" || scope.startsWith("repo:"));
+}
+
+function parseEligibleOrganizationIds(value: string): string[] {
+	try {
+		const parsed = JSON.parse(value);
+		if (!Array.isArray(parsed)) {
+			return [];
+		}
+		return parsed.filter((entry): entry is string => typeof entry === "string" && entry.length > 0);
+	} catch {
+		return [];
+	}
+}
+
+function encodeEligibleOrganizationIds(value: string[]): string {
+	return JSON.stringify([...new Set(value)]);
+}
+
+function errorMessage(error: unknown): string {
+	return error instanceof Error ? error.message : String(error);
+}
+
+function seatsIncludedForPlan(planId: FoundryBillingPlanId): number {
+	switch (planId) {
+		case "free":
+			return 1;
+		case "team":
+			return 5;
+	}
+}
+
+function stripeStatusToBillingStatus(stripeStatus: string, cancelAtPeriodEnd: boolean): FoundryBillingState["status"] {
+	if (cancelAtPeriodEnd) {
+		return "scheduled_cancel";
+	}
+	if (stripeStatus === "trialing") {
+		return "trialing";
+	}
+	if (stripeStatus === "past_due" || stripeStatus === "unpaid" || stripeStatus === "incomplete") {
+		return "past_due";
+	}
+	return "active";
+}
+
+function formatUnixDate(value: number): string {
+	return new Date(value * 1000).toISOString().slice(0, 10);
+}
+
+function legacyRepoImportStatusToGithubSyncStatus(value: string | null | undefined): FoundryOrganization["github"]["syncStatus"] {
+	switch (value) {
+		case "ready":
+			return "synced";
+		case "importing":
+			return "syncing";
+		default:
+			return "pending";
+	}
+}
+
+function stringFromMetadata(metadata: unknown, key: string): string | null {
+	if (!metadata || typeof metadata !== "object") {
+		return null;
+	}
+	const value = (metadata as Record<string, unknown>)[key];
+	return typeof value === "string" && value.length > 0 ? value : null;
+}
+
+function stripeWebhookSubscription(event: any) {
+	const object = event.data.object as Record<string, unknown>;
+	const items = (object.items as { data?: Array<Record<string, unknown>> } | undefined)?.data ?? [];
+	const price = items[0]?.price as Record<string, unknown> | undefined;
+	return {
+		id: typeof object.id === "string" ? object.id : "",
+		customerId: typeof object.customer === "string" ? object.customer : "",
+		priceId: typeof price?.id === "string" ? price.id : null,
+		status: typeof object.status === "string" ? object.status : "active",
+		cancelAtPeriodEnd: object.cancel_at_period_end === true,
+		currentPeriodEnd: typeof object.current_period_end === "number" ? object.current_period_end : null,
+		trialEnd: typeof object.trial_end === "number" ? object.trial_end : null,
+		defaultPaymentMethodLabel: "Payment method on file",
+	};
+}
+
+async function getOrganizationState(workspace: any) {
+	return await workspace.getOrganizationShellState({});
+}
+
+async function getOrganizationStateIfInitialized(workspace: any) {
+	return await workspace.getOrganizationShellStateIfInitialized({});
+}
+
+async function listSnapshotOrganizations(c: any, sessionId: string, organizationIds: string[]) {
+	const results = await Promise.all(
+		organizationIds.map(async (organizationId) => {
+			const organizationStartedAt = performance.now();
+			try {
+				const workspace = await getOrCreateWorkspace(c, organizationId);
+				const organizationState = await getOrganizationStateIfInitialized(workspace);
+				if (!organizationState) {
+					logger.warn(
+						{
+							sessionId,
+							workspaceId: c.state.workspaceId,
+							organizationId,
+							durationMs: roundDurationMs(organizationStartedAt),
+						},
+						"build_app_snapshot_organization_uninitialized",
+					);
+					return { organizationId, snapshot: null, status: "uninitialized" as const };
+				}
+				logger.info(
+					{
+						sessionId,
+						workspaceId: c.state.workspaceId,
+						organizationId,
+						durationMs: roundDurationMs(organizationStartedAt),
+					},
+					"build_app_snapshot_organization_completed",
+				);
+				return {
organizationId, snapshot: organizationState.snapshot, status: "ok" as const };
+			} catch (error) {
+				const message = errorMessage(error);
+				if (!message.includes("Actor not found")) {
+					logger.error(
+						{
+							sessionId,
+							workspaceId: c.state.workspaceId,
+							organizationId,
+							durationMs: roundDurationMs(organizationStartedAt),
+							errorMessage: message,
+							errorStack: error instanceof Error ? error.stack : undefined,
+						},
+						"build_app_snapshot_organization_failed",
+					);
+					throw error;
+				}
+				logger.info(
+					{
+						sessionId,
+						workspaceId: c.state.workspaceId,
+						organizationId,
+						durationMs: roundDurationMs(organizationStartedAt),
+					},
+					"build_app_snapshot_organization_missing",
+				);
+				return { organizationId, snapshot: null, status: "missing" as const };
+			}
+		}),
+	);
+
+	return {
+		organizations: results.map((result) => result.snapshot).filter((organization): organization is FoundryOrganization => organization !== null),
+		uninitializedOrganizationIds: results.filter((result) => result.status === "uninitialized").map((result) => result.organizationId),
+	};
+}
+
+async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepair = true): Promise<FoundryAppSnapshot> {
+	assertAppWorkspace(c);
+	const startedAt = performance.now();
+	const auth = getBetterAuthService();
+	let authState = await auth.getAuthState(sessionId);
+	// Inline fallback: if the user is signed in but has no eligible organizations yet
+	// (e.g. first load after OAuth callback), sync GitHub orgs before building the snapshot.
+	if (authState?.user && parseEligibleOrganizationIds(authState.profile?.eligibleOrganizationIdsJson ??
"[]").length === 0) { + const token = await auth.getAccessTokenForSession(sessionId); + if (token?.accessToken) { + logger.info({ sessionId }, "build_app_snapshot_sync_orgs"); + await syncGithubOrganizations(c, { sessionId, accessToken: token.accessToken }); + authState = await auth.getAuthState(sessionId); + } else { + logger.warn({ sessionId }, "build_app_snapshot_no_access_token"); + } + } + + const session = authState?.session ?? null; + const user = authState?.user ?? null; + const profile = authState?.profile ?? null; + const currentSessionState = authState?.sessionState ?? null; + const githubAccount = authState?.accounts?.find((account: any) => account.providerId === "github") ?? null; + const eligibleOrganizationIds = parseEligibleOrganizationIds(profile?.eligibleOrganizationIdsJson ?? "[]"); + + logger.info( + { + sessionId, + workspaceId: c.state.workspaceId, + eligibleOrganizationCount: eligibleOrganizationIds.length, + eligibleOrganizationIds, + }, + "build_app_snapshot_started", + ); + + let { organizations, uninitializedOrganizationIds } = await listSnapshotOrganizations(c, sessionId, eligibleOrganizationIds); + + if (allowOrganizationRepair && uninitializedOrganizationIds.length > 0) { + const token = await auth.getAccessTokenForSession(sessionId); + if (token?.accessToken) { + logger.info( + { + sessionId, + workspaceId: c.state.workspaceId, + organizationIds: uninitializedOrganizationIds, + }, + "build_app_snapshot_repairing_organizations", + ); + await syncGithubOrganizationsInternal(c, { sessionId, accessToken: token.accessToken }, { broadcast: false }); + return await buildAppSnapshot(c, sessionId, false); + } + logger.warn( + { + sessionId, + workspaceId: c.state.workspaceId, + organizationIds: uninitializedOrganizationIds, + }, + "build_app_snapshot_repair_skipped_no_access_token", + ); + } + + const currentUser: FoundryUser | null = user + ? { + id: profile?.githubAccountId ?? githubAccount?.accountId ?? 
user.id, + name: user.name, + email: user.email, + githubLogin: profile?.githubLogin ?? "", + roleLabel: profile?.roleLabel ?? "GitHub user", + eligibleOrganizationIds, + } + : null; + + const activeOrganizationId = + currentUser && + currentSessionState?.activeOrganizationId && + organizations.some((organization) => organization.id === currentSessionState.activeOrganizationId) + ? currentSessionState.activeOrganizationId + : currentUser && organizations.length === 1 + ? (organizations[0]?.id ?? null) + : null; + + const snapshot: FoundryAppSnapshot = { + auth: { + status: currentUser ? "signed_in" : "signed_out", + currentUserId: currentUser?.id ?? null, + }, + activeOrganizationId, + onboarding: { + starterRepo: { + repoFullName: "rivet-dev/sandbox-agent", + repoUrl: "https://github.com/rivet-dev/sandbox-agent", + status: profile?.starterRepoStatus ?? "pending", + starredAt: profile?.starterRepoStarredAt ?? null, + skippedAt: profile?.starterRepoSkippedAt ?? null, + }, + }, + users: currentUser ? [currentUser] : [], + organizations, + }; + + logger.info( + { + sessionId, + workspaceId: c.state.workspaceId, + eligibleOrganizationCount: eligibleOrganizationIds.length, + organizationCount: organizations.length, + durationMs: roundDurationMs(startedAt), + }, + "build_app_snapshot_completed", + ); + + return snapshot; +} + +async function requireSignedInSession(c: any, sessionId: string) { + const auth = getBetterAuthService(); + const authState = await auth.getAuthState(sessionId); + const user = authState?.user ?? null; + const profile = authState?.profile ?? null; + const githubAccount = authState?.accounts?.find((account: any) => account.providerId === "github") ?? null; + if (!authState?.session || !user?.email) { + throw new Error("User must be signed in"); + } + const token = await auth.getAccessTokenForSession(sessionId); + return { + ...authState.session, + authUserId: user.id, + currentUserId: profile?.githubAccountId ?? githubAccount?.accountId ?? 
user.id, + currentUserName: user.name, + currentUserEmail: user.email, + currentUserGithubLogin: profile?.githubLogin ?? "", + currentUserRoleLabel: profile?.roleLabel ?? "GitHub user", + eligibleOrganizationIdsJson: profile?.eligibleOrganizationIdsJson ?? "[]", + githubAccessToken: token?.accessToken ?? null, + githubScope: (token?.scopes ?? []).join(","), + starterRepoStatus: profile?.starterRepoStatus ?? "pending", + starterRepoStarredAt: profile?.starterRepoStarredAt ?? null, + starterRepoSkippedAt: profile?.starterRepoSkippedAt ?? null, + }; +} + +function requireEligibleOrganization(session: any, organizationId: string): void { + const eligibleOrganizationIds = parseEligibleOrganizationIds(session.eligibleOrganizationIdsJson); + if (!eligibleOrganizationIds.includes(organizationId)) { + throw new Error(`Organization ${organizationId} is not available in this app session`); + } +} + +async function upsertStripeLookupEntries(c: any, organizationId: string, customerId: string | null, subscriptionId: string | null): Promise { + assertAppWorkspace(c); + const now = Date.now(); + for (const lookupKey of [customerId ? `customer:${customerId}` : null, subscriptionId ? `subscription:${subscriptionId}` : null]) { + if (!lookupKey) { + continue; + } + await c.db + .insert(stripeLookup) + .values({ + lookupKey, + organizationId, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: stripeLookup.lookupKey, + set: { + organizationId, + updatedAt: now, + }, + }) + .run(); + } +} + +async function findOrganizationIdForStripeEvent(c: any, customerId: string | null, subscriptionId: string | null): Promise { + assertAppWorkspace(c); + const customerLookup = customerId + ? await c.db + .select({ organizationId: stripeLookup.organizationId }) + .from(stripeLookup) + .where(eq(stripeLookup.lookupKey, `customer:${customerId}`)) + .get() + : null; + if (customerLookup?.organizationId) { + return customerLookup.organizationId; + } + + const subscriptionLookup = subscriptionId + ? 
await c.db + .select({ organizationId: stripeLookup.organizationId }) + .from(stripeLookup) + .where(eq(stripeLookup.lookupKey, `subscription:${subscriptionId}`)) + .get() + : null; + return subscriptionLookup?.organizationId ?? null; +} + +async function safeListOrganizations(accessToken: string): Promise { + const { appShell } = getActorRuntimeContext(); + try { + return await appShell.github.listOrganizations(accessToken); + } catch (error) { + if (error instanceof GitHubAppError && error.status === 403) { + return []; + } + throw error; + } +} + +async function safeListInstallations(accessToken: string): Promise { + const { appShell } = getActorRuntimeContext(); + try { + return await appShell.github.listInstallations(accessToken); + } catch (error) { + if (error instanceof GitHubAppError && (error.status === 403 || error.status === 404)) { + return []; + } + throw error; + } +} + +/** + * Slow path: list GitHub orgs + installations, sync each org workspace, + * and update the session's eligible organization list. Called from the + * workflow queue so it runs in the background after the callback has + * already returned a redirect to the browser. 
+ */ +export async function syncGithubOrganizations(c: any, input: { sessionId: string; accessToken: string }): Promise { + await syncGithubOrganizationsInternal(c, input, { broadcast: true }); +} + +async function syncGithubOrganizationsInternal(c: any, input: { sessionId: string; accessToken: string }, options: { broadcast: boolean }): Promise { + assertAppWorkspace(c); + const auth = getBetterAuthService(); + const { appShell } = getActorRuntimeContext(); + const { sessionId, accessToken } = input; + const authState = await auth.getAuthState(sessionId); + if (!authState?.user) { + throw new Error("User must be signed in"); + } + const viewer = await appShell.github.getViewer(accessToken); + const organizations = await safeListOrganizations(accessToken); + const installations = await safeListInstallations(accessToken); + const authUserId = authState.user.id; + const githubUserId = String(viewer.id); + + const linkedOrganizationIds: string[] = []; + const accounts = [ + { + githubAccountId: viewer.id, + githubLogin: viewer.login, + githubAccountType: "User", + kind: "personal" as const, + displayName: viewer.name || viewer.login, + }, + ...organizations.map((organization) => ({ + githubAccountId: organization.id, + githubLogin: organization.login, + githubAccountType: "Organization", + kind: "organization" as const, + displayName: organization.name || organization.login, + })), + ]; + + for (const account of accounts) { + const organizationId = organizationWorkspaceId(account.kind, account.githubLogin); + const installation = installations.find((candidate) => candidate.accountLogin === account.githubLogin) ?? null; + const workspace = await getOrCreateWorkspace(c, organizationId); + await workspace.syncOrganizationShellFromGithub({ + userId: githubUserId, + userName: viewer.name || viewer.login, + userEmail: viewer.email ?? 
`${viewer.login}@users.noreply.github.com`, + githubUserLogin: viewer.login, + githubAccountId: account.githubAccountId, + githubLogin: account.githubLogin, + githubAccountType: account.githubAccountType, + kind: account.kind, + displayName: account.displayName, + installationId: installation?.id ?? null, + appConfigured: appShell.github.isAppConfigured(), + }); + linkedOrganizationIds.push(organizationId); + } + + const activeOrganizationId = + authState.sessionState?.activeOrganizationId && linkedOrganizationIds.includes(authState.sessionState.activeOrganizationId) + ? authState.sessionState.activeOrganizationId + : linkedOrganizationIds.length === 1 + ? (linkedOrganizationIds[0] ?? null) + : null; + + await auth.setActiveOrganization(sessionId, activeOrganizationId); + await auth.upsertUserProfile(authUserId, { + githubAccountId: String(viewer.id), + githubLogin: viewer.login, + roleLabel: "GitHub user", + eligibleOrganizationIdsJson: encodeEligibleOrganizationIds(linkedOrganizationIds), + }); + if (!options.broadcast) { + return; + } + c.broadcast("appUpdated", { + type: "appUpdated", + snapshot: await buildAppSnapshot(c, sessionId), + }); +} + +export async function syncGithubOrganizationRepos(c: any, input: { sessionId: string; organizationId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + + try { + let repositories; + let installationStatus = organization.snapshot.github.installationStatus; + + if (organization.snapshot.kind === "personal") { + repositories = await appShell.github.listUserRepositories(session.githubAccessToken); + installationStatus = "connected"; + } else if (organization.githubInstallationId) { + try { + repositories = await 
appShell.github.listInstallationRepositories(organization.githubInstallationId); + } catch (error) { + if (!(error instanceof GitHubAppError) || (error.status !== 403 && error.status !== 404)) { + throw error; + } + repositories = (await appShell.github.listUserRepositories(session.githubAccessToken)).filter((repository) => + repository.fullName.startsWith(`${organization.githubLogin}/`), + ); + installationStatus = "reconnect_required"; + } + } else { + repositories = (await appShell.github.listUserRepositories(session.githubAccessToken)).filter((repository) => + repository.fullName.startsWith(`${organization.githubLogin}/`), + ); + installationStatus = "reconnect_required"; + } + + await workspace.applyOrganizationSyncCompleted({ + repositories, + installationStatus, + lastSyncLabel: repositories.length > 0 ? "Synced just now" : "No repositories available", + }); + + // Broadcast updated app snapshot so connected clients see the new repos + c.broadcast("appUpdated", { + type: "appUpdated", + snapshot: await buildAppSnapshot(c, input.sessionId), + }); + } catch (error) { + const installationStatus = + error instanceof GitHubAppError && (error.status === 403 || error.status === 404) + ? "reconnect_required" + : organization.snapshot.github.installationStatus; + await workspace.markOrganizationSyncFailed({ + message: error instanceof Error ? 
error.message : "GitHub import failed", + installationStatus, + }); + + // Broadcast sync failure so the client updates status + c.broadcast("appUpdated", { + type: "appUpdated", + snapshot: await buildAppSnapshot(c, input.sessionId), + }); + } +} + +async function readOrganizationProfileRow(c: any) { + assertOrganizationWorkspace(c); + return await c.db.select().from(organizationProfile).where(eq(organizationProfile.id, PROFILE_ROW_ID)).get(); +} + +async function requireOrganizationProfileRow(c: any) { + const row = await readOrganizationProfileRow(c); + if (!row) { + throw new Error(`Organization profile is not initialized for workspace ${c.state.workspaceId}`); + } + return row; +} + +async function listOrganizationMembers(c: any): Promise { + assertOrganizationWorkspace(c); + const rows = await c.db.select().from(organizationMembers).orderBy(organizationMembers.role, organizationMembers.name).all(); + return rows.map((row) => ({ + id: row.id, + name: row.name, + email: row.email, + role: row.role, + state: row.state, + })); +} + +async function listOrganizationSeatAssignments(c: any): Promise { + assertOrganizationWorkspace(c); + const rows = await c.db.select({ email: seatAssignments.email }).from(seatAssignments).orderBy(seatAssignments.email).all(); + return rows.map((row) => row.email); +} + +async function listOrganizationInvoices(c: any): Promise { + assertOrganizationWorkspace(c); + const rows = await c.db.select().from(invoices).orderBy(desc(invoices.issuedAt), desc(invoices.createdAt)).all(); + return rows.map((row) => ({ + id: row.id, + label: row.label, + issuedAt: row.issuedAt, + amountUsd: row.amountUsd, + status: row.status, + })); +} + +async function listOrganizationRepoCatalog(c: any): Promise { + assertOrganizationWorkspace(c); + const rows = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all(); + return rows.map((row) => repoLabelFromRemote(row.remoteUrl)).sort((left, right) => 
left.localeCompare(right)); +} + +async function buildOrganizationState(c: any) { + const startedAt = performance.now(); + const row = await requireOrganizationProfileRow(c); + return await buildOrganizationStateFromRow(c, row, startedAt); +} + +async function buildOrganizationStateIfInitialized(c: any) { + const startedAt = performance.now(); + const row = await readOrganizationProfileRow(c); + if (!row) { + return null; + } + return await buildOrganizationStateFromRow(c, row, startedAt); +} + +async function buildOrganizationStateFromRow(c: any, row: any, startedAt: number) { + const repoCatalog = await listOrganizationRepoCatalog(c); + const members = await listOrganizationMembers(c); + const seatAssignmentEmails = await listOrganizationSeatAssignments(c); + const invoiceRows = await listOrganizationInvoices(c); + + const state = { + id: c.state.workspaceId, + workspaceId: c.state.workspaceId, + kind: row.kind, + githubLogin: row.githubLogin, + githubInstallationId: row.githubInstallationId ?? null, + stripeCustomerId: row.stripeCustomerId ?? null, + stripeSubscriptionId: row.stripeSubscriptionId ?? null, + stripePriceId: row.stripePriceId ?? null, + billingPlanId: row.billingPlanId, + snapshot: { + id: c.state.workspaceId, + workspaceId: c.state.workspaceId, + kind: row.kind, + settings: { + displayName: row.displayName, + slug: row.slug, + primaryDomain: row.primaryDomain, + seatAccrualMode: "first_prompt", + defaultModel: row.defaultModel, + autoImportRepos: row.autoImportRepos === 1, + }, + github: { + connectedAccount: row.githubConnectedAccount, + installationStatus: row.githubInstallationStatus, + syncStatus: row.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(row.repoImportStatus), + importedRepoCount: repoCatalog.length, + lastSyncLabel: row.githubLastSyncLabel, + lastSyncAt: row.githubLastSyncAt ?? 
null, + }, + billing: { + planId: row.billingPlanId, + status: row.billingStatus, + seatsIncluded: row.billingSeatsIncluded, + trialEndsAt: row.billingTrialEndsAt, + renewalAt: row.billingRenewalAt, + stripeCustomerId: row.stripeCustomerId ?? "", + paymentMethodLabel: row.billingPaymentMethodLabel, + invoices: invoiceRows, + }, + members, + seatAssignments: seatAssignmentEmails, + repoCatalog, + }, + }; + + logger.info( + { + workspaceId: c.state.workspaceId, + githubLogin: row.githubLogin, + repoCount: repoCatalog.length, + memberCount: members.length, + seatAssignmentCount: seatAssignmentEmails.length, + invoiceCount: invoiceRows.length, + durationMs: roundDurationMs(startedAt), + }, + "build_organization_state_completed", + ); + + return state; +} + +async function applySubscriptionState( + workspace: any, + subscription: { + id: string; + customerId: string; + priceId: string | null; + status: string; + cancelAtPeriodEnd: boolean; + currentPeriodEnd: number | null; + trialEnd: number | null; + defaultPaymentMethodLabel: string; + }, + fallbackPlanId: FoundryBillingPlanId, +): Promise { + await workspace.applyOrganizationStripeSubscription({ + subscription, + fallbackPlanId, + }); +} + +export const workspaceAppActions = { + async authFindSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { + assertAppWorkspace(c); + + const clauses = [ + ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), + ...(input.sessionToken ? 
[{ field: "sessionToken", value: input.sessionToken }] : []), + ]; + if (clauses.length === 0) { + return null; + } + const predicate = workspaceAuthWhere(authSessionIndex, clauses); + return await c.db.select().from(authSessionIndex).where(predicate!).get(); + }, + + async authUpsertSessionIndex(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { + assertAppWorkspace(c); + + const now = Date.now(); + await c.db + .insert(authSessionIndex) + .values({ + sessionId: input.sessionId, + sessionToken: input.sessionToken, + userId: input.userId, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: authSessionIndex.sessionId, + set: { + sessionToken: input.sessionToken, + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authSessionIndex).where(eq(authSessionIndex.sessionId, input.sessionId)).get(); + }, + + async authDeleteSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { + assertAppWorkspace(c); + + const clauses = [ + ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), + ...(input.sessionToken ? 
[{ field: "sessionToken", value: input.sessionToken }] : []), + ]; + if (clauses.length === 0) { + return; + } + const predicate = workspaceAuthWhere(authSessionIndex, clauses); + await c.db.delete(authSessionIndex).where(predicate!).run(); + }, + + async authFindEmailIndex(c: any, input: { email: string }) { + assertAppWorkspace(c); + + return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); + }, + + async authUpsertEmailIndex(c: any, input: { email: string; userId: string }) { + assertAppWorkspace(c); + + const now = Date.now(); + await c.db + .insert(authEmailIndex) + .values({ + email: input.email, + userId: input.userId, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: authEmailIndex.email, + set: { + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); + }, + + async authDeleteEmailIndex(c: any, input: { email: string }) { + assertAppWorkspace(c); + + await c.db.delete(authEmailIndex).where(eq(authEmailIndex.email, input.email)).run(); + }, + + async authFindAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { + assertAppWorkspace(c); + + if (input.id) { + return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); + } + if (!input.providerId || !input.accountId) { + return null; + } + return await c.db + .select() + .from(authAccountIndex) + .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) + .get(); + }, + + async authUpsertAccountIndex(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) { + assertAppWorkspace(c); + + const now = Date.now(); + await c.db + .insert(authAccountIndex) + .values({ + id: input.id, + providerId: input.providerId, + accountId: input.accountId, + userId: input.userId, + updatedAt: now, + }) + 
.onConflictDoUpdate({ + target: authAccountIndex.id, + set: { + providerId: input.providerId, + accountId: input.accountId, + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); + }, + + async authDeleteAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { + assertAppWorkspace(c); + + if (input.id) { + await c.db.delete(authAccountIndex).where(eq(authAccountIndex.id, input.id)).run(); + return; + } + if (input.providerId && input.accountId) { + await c.db + .delete(authAccountIndex) + .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) + .run(); + } + }, + + async authCreateVerification(c: any, input: { data: Record }) { + assertAppWorkspace(c); + + await c.db + .insert(authVerification) + .values(input.data as any) + .run(); + return await c.db + .select() + .from(authVerification) + .where(eq(authVerification.id, input.data.id as string)) + .get(); + }, + + async authFindOneVerification(c: any, input: { where: any[] }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + return predicate ? await c.db.select().from(authVerification).where(predicate).get() : null; + }, + + async authFindManyVerification(c: any, input: { where?: any[]; limit?: number; sortBy?: any; offset?: number }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + let query = c.db.select().from(authVerification); + if (predicate) { + query = query.where(predicate); + } + if (input.sortBy?.field) { + const column = workspaceAuthColumn(authVerification, input.sortBy.field); + query = query.orderBy(input.sortBy.direction === "asc" ? 
asc(column) : desc(column)); + } + if (typeof input.limit === "number") { + query = query.limit(input.limit); + } + if (typeof input.offset === "number") { + query = query.offset(input.offset); + } + return await query.all(); + }, + + async authUpdateVerification(c: any, input: { where: any[]; update: Record }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + if (!predicate) { + return null; + } + await c.db + .update(authVerification) + .set(input.update as any) + .where(predicate) + .run(); + return await c.db.select().from(authVerification).where(predicate).get(); + }, + + async authUpdateManyVerification(c: any, input: { where: any[]; update: Record }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + if (!predicate) { + return 0; + } + await c.db + .update(authVerification) + .set(input.update as any) + .where(predicate) + .run(); + const row = await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get(); + return row?.value ?? 0; + }, + + async authDeleteVerification(c: any, input: { where: any[] }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + if (!predicate) { + return; + } + await c.db.delete(authVerification).where(predicate).run(); + }, + + async authDeleteManyVerification(c: any, input: { where: any[] }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + if (!predicate) { + return 0; + } + const rows = await c.db.select().from(authVerification).where(predicate).all(); + await c.db.delete(authVerification).where(predicate).run(); + return rows.length; + }, + + async authCountVerification(c: any, input: { where?: any[] }) { + assertAppWorkspace(c); + + const predicate = workspaceAuthWhere(authVerification, input.where); + const row = predicate + ? 
await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get() + : await c.db.select({ value: sqlCount() }).from(authVerification).get(); + return row?.value ?? 0; + }, + + async getAppSnapshot(c: any, input: { sessionId: string }): Promise { + return await buildAppSnapshot(c, input.sessionId); + }, + + async resolveAppGithubToken( + c: any, + input: { organizationId: string; requireRepoScope?: boolean }, + ): Promise<{ accessToken: string; scopes: string[] } | null> { + assertAppWorkspace(c); + const auth = getBetterAuthService(); + const rows = await c.db.select().from(authSessionIndex).orderBy(desc(authSessionIndex.updatedAt)).all(); + + for (const row of rows) { + const authState = await auth.getAuthState(row.sessionId); + if (authState?.sessionState?.activeOrganizationId !== input.organizationId) { + continue; + } + + const token = await auth.getAccessTokenForSession(row.sessionId); + if (!token?.accessToken) { + continue; + } + + const scopes = token.scopes; + if (input.requireRepoScope !== false && scopes.length > 0 && !hasRepoScope(scopes)) { + continue; + } + + return { + accessToken: token.accessToken, + scopes, + }; + } + + return null; + }, + + async skipAppStarterRepo(c: any, input: { sessionId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + await getBetterAuthService().upsertUserProfile(session.authUserId, { + starterRepoStatus: "skipped", + starterRepoSkippedAt: Date.now(), + starterRepoStarredAt: null, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async starAppStarterRepo(c: any, input: { sessionId: string; organizationId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + await workspace.starSandboxAgentRepo({ + workspaceId: 
input.organizationId, + }); + await getBetterAuthService().upsertUserProfile(session.authUserId, { + starterRepoStatus: "starred", + starterRepoStarredAt: Date.now(), + starterRepoSkippedAt: null, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async selectAppOrganization(c: any, input: { sessionId: string; organizationId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + await getBetterAuthService().setActiveOrganization(input.sessionId, input.organizationId); + + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + if (organization.snapshot.github.syncStatus !== "synced") { + if (organization.snapshot.github.syncStatus !== "syncing") { + await workspace.markOrganizationSyncStarted({ + label: "Importing repository catalog...", + }); + + const self = selfWorkspace(c); + await self.send( + "workspace.command.syncGithubOrganizationRepos", + { sessionId: input.sessionId, organizationId: input.organizationId }, + { + wait: false, + }, + ); + } + + return await buildAppSnapshot(c, input.sessionId); + } + return await buildAppSnapshot(c, input.sessionId); + }, + + async updateAppOrganizationProfile( + c: any, + input: { sessionId: string; organizationId: string } & UpdateFoundryOrganizationProfileInput, + ): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + await workspace.updateOrganizationShellProfile({ + displayName: input.displayName, + slug: input.slug, + primaryDomain: input.primaryDomain, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async triggerAppRepoImport(c: any, input: { sessionId: string; organizationId: string }): Promise 
{ + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + if (organization.snapshot.github.syncStatus === "syncing") { + return await buildAppSnapshot(c, input.sessionId); + } + + await workspace.markOrganizationSyncStarted({ + label: "Importing repository catalog...", + }); + + const self = selfWorkspace(c); + await self.send( + "workspace.command.syncGithubOrganizationRepos", + { sessionId: input.sessionId, organizationId: input.organizationId }, + { + wait: false, + }, + ); + + return await buildAppSnapshot(c, input.sessionId); + }, + + async beginAppGithubInstall(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + if (organization.snapshot.kind !== "organization") { + return { + url: `${appShell.appUrl}/workspaces/${input.organizationId}`, + }; + } + return { + url: await appShell.github.buildInstallationUrl(organization.githubLogin, randomUUID()), + }; + }, + + async createAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; planId: FoundryBillingPlanId }): Promise<{ url: string }> { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + + if 
(input.planId === "free") { + await workspace.applyOrganizationFreePlan({ clearSubscription: false }); + return { + url: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, + }; + } + + if (!appShell.stripe.isConfigured()) { + throw new Error("Stripe is not configured"); + } + + let customerId = organization.stripeCustomerId; + if (!customerId) { + customerId = ( + await appShell.stripe.createCustomer({ + organizationId: input.organizationId, + displayName: organization.snapshot.settings.displayName, + email: session.currentUserEmail, + }) + ).id; + await workspace.applyOrganizationStripeCustomer({ customerId }); + await upsertStripeLookupEntries(c, input.organizationId, customerId, null); + } + + return { + url: await appShell.stripe + .createCheckoutSession({ + organizationId: input.organizationId, + customerId, + customerEmail: session.currentUserEmail, + planId: input.planId, + successUrl: `${appShell.apiUrl}/v1/billing/checkout/complete?organizationId=${encodeURIComponent( + input.organizationId, + )}&session_id={CHECKOUT_SESSION_ID}`, + cancelUrl: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, + }) + .then((checkout) => checkout.url), + }; + }, + + async finalizeAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; checkoutSessionId: string }): Promise<{ redirectTo: string }> { + assertAppWorkspace(c); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + const completion = await appShell.stripe.retrieveCheckoutCompletion(input.checkoutSessionId); + + if (completion.customerId) { + await workspace.applyOrganizationStripeCustomer({ customerId: completion.customerId }); + } + await upsertStripeLookupEntries(c, input.organizationId, completion.customerId, completion.subscriptionId); + + if (completion.subscriptionId) { + const subscription = await 
appShell.stripe.retrieveSubscription(completion.subscriptionId); + await applySubscriptionState(workspace, subscription, completion.planId ?? organization.billingPlanId); + } + + if (completion.paymentMethodLabel) { + await workspace.setOrganizationBillingPaymentMethod({ + label: completion.paymentMethodLabel, + }); + } + + return { + redirectTo: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, + }; + }, + + async createAppBillingPortalSession(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + if (!organization.stripeCustomerId) { + throw new Error("Stripe customer is not available for this organization"); + } + const portal = await appShell.stripe.createPortalSession({ + customerId: organization.stripeCustomerId, + returnUrl: `${appShell.appUrl}/organizations/${input.organizationId}/billing`, + }); + return { url: portal.url }; + }, + + async cancelAppScheduledRenewal(c: any, input: { sessionId: string; organizationId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + + if (organization.stripeSubscriptionId && appShell.stripe.isConfigured()) { + const subscription = await appShell.stripe.updateSubscriptionCancellation(organization.stripeSubscriptionId, true); + await applySubscriptionState(workspace, subscription, organization.billingPlanId); + await 
upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? organization.stripeCustomerId, subscription.id); + } else { + await workspace.setOrganizationBillingStatus({ status: "scheduled_cancel" }); + } + + return await buildAppSnapshot(c, input.sessionId); + }, + + async resumeAppSubscription(c: any, input: { sessionId: string; organizationId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const workspace = await getOrCreateWorkspace(c, input.organizationId); + const organization = await getOrganizationState(workspace); + + if (organization.stripeSubscriptionId && appShell.stripe.isConfigured()) { + const subscription = await appShell.stripe.updateSubscriptionCancellation(organization.stripeSubscriptionId, false); + await applySubscriptionState(workspace, subscription, organization.billingPlanId); + await upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? 
organization.stripeCustomerId, subscription.id); + } else { + await workspace.setOrganizationBillingStatus({ status: "active" }); + } + + return await buildAppSnapshot(c, input.sessionId); + }, + + async recordAppSeatUsage(c: any, input: { sessionId: string; workspaceId: string }): Promise { + assertAppWorkspace(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.workspaceId); + const workspace = await getOrCreateWorkspace(c, input.workspaceId); + await workspace.recordOrganizationSeatUsage({ + email: session.currentUserEmail, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async handleAppStripeWebhook(c: any, input: { payload: string; signatureHeader: string | null }): Promise<{ ok: true }> { + assertAppWorkspace(c); + const { appShell } = getActorRuntimeContext(); + const event = appShell.stripe.verifyWebhookEvent(input.payload, input.signatureHeader); + + if (event.type === "checkout.session.completed") { + const object = event.data.object as Record; + const organizationId = + stringFromMetadata(object.metadata, "organizationId") ?? + (await findOrganizationIdForStripeEvent( + c, + typeof object.customer === "string" ? object.customer : null, + typeof object.subscription === "string" ? object.subscription : null, + )); + if (organizationId) { + const workspace = await getOrCreateWorkspace(c, organizationId); + if (typeof object.customer === "string") { + await workspace.applyOrganizationStripeCustomer({ customerId: object.customer }); + } + await upsertStripeLookupEntries( + c, + organizationId, + typeof object.customer === "string" ? object.customer : null, + typeof object.subscription === "string" ? 
object.subscription : null, + ); + } + return { ok: true }; + } + + if (event.type === "customer.subscription.updated" || event.type === "customer.subscription.created") { + const subscription = stripeWebhookSubscription(event); + const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id); + if (organizationId) { + const workspace = await getOrCreateWorkspace(c, organizationId); + const organization = await getOrganizationState(workspace); + await applySubscriptionState(workspace, subscription, appShell.stripe.planIdForPriceId(subscription.priceId ?? "") ?? organization.billingPlanId); + await upsertStripeLookupEntries(c, organizationId, subscription.customerId, subscription.id); + } + return { ok: true }; + } + + if (event.type === "customer.subscription.deleted") { + const subscription = stripeWebhookSubscription(event); + const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id); + if (organizationId) { + const workspace = await getOrCreateWorkspace(c, organizationId); + await workspace.applyOrganizationFreePlan({ clearSubscription: true }); + } + return { ok: true }; + } + + if (event.type === "invoice.paid" || event.type === "invoice.payment_failed") { + const invoice = event.data.object as Record; + const organizationId = await findOrganizationIdForStripeEvent(c, typeof invoice.customer === "string" ? invoice.customer : null, null); + if (organizationId) { + const workspace = await getOrCreateWorkspace(c, organizationId); + const rawAmount = typeof invoice.amount_paid === "number" ? invoice.amount_paid : invoice.amount_due; + const amountUsd = Math.round((typeof rawAmount === "number" ? rawAmount : 0) / 100); + await workspace.upsertOrganizationInvoice({ + id: String(invoice.id), + label: typeof invoice.number === "string" ? `Invoice ${invoice.number}` : "Stripe invoice", + issuedAt: formatUnixDate(typeof invoice.created === "number" ? 
invoice.created : Math.floor(Date.now() / 1000)), + amountUsd: Number.isFinite(amountUsd) ? amountUsd : 0, + status: event.type === "invoice.paid" ? "paid" : "open", + }); + } + } + + return { ok: true }; + }, + + async handleAppGithubWebhook(c: any, input: { payload: string; signatureHeader: string | null; eventHeader: string | null }): Promise<{ ok: true }> { + assertAppWorkspace(c); + const { appShell } = getActorRuntimeContext(); + const { event, body } = appShell.github.verifyWebhookEvent(input.payload, input.signatureHeader, input.eventHeader); + + const accountLogin = body.installation?.account?.login; + const accountType = body.installation?.account?.type; + if (!accountLogin) { + githubWebhookLogger.info( + { + event, + action: body.action ?? null, + reason: "missing_installation_account", + }, + "ignored", + ); + return { ok: true }; + } + + const kind: FoundryOrganization["kind"] = accountType === "User" ? "personal" : "organization"; + const organizationId = organizationWorkspaceId(kind, accountLogin); + + if (event === "installation" && (body.action === "created" || body.action === "deleted" || body.action === "suspend" || body.action === "unsuspend")) { + githubWebhookLogger.info( + { + event, + action: body.action, + accountLogin, + organizationId, + }, + "installation_event", + ); + if (body.action === "deleted") { + const workspace = await getOrCreateWorkspace(c, organizationId); + await workspace.applyGithubInstallationRemoved({}); + } else if (body.action === "created") { + const workspace = await getOrCreateWorkspace(c, organizationId); + await workspace.applyGithubInstallationCreated({ + installationId: body.installation?.id ?? 0, + }); + } + return { ok: true }; + } + + if (event === "installation_repositories") { + githubWebhookLogger.info( + { + event, + action: body.action ?? null, + accountLogin, + organizationId, + repositoriesAdded: body.repositories_added?.length ?? 0, + repositoriesRemoved: body.repositories_removed?.length ?? 
0, + }, + "repository_membership_changed", + ); + const workspace = await getOrCreateWorkspace(c, organizationId); + await workspace.applyGithubRepositoryChanges({ + added: (body.repositories_added ?? []).map((r) => ({ + fullName: r.full_name, + private: r.private, + })), + removed: (body.repositories_removed ?? []).map((r) => r.full_name), + }); + return { ok: true }; + } + + if ( + event === "push" || + event === "pull_request" || + event === "pull_request_review" || + event === "pull_request_review_comment" || + event === "check_run" || + event === "check_suite" || + event === "status" || + event === "create" || + event === "delete" + ) { + const repoFullName = body.repository?.full_name; + if (repoFullName) { + githubWebhookLogger.info( + { + event, + action: body.action ?? null, + accountLogin, + organizationId, + repoFullName, + }, + "repository_event", + ); + // TODO: Dispatch to GitHubStateActor / downstream actors + } + return { ok: true }; + } + + githubWebhookLogger.info( + { + event, + action: body.action ?? null, + accountLogin, + organizationId, + }, + "unhandled_event", + ); + return { ok: true }; + }, + + async syncOrganizationShellFromGithub( + c: any, + input: { + userId: string; + userName: string; + userEmail: string; + githubUserLogin: string; + githubAccountId: string; + githubLogin: string; + githubAccountType: string; + kind: FoundryOrganization["kind"]; + displayName: string; + installationId: number | null; + appConfigured: boolean; + }, + ): Promise<{ organizationId: string }> { + assertOrganizationWorkspace(c); + const now = Date.now(); + const existing = await readOrganizationProfileRow(c); + const slug = existing?.slug ?? 
slugify(input.githubLogin); + const organizationId = organizationWorkspaceId(input.kind, input.githubLogin); + if (organizationId !== c.state.workspaceId) { + throw new Error(`Workspace actor mismatch: actor=${c.state.workspaceId} github=${organizationId}`); + } + + const installationStatus = + input.kind === "personal" ? "connected" : input.installationId ? "connected" : input.appConfigured ? "install_required" : "reconnect_required"; + const syncStatus = existing?.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(existing?.repoImportStatus); + const lastSyncLabel = + syncStatus === "synced" + ? existing.githubLastSyncLabel + : installationStatus === "connected" + ? "Waiting for first import" + : installationStatus === "install_required" + ? "GitHub App installation required" + : "GitHub App configuration incomplete"; + const hasStripeBillingState = Boolean(existing?.stripeCustomerId || existing?.stripeSubscriptionId || existing?.stripePriceId); + const defaultBillingPlanId = input.kind === "personal" || !hasStripeBillingState ? "free" : (existing?.billingPlanId ?? "team"); + const defaultSeatsIncluded = input.kind === "personal" || !hasStripeBillingState ? 1 : (existing?.billingSeatsIncluded ?? 5); + const defaultPaymentMethodLabel = + input.kind === "personal" + ? "No card required" + : hasStripeBillingState + ? (existing?.billingPaymentMethodLabel ?? "Payment method on file") + : "No payment method on file"; + + await c.db + .insert(organizationProfile) + .values({ + id: PROFILE_ROW_ID, + kind: input.kind, + githubAccountId: input.githubAccountId, + githubLogin: input.githubLogin, + githubAccountType: input.githubAccountType, + displayName: input.displayName, + slug, + primaryDomain: existing?.primaryDomain ?? (input.kind === "personal" ? "personal" : `${slug}.github`), + defaultModel: existing?.defaultModel ?? "claude-sonnet-4", + autoImportRepos: existing?.autoImportRepos ?? 1, + repoImportStatus: existing?.repoImportStatus ?? 
"not_started", + githubConnectedAccount: input.githubLogin, + githubInstallationStatus: installationStatus, + githubSyncStatus: syncStatus, + githubInstallationId: input.installationId, + githubLastSyncLabel: lastSyncLabel, + githubLastSyncAt: existing?.githubLastSyncAt ?? null, + stripeCustomerId: existing?.stripeCustomerId ?? null, + stripeSubscriptionId: existing?.stripeSubscriptionId ?? null, + stripePriceId: existing?.stripePriceId ?? null, + billingPlanId: defaultBillingPlanId, + billingStatus: existing?.billingStatus ?? "active", + billingSeatsIncluded: defaultSeatsIncluded, + billingTrialEndsAt: existing?.billingTrialEndsAt ?? null, + billingRenewalAt: existing?.billingRenewalAt ?? null, + billingPaymentMethodLabel: defaultPaymentMethodLabel, + createdAt: existing?.createdAt ?? now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: organizationProfile.id, + set: { + kind: input.kind, + githubAccountId: input.githubAccountId, + githubLogin: input.githubLogin, + githubAccountType: input.githubAccountType, + displayName: input.displayName, + githubConnectedAccount: input.githubLogin, + githubInstallationStatus: installationStatus, + githubSyncStatus: syncStatus, + githubInstallationId: input.installationId, + githubLastSyncLabel: lastSyncLabel, + githubLastSyncAt: existing?.githubLastSyncAt ?? null, + billingPlanId: defaultBillingPlanId, + billingSeatsIncluded: defaultSeatsIncluded, + billingPaymentMethodLabel: defaultPaymentMethodLabel, + updatedAt: now, + }, + }) + .run(); + + await c.db + .insert(organizationMembers) + .values({ + id: input.userId, + name: input.userName, + email: input.userEmail, + role: input.kind === "personal" ? "owner" : "admin", + state: "active", + updatedAt: now, + }) + .onConflictDoUpdate({ + target: organizationMembers.id, + set: { + name: input.userName, + email: input.userEmail, + role: input.kind === "personal" ? 
"owner" : "admin", + state: "active", + updatedAt: now, + }, + }) + .run(); + + return { organizationId }; + }, + + async getOrganizationShellState(c: any): Promise { + assertOrganizationWorkspace(c); + return await buildOrganizationState(c); + }, + + async getOrganizationShellStateIfInitialized(c: any): Promise { + assertOrganizationWorkspace(c); + return await buildOrganizationStateIfInitialized(c); + }, + + async updateOrganizationShellProfile(c: any, input: Pick): Promise { + assertOrganizationWorkspace(c); + const existing = await requireOrganizationProfileRow(c); + await c.db + .update(organizationProfile) + .set({ + displayName: input.displayName.trim() || existing.displayName, + slug: input.slug.trim() || existing.slug, + primaryDomain: input.primaryDomain.trim() || existing.primaryDomain, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async markOrganizationSyncStarted(c: any, input: { label: string }): Promise { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + githubSyncStatus: "syncing", + githubLastSyncLabel: input.label, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyOrganizationSyncCompleted( + c: any, + input: { + repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>; + installationStatus: FoundryOrganization["github"]["installationStatus"]; + lastSyncLabel: string; + }, + ): Promise { + assertOrganizationWorkspace(c); + const now = Date.now(); + for (const repository of input.repositories) { + const remoteUrl = repository.cloneUrl; + await c.db + .insert(repos) + .values({ + repoId: repoIdFromRemote(remoteUrl), + remoteUrl, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: repos.repoId, + set: { + remoteUrl, + updatedAt: now, + }, + }) + .run(); + } + await c.db + .update(organizationProfile) + .set({ + githubInstallationStatus: 
input.installationStatus, + githubSyncStatus: "synced", + githubLastSyncLabel: input.lastSyncLabel, + githubLastSyncAt: now, + updatedAt: now, + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async markOrganizationSyncFailed(c: any, input: { message: string; installationStatus: FoundryOrganization["github"]["installationStatus"] }): Promise { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + githubInstallationStatus: input.installationStatus, + githubSyncStatus: "error", + githubLastSyncLabel: input.message, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyOrganizationStripeCustomer(c: any, input: { customerId: string }): Promise { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + stripeCustomerId: input.customerId, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyOrganizationStripeSubscription( + c: any, + input: { + subscription: { + id: string; + customerId: string; + priceId: string | null; + status: string; + cancelAtPeriodEnd: boolean; + currentPeriodEnd: number | null; + trialEnd: number | null; + defaultPaymentMethodLabel: string; + }; + fallbackPlanId: FoundryBillingPlanId; + }, + ): Promise { + assertOrganizationWorkspace(c); + const { appShell } = getActorRuntimeContext(); + const planId = appShell.stripe.planIdForPriceId(input.subscription.priceId ?? "") ?? input.fallbackPlanId; + await c.db + .update(organizationProfile) + .set({ + stripeCustomerId: input.subscription.customerId || null, + stripeSubscriptionId: input.subscription.id || null, + stripePriceId: input.subscription.priceId, + billingPlanId: planId, + billingStatus: stripeStatusToBillingStatus(input.subscription.status, input.subscription.cancelAtPeriodEnd), + billingSeatsIncluded: seatsIncludedForPlan(planId), + billingTrialEndsAt: input.subscription.trialEnd ? 
new Date(input.subscription.trialEnd * 1000).toISOString() : null, + billingRenewalAt: input.subscription.currentPeriodEnd ? new Date(input.subscription.currentPeriodEnd * 1000).toISOString() : null, + billingPaymentMethodLabel: input.subscription.defaultPaymentMethodLabel || "Payment method on file", + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyOrganizationFreePlan(c: any, input: { clearSubscription: boolean }): Promise { + assertOrganizationWorkspace(c); + const patch: Record = { + billingPlanId: "free", + billingStatus: "active", + billingSeatsIncluded: 1, + billingTrialEndsAt: null, + billingRenewalAt: null, + billingPaymentMethodLabel: "No card required", + updatedAt: Date.now(), + }; + if (input.clearSubscription) { + patch.stripeSubscriptionId = null; + patch.stripePriceId = null; + } + await c.db.update(organizationProfile).set(patch).where(eq(organizationProfile.id, PROFILE_ROW_ID)).run(); + }, + + async setOrganizationBillingPaymentMethod(c: any, input: { label: string }): Promise { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + billingPaymentMethodLabel: input.label, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async setOrganizationBillingStatus(c: any, input: { status: FoundryBillingState["status"] }): Promise { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + billingStatus: input.status, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async upsertOrganizationInvoice(c: any, input: { id: string; label: string; issuedAt: string; amountUsd: number; status: "paid" | "open" }): Promise { + assertOrganizationWorkspace(c); + await c.db + .insert(invoices) + .values({ + id: input.id, + label: input.label, + issuedAt: input.issuedAt, + amountUsd: input.amountUsd, + status: input.status, + createdAt: 
Date.now(), + }) + .onConflictDoUpdate({ + target: invoices.id, + set: { + label: input.label, + issuedAt: input.issuedAt, + amountUsd: input.amountUsd, + status: input.status, + }, + }) + .run(); + }, + + async recordOrganizationSeatUsage(c: any, input: { email: string }): Promise<void> { + assertOrganizationWorkspace(c); + await c.db + .insert(seatAssignments) + .values({ + email: input.email, + createdAt: Date.now(), + }) + .onConflictDoNothing() + .run(); + }, + + async applyGithubInstallationCreated(c: any, input: { installationId: number }): Promise<void> { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + githubInstallationId: input.installationId, + githubInstallationStatus: "connected", + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyGithubInstallationRemoved(c: any, _input: {}): Promise<void> { + assertOrganizationWorkspace(c); + await c.db + .update(organizationProfile) + .set({ + githubInstallationId: null, + githubInstallationStatus: "install_required", + githubSyncStatus: "pending", + githubLastSyncLabel: "GitHub App installation removed", + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, + + async applyGithubRepositoryChanges(c: any, input: { added: Array<{ fullName: string; private: boolean }>; removed: string[] }): Promise<void> { + assertOrganizationWorkspace(c); + const now = Date.now(); + + for (const repo of input.added) { + const remoteUrl = `https://github.com/${repo.fullName}.git`; + const repoId = repoIdFromRemote(remoteUrl); + await c.db + .insert(repos) + .values({ + repoId, + remoteUrl, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: repos.repoId, + set: { + remoteUrl, + updatedAt: now, + }, + }) + .run(); + } + + for (const fullName of input.removed) { + const remoteUrl = `https://github.com/${fullName}.git`; + const repoId = repoIdFromRemote(remoteUrl); + await
c.db.delete(repos).where(eq(repos.repoId, repoId)).run(); + } + + const repoCount = (await c.db.select().from(repos).all()).length; + await c.db + .update(organizationProfile) + .set({ + githubSyncStatus: "synced", + githubLastSyncLabel: `${repoCount} repositories synced`, + githubLastSyncAt: now, + updatedAt: now, + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/db/db.ts b/foundry/packages/backend/src/actors/workspace/db/db.ts similarity index 68% rename from foundry/packages/backend/src/actors/organization/db/db.ts rename to foundry/packages/backend/src/actors/workspace/db/db.ts index f7eb392..1b7c080 100644 --- a/foundry/packages/backend/src/actors/organization/db/db.ts +++ b/foundry/packages/backend/src/actors/workspace/db/db.ts @@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle"; import * as schema from "./schema.js"; import migrations from "./migrations.js"; -export const organizationDb = db({ schema, migrations }); +export const workspaceDb = db({ schema, migrations }); diff --git a/foundry/packages/backend/src/actors/workspace/db/drizzle.config.ts b/foundry/packages/backend/src/actors/workspace/db/drizzle.config.ts new file mode 100644 index 0000000..3049f40 --- /dev/null +++ b/foundry/packages/backend/src/actors/workspace/db/drizzle.config.ts @@ -0,0 +1,6 @@ +import { defineConfig } from "rivetkit/db/drizzle"; + +export default defineConfig({ + out: "./src/actors/workspace/db/drizzle", + schema: "./src/actors/workspace/db/schema.ts", +}); diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql b/foundry/packages/backend/src/actors/workspace/db/drizzle/0000_melted_viper.sql similarity index 89% rename from foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql rename to foundry/packages/backend/src/actors/workspace/db/drizzle/0000_melted_viper.sql index 80be04f..508cc74 100644 --- 
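The `applyGithubRepositoryChanges` action in this diff reconciles added/removed repositories by deriving a stable repo id from the remote URL, upserting adds on the `repoId` primary key, and deleting removals. A minimal sketch of that reconciliation as pure logic — the real `repoIdFromRemote` helper is not shown in this diff, so the slug-based derivation below is an assumption:

```typescript
// Sketch of the add/remove reconciliation in applyGithubRepositoryChanges.
interface RepoRow {
  repoId: string;
  remoteUrl: string;
}

// Hypothetical stand-in for repoIdFromRemote (not shown in the diff):
// "https://github.com/acme/widgets.git" -> "acme/widgets".
function repoIdFromRemote(remoteUrl: string): string {
  return remoteUrl
    .replace(/^https:\/\/github\.com\//, "")
    .replace(/\.git$/, "");
}

function applyRepositoryChanges(
  existing: Map<string, RepoRow>,
  added: Array<{ fullName: string }>,
  removed: string[],
): Map<string, RepoRow> {
  const next = new Map(existing);
  for (const repo of added) {
    const remoteUrl = `https://github.com/${repo.fullName}.git`;
    const repoId = repoIdFromRemote(remoteUrl);
    // Upsert: mirrors .onConflictDoUpdate keyed on repos.repoId.
    next.set(repoId, { repoId, remoteUrl });
  }
  for (const fullName of removed) {
    next.delete(repoIdFromRemote(`https://github.com/${fullName}.git`));
  }
  return next;
}
```

Because the upsert is keyed on `repoId`, re-delivered webhook payloads are idempotent: re-adding an existing repository only refreshes its `remoteUrl` and timestamp.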
a/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql +++ b/foundry/packages/backend/src/actors/workspace/db/drizzle/0000_melted_viper.sql @@ -54,12 +54,6 @@ CREATE TABLE `organization_profile` ( `github_installation_id` integer, `github_last_sync_label` text NOT NULL, `github_last_sync_at` integer, - `github_last_webhook_at` integer, - `github_last_webhook_event` text, - `github_sync_generation` integer NOT NULL, - `github_sync_phase` text, - `github_processed_repository_count` integer NOT NULL, - `github_total_repository_count` integer NOT NULL, `stripe_customer_id` text, `stripe_subscription_id` text, `stripe_price_id` text, @@ -73,6 +67,12 @@ CREATE TABLE `organization_profile` ( `updated_at` integer NOT NULL ); --> statement-breakpoint +CREATE TABLE `provider_profiles` ( + `provider_id` text PRIMARY KEY NOT NULL, + `profile_json` text NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint CREATE TABLE `repos` ( `repo_id` text PRIMARY KEY NOT NULL, `remote_url` text NOT NULL, @@ -90,3 +90,8 @@ CREATE TABLE `stripe_lookup` ( `organization_id` text NOT NULL, `updated_at` integer NOT NULL ); +--> statement-breakpoint +CREATE TABLE `task_lookup` ( + `task_id` text PRIMARY KEY NOT NULL, + `repo_id` text NOT NULL +); diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/workspace/db/drizzle/meta/0000_snapshot.json similarity index 94% rename from foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json rename to foundry/packages/backend/src/actors/workspace/db/drizzle/meta/0000_snapshot.json index a29c546..08a47e5 100644 --- a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json +++ b/foundry/packages/backend/src/actors/workspace/db/drizzle/meta/0000_snapshot.json @@ -359,48 +359,6 @@ "notNull": false, "autoincrement": false }, - "github_last_webhook_at": { - "name": 
"github_last_webhook_at", - "type": "integer", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, - "github_last_webhook_event": { - "name": "github_last_webhook_event", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, - "github_sync_generation": { - "name": "github_sync_generation", - "type": "integer", - "primaryKey": false, - "notNull": true, - "autoincrement": false - }, - "github_sync_phase": { - "name": "github_sync_phase", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, - "github_processed_repository_count": { - "name": "github_processed_repository_count", - "type": "integer", - "primaryKey": false, - "notNull": true, - "autoincrement": false - }, - "github_total_repository_count": { - "name": "github_total_repository_count", - "type": "integer", - "primaryKey": false, - "notNull": true, - "autoincrement": false - }, "stripe_customer_id": { "name": "stripe_customer_id", "type": "text", @@ -485,6 +443,37 @@ "uniqueConstraints": {}, "checkConstraints": {} }, + "provider_profiles": { + "name": "provider_profiles", + "columns": { + "provider_id": { + "name": "provider_id", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "profile_json": { + "name": "profile_json", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "updated_at": { + "name": "updated_at", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, "repos": { "name": "repos", "columns": { @@ -577,6 +566,30 @@ "compositePrimaryKeys": {}, "uniqueConstraints": {}, "checkConstraints": {} + }, + "task_lookup": { + "name": "task_lookup", + "columns": { + "task_id": { + "name": "task_id", + "type": "text", + "primaryKey": true, + "notNull": true, + 
"autoincrement": false + }, + "repo_id": { + "name": "repo_id", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} } }, "views": {}, diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/workspace/db/drizzle/meta/_journal.json similarity index 57% rename from foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json rename to foundry/packages/backend/src/actors/workspace/db/drizzle/meta/_journal.json index 41ea23b..e3668a1 100644 --- a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json +++ b/foundry/packages/backend/src/actors/workspace/db/drizzle/meta/_journal.json @@ -8,13 +8,6 @@ "when": 1773376221152, "tag": "0000_melted_viper", "breakpoints": true - }, - { - "idx": 1, - "version": "6", - "when": 1773840000000, - "tag": "0001_add_auth_and_task_tables", - "breakpoints": true } ] } diff --git a/foundry/packages/backend/src/actors/organization/db/migrations.ts b/foundry/packages/backend/src/actors/workspace/db/migrations.ts similarity index 85% rename from foundry/packages/backend/src/actors/organization/db/migrations.ts rename to foundry/packages/backend/src/actors/workspace/db/migrations.ts index 2e8570b..607eb19 100644 --- a/foundry/packages/backend/src/actors/organization/db/migrations.ts +++ b/foundry/packages/backend/src/actors/workspace/db/migrations.ts @@ -12,14 +12,14 @@ const journal = { }, { idx: 1, - when: 1773840000000, - tag: "0001_add_auth_and_task_tables", + when: 1773638400000, + tag: "0001_auth_index_tables", breakpoints: true, }, { idx: 2, - when: 1773984000000, - tag: "0002_add_task_owner_columns", + when: 1773720000000, + tag: "0002_task_summaries", breakpoints: true, }, ], @@ -84,12 +84,6 @@ CREATE TABLE \`organization_profile\` ( \`github_installation_id\` 
integer, \`github_last_sync_label\` text NOT NULL, \`github_last_sync_at\` integer, - \`github_last_webhook_at\` integer, - \`github_last_webhook_event\` text, - \`github_sync_generation\` integer NOT NULL, - \`github_sync_phase\` text, - \`github_processed_repository_count\` integer NOT NULL, - \`github_total_repository_count\` integer NOT NULL, \`stripe_customer_id\` text, \`stripe_subscription_id\` text, \`stripe_price_id\` text, @@ -103,6 +97,12 @@ CREATE TABLE \`organization_profile\` ( \`updated_at\` integer NOT NULL ); --> statement-breakpoint +CREATE TABLE \`provider_profiles\` ( + \`provider_id\` text PRIMARY KEY NOT NULL, + \`profile_json\` text NOT NULL, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint CREATE TABLE \`repos\` ( \`repo_id\` text PRIMARY KEY NOT NULL, \`remote_url\` text NOT NULL, @@ -120,6 +120,11 @@ CREATE TABLE \`stripe_lookup\` ( \`organization_id\` text NOT NULL, \`updated_at\` integer NOT NULL ); +--> statement-breakpoint +CREATE TABLE \`task_lookup\` ( + \`task_id\` text PRIMARY KEY NOT NULL, + \`repo_id\` text NOT NULL +); `, m0001: `CREATE TABLE IF NOT EXISTS \`auth_session_index\` ( \`session_id\` text PRIMARY KEY NOT NULL, @@ -151,16 +156,8 @@ CREATE TABLE IF NOT EXISTS \`auth_verification\` ( \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); ---> statement-breakpoint -CREATE TABLE IF NOT EXISTS \`task_index\` ( - \`task_id\` text PRIMARY KEY NOT NULL, - \`repo_id\` text NOT NULL, - \`branch_name\` text, - \`created_at\` integer NOT NULL, - \`updated_at\` integer NOT NULL -); ---> statement-breakpoint -CREATE TABLE IF NOT EXISTS \`task_summaries\` ( +`, + m0002: `CREATE TABLE IF NOT EXISTS \`task_summaries\` ( \`task_id\` text PRIMARY KEY NOT NULL, \`repo_id\` text NOT NULL, \`title\` text NOT NULL, @@ -171,10 +168,6 @@ CREATE TABLE IF NOT EXISTS \`task_summaries\` ( \`pull_request_json\` text, \`sessions_summary_json\` text DEFAULT '[]' NOT NULL ); -`, - m0002: `ALTER TABLE \`task_summaries\` ADD 
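The rewritten `migrations.ts` above embeds a journal (`idx`/`when`/`tag` entries) alongside SQL bundles keyed `m0000`/`m0001`/`m0002`, with statements separated by the drizzle `--> statement-breakpoint` marker. A reduced sketch of how such a runner might order entries and split a bundle — the real runner lives in `rivetkit/db/drizzle`, so these helper names are illustrative assumptions:

```typescript
// Illustrative helpers for the journal + statement-breakpoint format above.
interface JournalEntry {
  idx: number;
  when: number;
  tag: string;
}

const BREAKPOINT = "--> statement-breakpoint";

// Migrations apply in journal order, lowest idx first.
function orderedTags(entries: JournalEntry[]): string[] {
  return [...entries].sort((a, b) => a.idx - b.idx).map((e) => e.tag);
}

// Each bundle is one string; individual statements execute separately.
function splitStatements(sql: string): string[] {
  return sql
    .split(BREAKPOINT)
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}
```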
COLUMN \`primary_user_login\` text; ---> statement-breakpoint -ALTER TABLE \`task_summaries\` ADD COLUMN \`primary_user_avatar_url\` text; `, } as const, }; diff --git a/foundry/packages/backend/src/actors/workspace/db/schema.ts b/foundry/packages/backend/src/actors/workspace/db/schema.ts new file mode 100644 index 0000000..93082af --- /dev/null +++ b/foundry/packages/backend/src/actors/workspace/db/schema.ts @@ -0,0 +1,129 @@ +import { integer, sqliteTable, text } from "rivetkit/db/drizzle"; + +// SQLite is per workspace actor instance, so no workspaceId column needed. +export const providerProfiles = sqliteTable("provider_profiles", { + providerId: text("provider_id").notNull().primaryKey(), + // Structured by the provider profile snapshot returned by provider integrations. + profileJson: text("profile_json").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const repos = sqliteTable("repos", { + repoId: text("repo_id").notNull().primaryKey(), + remoteUrl: text("remote_url").notNull(), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const taskLookup = sqliteTable("task_lookup", { + taskId: text("task_id").notNull().primaryKey(), + repoId: text("repo_id").notNull(), +}); + +/** + * Materialized sidebar projection maintained by task actors. + * The source of truth still lives on each task actor; this table exists so + * workspace reads can stay local and avoid fan-out across child actors. 
+ */ +export const taskSummaries = sqliteTable("task_summaries", { + taskId: text("task_id").notNull().primaryKey(), + repoId: text("repo_id").notNull(), + title: text("title").notNull(), + status: text("status").notNull(), + repoName: text("repo_name").notNull(), + updatedAtMs: integer("updated_at_ms").notNull(), + branch: text("branch"), + pullRequestJson: text("pull_request_json"), + sessionsSummaryJson: text("sessions_summary_json").notNull().default("[]"), +}); + +export const organizationProfile = sqliteTable("organization_profile", { + id: text("id").notNull().primaryKey(), + kind: text("kind").notNull(), + githubAccountId: text("github_account_id").notNull(), + githubLogin: text("github_login").notNull(), + githubAccountType: text("github_account_type").notNull(), + displayName: text("display_name").notNull(), + slug: text("slug").notNull(), + primaryDomain: text("primary_domain").notNull(), + defaultModel: text("default_model").notNull(), + autoImportRepos: integer("auto_import_repos").notNull(), + repoImportStatus: text("repo_import_status").notNull(), + githubConnectedAccount: text("github_connected_account").notNull(), + githubInstallationStatus: text("github_installation_status").notNull(), + githubSyncStatus: text("github_sync_status").notNull(), + githubInstallationId: integer("github_installation_id"), + githubLastSyncLabel: text("github_last_sync_label").notNull(), + githubLastSyncAt: integer("github_last_sync_at"), + stripeCustomerId: text("stripe_customer_id"), + stripeSubscriptionId: text("stripe_subscription_id"), + stripePriceId: text("stripe_price_id"), + billingPlanId: text("billing_plan_id").notNull(), + billingStatus: text("billing_status").notNull(), + billingSeatsIncluded: integer("billing_seats_included").notNull(), + billingTrialEndsAt: text("billing_trial_ends_at"), + billingRenewalAt: text("billing_renewal_at"), + billingPaymentMethodLabel: text("billing_payment_method_label").notNull(), + createdAt: integer("created_at").notNull(), 
+ updatedAt: integer("updated_at").notNull(), +}); + +export const organizationMembers = sqliteTable("organization_members", { + id: text("id").notNull().primaryKey(), + name: text("name").notNull(), + email: text("email").notNull(), + role: text("role").notNull(), + state: text("state").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const seatAssignments = sqliteTable("seat_assignments", { + email: text("email").notNull().primaryKey(), + createdAt: integer("created_at").notNull(), +}); + +export const invoices = sqliteTable("invoices", { + id: text("id").notNull().primaryKey(), + label: text("label").notNull(), + issuedAt: text("issued_at").notNull(), + amountUsd: integer("amount_usd").notNull(), + status: text("status").notNull(), + createdAt: integer("created_at").notNull(), +}); + +export const authSessionIndex = sqliteTable("auth_session_index", { + sessionId: text("session_id").notNull().primaryKey(), + sessionToken: text("session_token").notNull(), + userId: text("user_id").notNull(), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const authEmailIndex = sqliteTable("auth_email_index", { + email: text("email").notNull().primaryKey(), + userId: text("user_id").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const authAccountIndex = sqliteTable("auth_account_index", { + id: text("id").notNull().primaryKey(), + providerId: text("provider_id").notNull(), + accountId: text("account_id").notNull(), + userId: text("user_id").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const authVerification = sqliteTable("auth_verification", { + id: text("id").notNull().primaryKey(), + identifier: text("identifier").notNull(), + value: text("value").notNull(), + expiresAt: integer("expires_at").notNull(), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +export const stripeLookup = 
sqliteTable("stripe_lookup", { + lookupKey: text("lookup_key").notNull().primaryKey(), + organizationId: text("organization_id").notNull(), + updatedAt: integer("updated_at").notNull(), +}); diff --git a/foundry/packages/backend/src/actors/workspace/index.ts b/foundry/packages/backend/src/actors/workspace/index.ts new file mode 100644 index 0000000..62e662d --- /dev/null +++ b/foundry/packages/backend/src/actors/workspace/index.ts @@ -0,0 +1,19 @@ +import { actor, queue } from "rivetkit"; +import { workflow } from "rivetkit/workflow"; +import { workspaceDb } from "./db/db.js"; +import { runWorkspaceWorkflow, WORKSPACE_QUEUE_NAMES, workspaceActions } from "./actions.js"; + +export const workspace = actor({ + db: workspaceDb, + queues: Object.fromEntries(WORKSPACE_QUEUE_NAMES.map((name) => [name, queue()])), + options: { + name: "Workspace", + icon: "compass", + actionTimeout: 5 * 60_000, + }, + createState: (_c, workspaceId: string) => ({ + workspaceId, + }), + actions: workspaceActions, + run: workflow(runWorkspaceWorkflow), +}); diff --git a/foundry/packages/backend/src/config/organization.ts b/foundry/packages/backend/src/config/organization.ts deleted file mode 100644 index 8b5c766..0000000 --- a/foundry/packages/backend/src/config/organization.ts +++ /dev/null @@ -1,13 +0,0 @@ -import type { AppConfig } from "@sandbox-agent/foundry-shared"; - -export function defaultOrganization(config: AppConfig): string { - const organizationId = config.organization.default.trim(); - return organizationId.length > 0 ? 
organizationId : "default"; -} - -export function resolveOrganization(flagOrganization: string | undefined, config: AppConfig): string { - if (flagOrganization && flagOrganization.trim().length > 0) { - return flagOrganization.trim(); - } - return defaultOrganization(config); -} diff --git a/foundry/packages/backend/src/config/runner-version.ts b/foundry/packages/backend/src/config/runner-version.ts deleted file mode 100644 index 5c33672..0000000 --- a/foundry/packages/backend/src/config/runner-version.ts +++ /dev/null @@ -1,33 +0,0 @@ -import { readFileSync } from "node:fs"; - -function parseRunnerVersion(rawValue: string | undefined): number | undefined { - const value = rawValue?.trim(); - if (!value) { - return undefined; - } - - const parsed = Number.parseInt(value, 10); - if (Number.isNaN(parsed)) { - return undefined; - } - - return parsed; -} - -export function resolveRunnerVersion(): number | undefined { - const envVersion = parseRunnerVersion(process.env.RIVET_RUNNER_VERSION); - if (envVersion !== undefined) { - return envVersion; - } - - const versionFilePath = process.env.RIVET_RUNNER_VERSION_FILE; - if (!versionFilePath) { - return undefined; - } - - try { - return parseRunnerVersion(readFileSync(versionFilePath, "utf8")); - } catch { - return undefined; - } -} diff --git a/foundry/packages/backend/src/config/workspace.ts b/foundry/packages/backend/src/config/workspace.ts new file mode 100644 index 0000000..2225200 --- /dev/null +++ b/foundry/packages/backend/src/config/workspace.ts @@ -0,0 +1,13 @@ +import type { AppConfig } from "@sandbox-agent/foundry-shared"; + +export function defaultWorkspace(config: AppConfig): string { + const ws = config.workspace.default.trim(); + return ws.length > 0 ? 
ws : "default"; +} + +export function resolveWorkspace(flagWorkspace: string | undefined, config: AppConfig): string { + if (flagWorkspace && flagWorkspace.trim().length > 0) { + return flagWorkspace.trim(); + } + return defaultWorkspace(config); +} diff --git a/foundry/packages/backend/src/driver.ts b/foundry/packages/backend/src/driver.ts index 5c01035..4e1d248 100644 --- a/foundry/packages/backend/src/driver.ts +++ b/foundry/packages/backend/src/driver.ts @@ -1,31 +1,189 @@ -import { createPr, starRepository } from "./integrations/github/index.js"; +import type { BranchSnapshot } from "./integrations/git/index.js"; +import type { PullRequestSnapshot } from "./integrations/github/index.js"; +import type { SandboxSession, SandboxAgentClientOptions, SandboxSessionCreateRequest } from "./integrations/sandbox-agent/client.js"; +import type { + ListEventsRequest, + ListPage, + ListPageRequest, + ProcessCreateRequest, + ProcessInfo, + ProcessLogFollowQuery, + ProcessLogsResponse, + ProcessSignalQuery, + SessionEvent, + SessionRecord, +} from "sandbox-agent"; +import type { DaytonaClientOptions, DaytonaCreateSandboxOptions, DaytonaPreviewEndpoint, DaytonaSandbox } from "./integrations/daytona/client.js"; +import { + validateRemote, + ensureCloned, + fetch, + listRemoteBranches, + remoteDefaultBaseRef, + revParse, + ensureRemoteBranch, + diffStatForBranch, + conflictsWithMain, +} from "./integrations/git/index.js"; +import { + gitSpiceAvailable, + gitSpiceListStack, + gitSpiceRebaseBranch, + gitSpiceReparentBranch, + gitSpiceRestackRepo, + gitSpiceRestackSubtree, + gitSpiceSyncRepo, + gitSpiceTrackBranch, +} from "./integrations/git-spice/index.js"; +import { listPullRequests, createPr, starRepository } from "./integrations/github/index.js"; +import { SandboxAgentClient } from "./integrations/sandbox-agent/client.js"; +import { DaytonaClient } from "./integrations/daytona/client.js"; + +export interface GitDriver { + validateRemote(remoteUrl: string, options?: { 
githubToken?: string | null }): Promise; + ensureCloned(remoteUrl: string, targetPath: string, options?: { githubToken?: string | null }): Promise; + fetch(repoPath: string, options?: { githubToken?: string | null }): Promise; + listRemoteBranches(repoPath: string, options?: { githubToken?: string | null }): Promise; + remoteDefaultBaseRef(repoPath: string): Promise; + revParse(repoPath: string, ref: string): Promise; + ensureRemoteBranch(repoPath: string, branchName: string, options?: { githubToken?: string | null }): Promise; + diffStatForBranch(repoPath: string, branchName: string): Promise; + conflictsWithMain(repoPath: string, branchName: string): Promise; +} + +export interface StackBranchSnapshot { + branchName: string; + parentBranch: string | null; +} + +export interface StackDriver { + available(repoPath: string): Promise; + listStack(repoPath: string): Promise; + syncRepo(repoPath: string): Promise; + restackRepo(repoPath: string): Promise; + restackSubtree(repoPath: string, branchName: string): Promise; + rebaseBranch(repoPath: string, branchName: string): Promise; + reparentBranch(repoPath: string, branchName: string, parentBranch: string): Promise; + trackBranch(repoPath: string, branchName: string, parentBranch: string): Promise; +} export interface GithubDriver { + listPullRequests(repoPath: string, options?: { githubToken?: string | null }): Promise; createPr( - repoFullName: string, + repoPath: string, headBranch: string, title: string, body?: string, - options?: { githubToken?: string | null; baseBranch?: string | null }, + options?: { githubToken?: string | null }, ): Promise<{ number: number; url: string }>; starRepository(repoFullName: string, options?: { githubToken?: string | null }): Promise; } +export interface SandboxAgentClientLike { + createSession(request: string | SandboxSessionCreateRequest): Promise; + sessionStatus(sessionId: string): Promise; + listSessions(request?: ListPageRequest): Promise>; + listEvents(request: 
ListEventsRequest): Promise>; + createProcess(request: ProcessCreateRequest): Promise; + listProcesses(): Promise<{ processes: ProcessInfo[] }>; + getProcessLogs(processId: string, query?: ProcessLogFollowQuery): Promise; + stopProcess(processId: string, query?: ProcessSignalQuery): Promise; + killProcess(processId: string, query?: ProcessSignalQuery): Promise; + deleteProcess(processId: string): Promise; + sendPrompt(request: { sessionId: string; prompt: string; notification?: boolean }): Promise; + cancelSession(sessionId: string): Promise; + destroySession(sessionId: string): Promise; +} + +export interface SandboxAgentDriver { + createClient(options: SandboxAgentClientOptions): SandboxAgentClientLike; +} + +export interface DaytonaClientLike { + createSandbox(options: DaytonaCreateSandboxOptions): Promise; + getSandbox(sandboxId: string): Promise; + startSandbox(sandboxId: string, timeoutSeconds?: number): Promise; + stopSandbox(sandboxId: string, timeoutSeconds?: number): Promise; + deleteSandbox(sandboxId: string): Promise; + executeCommand(sandboxId: string, command: string): Promise<{ exitCode: number; result: string }>; + getPreviewEndpoint(sandboxId: string, port: number): Promise; +} + +export interface DaytonaDriver { + createClient(options: DaytonaClientOptions): DaytonaClientLike; +} + export interface TmuxDriver { setWindowStatus(branchName: string, status: string): number; } export interface BackendDriver { + git: GitDriver; + stack: StackDriver; github: GithubDriver; + sandboxAgent: SandboxAgentDriver; + daytona: DaytonaDriver; tmux: TmuxDriver; } export function createDefaultDriver(): BackendDriver { + const sandboxAgentClients = new Map(); + const daytonaClients = new Map(); + return { + git: { + validateRemote, + ensureCloned, + fetch, + listRemoteBranches, + remoteDefaultBaseRef, + revParse, + ensureRemoteBranch, + diffStatForBranch, + conflictsWithMain, + }, + stack: { + available: gitSpiceAvailable, + listStack: gitSpiceListStack, + syncRepo: 
gitSpiceSyncRepo, + restackRepo: gitSpiceRestackRepo, + restackSubtree: gitSpiceRestackSubtree, + rebaseBranch: gitSpiceRebaseBranch, + reparentBranch: gitSpiceReparentBranch, + trackBranch: gitSpiceTrackBranch, + }, github: { + listPullRequests, createPr, starRepository, }, + sandboxAgent: { + createClient: (opts) => { + if (opts.persist) { + return new SandboxAgentClient(opts); + } + const key = `${opts.endpoint}|${opts.token ?? ""}|${opts.agent ?? ""}`; + const cached = sandboxAgentClients.get(key); + if (cached) { + return cached; + } + const created = new SandboxAgentClient(opts); + sandboxAgentClients.set(key, created); + return created; + }, + }, + daytona: { + createClient: (opts) => { + const key = `${opts.apiUrl ?? ""}|${opts.apiKey ?? ""}|${opts.target ?? ""}`; + const cached = daytonaClients.get(key); + if (cached) { + return cached; + } + const created = new DaytonaClient(opts); + daytonaClients.set(key, created); + return created; + }, + }, tmux: { setWindowStatus: () => 0, }, diff --git a/foundry/packages/backend/src/index.ts b/foundry/packages/backend/src/index.ts index 617bacc..cf1e6e7 100644 --- a/foundry/packages/backend/src/index.ts +++ b/foundry/packages/backend/src/index.ts @@ -3,14 +3,15 @@ import { cors } from "hono/cors"; import { randomUUID } from "node:crypto"; import { initActorRuntimeContext } from "./actors/context.js"; import { registry } from "./actors/index.js"; -import { organizationKey } from "./actors/keys.js"; +import { workspaceKey } from "./actors/keys.js"; import { loadConfig } from "./config/backend.js"; import { createBackends, createNotificationService } from "./notifications/index.js"; import { createDefaultDriver } from "./driver.js"; +import { createProviderRegistry } from "./providers/index.js"; import { createClient } from "rivetkit/client"; import { initBetterAuthService } from "./services/better-auth.js"; import { createDefaultAppShellServices } from "./services/app-shell-runtime.js"; -import { 
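`createDefaultDriver` above memoizes `SandboxAgentClient` and `DaytonaClient` instances by an options-derived string key, and the sandbox-agent factory skips the cache entirely when `persist` is set. A reduced sketch of that keyed-memoization shape, with a stand-in client type (the `FakeClient`/`ClientOptions` names are illustrative assumptions, not the real SDK types):

```typescript
// Keyed client memoization, as in createDefaultDriver: identical options
// reuse one client instance; a `persist` flag opts out of the cache.
interface ClientOptions {
  endpoint: string;
  token?: string;
  persist?: boolean;
}

class FakeClient {
  constructor(readonly options: ClientOptions) {}
}

function makeClientFactory() {
  const cache = new Map<string, FakeClient>();
  return (opts: ClientOptions): FakeClient => {
    if (opts.persist) {
      // Persistent clients are never shared across callers.
      return new FakeClient(opts);
    }
    const key = `${opts.endpoint}|${opts.token ?? ""}`;
    const cached = cache.get(key);
    if (cached) return cached;
    const created = new FakeClient(opts);
    cache.set(key, created);
    return created;
  };
}
```

Sharing one client per endpoint/token pair keeps connection state (and any per-client buffers) bounded, while the `persist` escape hatch lets long-lived sessions own a dedicated client.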
APP_SHELL_ORGANIZATION_ID } from "./actors/organization/constants.js"; +import { APP_SHELL_WORKSPACE_ID } from "./actors/workspace/app-shell.js"; import { logger } from "./logging.js"; export interface BackendStartOptions { @@ -18,7 +19,7 @@ export interface BackendStartOptions { port?: number; } -interface AppOrganizationLogContext { +interface AppWorkspaceLogContext { action?: string; cfConnectingIp?: string; cfRay?: string; @@ -48,19 +49,6 @@ function isRivetRequest(request: Request): boolean { } export async function startBackend(options: BackendStartOptions = {}): Promise { - // Prevent the sandbox-agent SDK's unhandled SQLite constraint errors from - // crashing the entire process. The SDK has a bug where duplicate event - // inserts (sandbox_agent_events UNIQUE constraint) throw from an internal - // async path with no catch. Log and continue. - process.on("uncaughtException", (error) => { - logger.error({ error: error?.message ?? String(error), stack: error?.stack }, "uncaughtException (kept alive)"); - }); - process.on("unhandledRejection", (reason) => { - const msg = reason instanceof Error ? reason.message : String(reason); - const stack = reason instanceof Error ? reason.stack : undefined; - logger.error({ error: msg, stack }, "unhandledRejection (kept alive)"); - }); - // sandbox-agent agent plugins vary on which env var they read for OpenAI/Codex auth. // Normalize to keep local dev + docker-compose simple. if (!process.env.CODEX_API_KEY && process.env.OPENAI_API_KEY) { @@ -81,14 +69,15 @@ export async function startBackend(options: BackendStartOptions = {}): Promise ({ + const requestHeaderContext = (c: any): AppWorkspaceLogContext => ({ cfConnectingIp: c.req.header("cf-connecting-ip") ?? undefined, cfRay: c.req.header("cf-ray") ?? undefined, forwardedFor: c.req.header("x-forwarded-for") ?? 
undefined, @@ -141,59 +130,6 @@ export async function startBackend(options: BackendStartOptions = {}): Promise.json, inspect with chrome://tracing) - app.get("/debug/memory", async (c) => { - if (process.env.NODE_ENV !== "development") { - return c.json({ error: "debug endpoints disabled in production" }, 403); - } - const wantGc = c.req.query("gc") === "1"; - if (wantGc && typeof Bun !== "undefined") { - // Bun.gc(true) triggers a synchronous full GC sweep in JavaScriptCore. - Bun.gc(true); - } - const mem = process.memoryUsage(); - const rssMb = Math.round(mem.rss / 1024 / 1024); - const heapUsedMb = Math.round(mem.heapUsed / 1024 / 1024); - const heapTotalMb = Math.round(mem.heapTotal / 1024 / 1024); - const externalMb = Math.round(mem.external / 1024 / 1024); - const nonHeapMb = rssMb - heapUsedMb - externalMb; - // Bun.heapStats() gives JSC-specific breakdown: object counts, typed array - // bytes, extra memory (native allocations tracked by JSC). Useful for - // distinguishing JS object bloat from native/WASM memory. - // eslint-disable-next-line @typescript-eslint/no-explicit-any - const BunAny = Bun as any; - const heapStats = typeof BunAny.heapStats === "function" ? BunAny.heapStats() : null; - const snapshot = { - rssMb, - heapUsedMb, - heapTotalMb, - externalMb, - nonHeapMb, - gcTriggered: wantGc, - rssBytes: mem.rss, - heapUsedBytes: mem.heapUsed, - heapTotalBytes: mem.heapTotal, - externalBytes: mem.external, - ...(heapStats ? { bunHeapStats: heapStats } : {}), - }; - // Optionally write a full JSC heap snapshot for offline analysis. - let heapSnapshotPath: string | null = null; - const wantHeap = c.req.query("heap") === "1"; - if (wantHeap && typeof Bun !== "undefined") { - heapSnapshotPath = `/tmp/foundry-heap-${Date.now()}.json`; - // Bun.generateHeapSnapshot("v8") returns a V8-compatible JSON string. 
- const heapJson = Bun.generateHeapSnapshot("v8"); - await Bun.write(heapSnapshotPath, heapJson); - } - logger.info(snapshot, "memory_usage_debug"); - return c.json({ ...snapshot, ...(heapSnapshotPath ? { heapSnapshotPath } : {}) }); - }); - app.use("*", async (c, next) => { const requestId = c.req.header("x-request-id")?.trim() || randomUUID(); const start = performance.now(); @@ -230,27 +166,27 @@ export async function startBackend(options: BackendStartOptions = {}): Promise { - if (cachedAppOrganization) return cachedAppOrganization; + const appWorkspace = async (context: AppWorkspaceLogContext = {}) => { + if (cachedAppWorkspace) return cachedAppWorkspace; const start = performance.now(); try { - const handle = await actorClient.organization.getOrCreate(organizationKey(APP_SHELL_ORGANIZATION_ID), { - createWithInput: APP_SHELL_ORGANIZATION_ID, + const handle = await actorClient.workspace.getOrCreate(workspaceKey(APP_SHELL_WORKSPACE_ID), { + createWithInput: APP_SHELL_WORKSPACE_ID, }); - cachedAppOrganization = handle; + cachedAppWorkspace = handle; logger.info( { ...context, cache: "miss", durationMs: Math.round((performance.now() - start) * 100) / 100, }, - "app_organization_resolve", + "app_workspace_resolve", ); return handle; } catch (error) { @@ -262,13 +198,13 @@ export async function startBackend(options: BackendStartOptions = {}): Promise ({ + const requestLogContext = (c: any, sessionId?: string): AppWorkspaceLogContext => ({ ...requestHeaderContext(c), method: c.req.method, path: c.req.path, @@ -281,55 +217,7 @@ export async function startBackend(options: BackendStartOptions = {}): Promise Fastly -> Railway) retries callback requests when they take - // >10s. The first request deletes the verification record on success, so the - // retry fails with "verification not found" -> ?error=please_restart_the_process. - // This map tracks in-flight callbacks by state param so retries wait for and - // reuse the first request's response. 
- const inflightCallbacks = new Map>(); - app.all("/v1/auth/*", async (c) => { - const authPath = c.req.path; - const authMethod = c.req.method; - const isCallback = authPath.includes("/callback/"); - - // Deduplicate callback requests by OAuth state parameter - if (isCallback) { - const url = new URL(c.req.url); - const state = url.searchParams.get("state"); - if (state) { - const existing = inflightCallbacks.get(state); - if (existing) { - logger.info({ path: authPath, state: state.slice(0, 8) + "..." }, "auth_callback_dedup"); - const original = await existing; - return original.clone(); - } - - const promise = (async () => { - logger.info({ path: authPath, method: authMethod, state: state.slice(0, 8) + "..." }, "auth_callback_start"); - const start = performance.now(); - const response = await betterAuth.auth.handler(c.req.raw); - const durationMs = Math.round((performance.now() - start) * 100) / 100; - const location = response.headers.get("location"); - logger.info({ path: authPath, status: response.status, durationMs, location: location ?? undefined }, "auth_callback_complete"); - if (location && location.includes("error=")) { - logger.error({ path: authPath, status: response.status, durationMs, location }, "auth_callback_error_redirect"); - } - return response; - })(); - - inflightCallbacks.set(state, promise); - try { - const response = await promise; - return response.clone(); - } finally { - // Keep entry briefly so late retries still hit the cache - setTimeout(() => inflightCallbacks.delete(state), 30_000); - } - } - } - return await betterAuth.auth.handler(c.req.raw); }); @@ -369,7 +257,7 @@ export async function startBackend(options: BackendStartOptions = {}): Promise { const payload = await c.req.text(); - await (await appOrganization(requestLogContext(c))).handleAppStripeWebhook({ + await (await appWorkspace(requestLogContext(c))).handleAppStripeWebhook({ payload, signatureHeader: c.req.header("stripe-signature") ?? 
null, }); @@ -390,7 +278,7 @@ export async function startBackend(options: BackendStartOptions = {}): Promise { const payload = await c.req.text(); - await (await appOrganization(requestLogContext(c))).handleAppGithubWebhook({ + await (await appWorkspace(requestLogContext(c))).handleAppGithubWebhook({ payload, signatureHeader: c.req.header("x-hub-signature-256") ?? null, eventHeader: c.req.header("x-github-event") ?? null, @@ -407,11 +295,6 @@ export async function startBackend(options: BackendStartOptions = {}): Promise { - const mem = process.memoryUsage(); - const rssMb = Math.round(mem.rss / 1024 / 1024); - const heapUsedMb = Math.round(mem.heapUsed / 1024 / 1024); - const heapTotalMb = Math.round(mem.heapTotal / 1024 / 1024); - const externalMb = Math.round(mem.external / 1024 / 1024); - // Non-heap RSS: memory not accounted for by JS heap or external buffers. - // Large values here point to native allocations (WASM, mmap, child process - // bookkeeping, Bun's internal arena, etc.). 
- const nonHeapMb = rssMb - heapUsedMb - externalMb; - const deltaRss = rssMb - prevRss; - prevRss = rssMb; - logger.info( - { - rssMb, - heapUsedMb, - heapTotalMb, - externalMb, - nonHeapMb, - deltaRssMb: deltaRss, - rssBytes: mem.rss, - heapUsedBytes: mem.heapUsed, - heapTotalBytes: mem.heapTotal, - externalBytes: mem.external, - }, - "memory_usage", - ); - }, 60_000); - } - process.on("SIGINT", async () => { server.stop(); process.exit(0); diff --git a/foundry/packages/backend/src/integrations/daytona/client.ts b/foundry/packages/backend/src/integrations/daytona/client.ts new file mode 100644 index 0000000..4b00581 --- /dev/null +++ b/foundry/packages/backend/src/integrations/daytona/client.ts @@ -0,0 +1,113 @@ +import { Daytona, type Image } from "@daytonaio/sdk"; + +export interface DaytonaSandbox { + id: string; + state?: string; + snapshot?: string; + labels?: Record<string, string>; +} + +export interface DaytonaCreateSandboxOptions { + image: string | Image; + envVars?: Record<string, string>; + labels?: Record<string, string>; + autoStopInterval?: number; +} + +export interface DaytonaPreviewEndpoint { + url: string; + token?: string; +} + +export interface DaytonaClientOptions { + apiUrl?: string; + apiKey?: string; + target?: string; +} + +function normalizeApiUrl(input?: string): string | undefined { + if (!input) return undefined; + const trimmed = input.replace(/\/+$/, ""); + if (trimmed.endsWith("/api")) { + return trimmed; + } + return `${trimmed}/api`; +} + +export class DaytonaClient { + private readonly daytona: Daytona; + + constructor(options: DaytonaClientOptions) { + const apiUrl = normalizeApiUrl(options.apiUrl); + this.daytona = new Daytona({ + _experimental: {}, + ...(apiUrl ? { apiUrl } : {}), + ...(options.apiKey ? { apiKey: options.apiKey } : {}), + ...(options.target ?
{ target: options.target } : {}), + }); + } + + async createSandbox(options: DaytonaCreateSandboxOptions): Promise<DaytonaSandbox> { + const sandbox = await this.daytona.create({ + image: options.image, + envVars: options.envVars, + labels: options.labels, + ...(options.autoStopInterval !== undefined ? { autoStopInterval: options.autoStopInterval } : {}), + }); + + return { + id: sandbox.id, + state: sandbox.state, + snapshot: sandbox.snapshot, + labels: (sandbox as any).labels, + }; + } + + async getSandbox(sandboxId: string): Promise<DaytonaSandbox> { + const sandbox = await this.daytona.get(sandboxId); + return { + id: sandbox.id, + state: sandbox.state, + snapshot: sandbox.snapshot, + labels: (sandbox as any).labels, + }; + } + + async startSandbox(sandboxId: string, timeoutSeconds?: number): Promise<void> { + const sandbox = await this.daytona.get(sandboxId); + await sandbox.start(timeoutSeconds); + } + + async stopSandbox(sandboxId: string, timeoutSeconds?: number): Promise<void> { + const sandbox = await this.daytona.get(sandboxId); + await sandbox.stop(timeoutSeconds); + } + + async deleteSandbox(sandboxId: string): Promise<void> { + const sandbox = await this.daytona.get(sandboxId); + await this.daytona.delete(sandbox); + } + + async executeCommand(sandboxId: string, command: string): Promise<{ exitCode: number; result: string }> { + const sandbox = await this.daytona.get(sandboxId); + const response = await sandbox.process.executeCommand(command); + return { + exitCode: response.exitCode, + result: response.result, + }; + } + + async getPreviewEndpoint(sandboxId: string, port: number): Promise<DaytonaPreviewEndpoint> { + const sandbox = await this.daytona.get(sandboxId); + // Use signed preview URLs for server-to-sandbox communication. + // The standard preview link may redirect non-browser clients to an interactive Auth0 flow. + // Signed preview URLs work for direct HTTP access. + // + // Request a longer-lived URL so sessions can run for several hours without refresh.
+ const preview = await sandbox.getSignedPreviewUrl(port, 6 * 60 * 60); + return { + url: preview.url, + token: preview.token, + }; + } +} diff --git a/foundry/packages/backend/src/integrations/git-spice/index.ts b/foundry/packages/backend/src/integrations/git-spice/index.ts new file mode 100644 index 0000000..877c82a --- /dev/null +++ b/foundry/packages/backend/src/integrations/git-spice/index.ts @@ -0,0 +1,223 @@ +import { execFile } from "node:child_process"; +import { promisify } from "node:util"; + +const execFileAsync = promisify(execFile); + +const DEFAULT_TIMEOUT_MS = 2 * 60_000; + +interface SpiceCommand { + command: string; + prefix: string[]; +} + +export interface SpiceStackEntry { + branchName: string; + parentBranch: string | null; +} + +function spiceCommands(): SpiceCommand[] { + const explicit = process.env.HF_GIT_SPICE_BIN?.trim(); + const list: SpiceCommand[] = []; + if (explicit) { + list.push({ command: explicit, prefix: [] }); + } + list.push({ command: "git-spice", prefix: [] }); + list.push({ command: "git", prefix: ["spice"] }); + return list; +} + +function commandLabel(cmd: SpiceCommand): string { + return [cmd.command, ...cmd.prefix].join(" "); +} + +function looksMissing(error: unknown): boolean { + const detail = error instanceof Error ? 
error.message : String(error); + return detail.includes("ENOENT") || detail.includes("not a git command") || detail.includes("command not found"); +} + +async function tryRun(repoPath: string, cmd: SpiceCommand, args: string[]): Promise<{ stdout: string; stderr: string }> { + return await execFileAsync(cmd.command, [...cmd.prefix, ...args], { + cwd: repoPath, + timeout: DEFAULT_TIMEOUT_MS, + maxBuffer: 1024 * 1024 * 8, + env: { + ...process.env, + NO_COLOR: "1", + FORCE_COLOR: "0", + }, + }); +} + +async function pickCommand(repoPath: string): Promise<SpiceCommand | null> { + for (const candidate of spiceCommands()) { + try { + await tryRun(repoPath, candidate, ["--help"]); + return candidate; + } catch (error) { + if (looksMissing(error)) { + continue; + } + } + } + return null; +} + +async function runSpice(repoPath: string, args: string[]): Promise<{ stdout: string; stderr: string }> { + const cmd = await pickCommand(repoPath); + if (!cmd) { + throw new Error("git-spice is not available (set HF_GIT_SPICE_BIN or install git-spice)"); + } + return await tryRun(repoPath, cmd, args); +} + +function parseLogJson(stdout: string): SpiceStackEntry[] { + const trimmed = stdout.trim(); + if (!trimmed) { + return []; + } + + const entries: SpiceStackEntry[] = []; + + // `git-spice log ... --json` prints one JSON object per line. + for (const line of trimmed.split("\n")) { + const raw = line.trim(); + if (!raw.startsWith("{")) { + continue; + } + try { + const value = JSON.parse(raw) as { + name?: string; + branch?: string; + parent?: string | null; + parentBranch?: string | null; + }; + const branchName = (value.name ?? value.branch ?? "").trim(); + if (!branchName) { + continue; + } + const parentRaw = value.parent ?? value.parentBranch ?? null; + const parentBranch = parentRaw ?
parentRaw.trim() || null : null; + entries.push({ branchName, parentBranch }); + } catch { + continue; + } + } + + const seen = new Set<string>(); + return entries.filter((entry) => { + if (seen.has(entry.branchName)) { + return false; + } + seen.add(entry.branchName); + return true; + }); +} + +async function runFallbacks(repoPath: string, commands: string[][], errorContext: string): Promise<void> { + const failures: string[] = []; + for (const args of commands) { + try { + await runSpice(repoPath, args); + return; + } catch (error) { + failures.push(`${args.join(" ")} :: ${error instanceof Error ? error.message : String(error)}`); + } + } + throw new Error(`${errorContext}. attempts=${failures.join(" | ")}`); +} + +export async function gitSpiceAvailable(repoPath: string): Promise<boolean> { + return (await pickCommand(repoPath)) !== null; +} + +export async function gitSpiceListStack(repoPath: string): Promise<SpiceStackEntry[]> { + try { + const { stdout } = await runSpice(repoPath, ["log", "short", "--all", "--json", "--no-cr-status", "--no-prompt"]); + return parseLogJson(stdout); + } catch { + return []; + } +} + +export async function gitSpiceSyncRepo(repoPath: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["repo", "sync", "--restack", "--no-prompt"], + ["repo", "sync", "--restack"], + ["repo", "sync"], + ], + "git-spice repo sync failed", + ); +} + +export async function gitSpiceRestackRepo(repoPath: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["repo", "restack", "--no-prompt"], + ["repo", "restack"], + ], + "git-spice repo restack failed", + ); +} + +export async function gitSpiceRestackSubtree(repoPath: string, branchName: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["upstack", "restack", "--branch", branchName, "--no-prompt"], + ["upstack", "restack", "--branch", branchName], + ["branch", "restack", "--branch", branchName, "--no-prompt"], + ["branch", "restack", "--branch", branchName], + ], + `git-spice restack subtree failed for ${branchName}`, + );
+} + +export async function gitSpiceRebaseBranch(repoPath: string, branchName: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["branch", "restack", "--branch", branchName, "--no-prompt"], + ["branch", "restack", "--branch", branchName], + ], + `git-spice branch restack failed for ${branchName}`, + ); +} + +export async function gitSpiceReparentBranch(repoPath: string, branchName: string, parentBranch: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["upstack", "onto", "--branch", branchName, parentBranch, "--no-prompt"], + ["upstack", "onto", "--branch", branchName, parentBranch], + ["branch", "onto", "--branch", branchName, parentBranch, "--no-prompt"], + ["branch", "onto", "--branch", branchName, parentBranch], + ], + `git-spice reparent failed for ${branchName} -> ${parentBranch}`, + ); +} + +export async function gitSpiceTrackBranch(repoPath: string, branchName: string, parentBranch: string): Promise<void> { + await runFallbacks( + repoPath, + [ + ["branch", "track", branchName, "--base", parentBranch, "--no-prompt"], + ["branch", "track", branchName, "--base", parentBranch], + ], + `git-spice track failed for ${branchName}`, + ); +} + +export function normalizeBaseBranchName(ref: string): string { + const trimmed = ref.trim(); + if (!trimmed) { + return "main"; + } + return trimmed.startsWith("origin/") ? trimmed.slice("origin/".length) : trimmed; +} + +export function describeSpiceCommandForLogs(repoPath: string): Promise<string | null> { + return pickCommand(repoPath).then((cmd) => (cmd ?
commandLabel(cmd) : null)); +} diff --git a/foundry/packages/backend/src/integrations/git/index.ts b/foundry/packages/backend/src/integrations/git/index.ts new file mode 100644 index 0000000..1b478c4 --- /dev/null +++ b/foundry/packages/backend/src/integrations/git/index.ts @@ -0,0 +1,313 @@ +import { execFile } from "node:child_process"; +import { chmodSync, existsSync, mkdirSync, mkdtempSync, writeFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import { dirname, resolve } from "node:path"; +import { promisify } from "node:util"; + +const execFileAsync = promisify(execFile); + +const DEFAULT_GIT_VALIDATE_REMOTE_TIMEOUT_MS = 15_000; +const DEFAULT_GIT_FETCH_TIMEOUT_MS = 2 * 60_000; +const DEFAULT_GIT_CLONE_TIMEOUT_MS = 5 * 60_000; + +interface GitAuthOptions { + githubToken?: string | null; +} + +function resolveGithubToken(options?: GitAuthOptions): string | null { + const token = options?.githubToken ?? process.env.GH_TOKEN ?? process.env.GITHUB_TOKEN ?? process.env.HF_GITHUB_TOKEN ?? process.env.HF_GH_TOKEN ?? null; + if (!token) return null; + const trimmed = token.trim(); + return trimmed.length > 0 ? trimmed : null; +} + +let cachedAskpassPath: string | null = null; +function ensureAskpassScript(): string { + if (cachedAskpassPath) { + return cachedAskpassPath; + } + + const dir = mkdtempSync(resolve(tmpdir(), "foundry-git-askpass-")); + const path = resolve(dir, "askpass.sh"); + + // Git invokes $GIT_ASKPASS with the prompt string as argv[1]. Provide both username and password. + // We avoid embedding the token in this file; it is read from env at runtime. + const content = [ + "#!/bin/sh", + 'prompt="$1"', + // Prefer GH_TOKEN/GITHUB_TOKEN but support HF_* aliases too. 
+ 'token="${GH_TOKEN:-${GITHUB_TOKEN:-${HF_GITHUB_TOKEN:-${HF_GH_TOKEN:-}}}}"', + 'case "$prompt" in', + ' *Username*) echo "x-access-token" ;;', + ' *Password*) echo "$token" ;;', + ' *) echo "" ;;', + "esac", + "", + ].join("\n"); + + writeFileSync(path, content, "utf8"); + chmodSync(path, 0o700); + cachedAskpassPath = path; + return path; +} + +function gitEnv(options?: GitAuthOptions): Record<string, string> { + const env: Record<string, string> = { ...(process.env as Record<string, string>) }; + env.GIT_TERMINAL_PROMPT = "0"; + + const token = resolveGithubToken(options); + if (token) { + env.GIT_ASKPASS = ensureAskpassScript(); + // Some tooling expects these vars; keep them aligned. + env.GITHUB_TOKEN = token; + env.GH_TOKEN = token; + } + + return env; +} + +async function configureGithubAuth(repoPath: string, options?: GitAuthOptions): Promise<void> { + const token = resolveGithubToken(options); + if (!token) { + return; + } + + const authHeader = Buffer.from(`x-access-token:${token}`, "utf8").toString("base64"); + await execFileAsync("git", ["-C", repoPath, "config", "--local", "credential.helper", ""], { + env: gitEnv(options), + }); + await execFileAsync("git", ["-C", repoPath, "config", "--local", "http.https://github.com/.extraheader", `AUTHORIZATION: basic ${authHeader}`], { + env: gitEnv(options), + }); +} + +export interface BranchSnapshot { + branchName: string; + commitSha: string; +} + +export async function fetch(repoPath: string, options?: GitAuthOptions): Promise<void> { + await execFileAsync("git", ["-C", repoPath, "fetch", "--prune"], { + timeout: DEFAULT_GIT_FETCH_TIMEOUT_MS, + env: gitEnv(options), + }); +} + +export async function revParse(repoPath: string, ref: string): Promise<string> { + const { stdout } = await execFileAsync("git", ["-C", repoPath, "rev-parse", ref], { env: gitEnv() }); + return stdout.trim(); +} + +export async function validateRemote(remoteUrl: string, options?: GitAuthOptions): Promise<void> { + const remote = remoteUrl.trim(); + if (!remote) { + throw new Error("remoteUrl is
required"); + } + try { + await execFileAsync("git", ["ls-remote", "--exit-code", remote, "HEAD"], { + // This command does not need repo context. Running from a neutral directory + // avoids inheriting broken worktree .git indirection inside dev containers. + cwd: tmpdir(), + maxBuffer: 1024 * 1024, + timeout: DEFAULT_GIT_VALIDATE_REMOTE_TIMEOUT_MS, + env: gitEnv(options), + }); + } catch (error) { + const detail = error instanceof Error ? error.message : String(error); + throw new Error(`git remote validation failed: ${detail}`); + } +} + +function isGitRepo(path: string): boolean { + return existsSync(resolve(path, ".git")); +} + +export async function ensureCloned(remoteUrl: string, targetPath: string, options?: GitAuthOptions): Promise<void> { + const remote = remoteUrl.trim(); + if (!remote) { + throw new Error("remoteUrl is required"); + } + + if (existsSync(targetPath)) { + if (!isGitRepo(targetPath)) { + throw new Error(`targetPath exists but is not a git repo: ${targetPath}`); + } + + // Keep origin aligned with the configured remote URL.
+ await execFileAsync("git", ["-C", targetPath, "remote", "set-url", "origin", remote], { + maxBuffer: 1024 * 1024, + timeout: DEFAULT_GIT_FETCH_TIMEOUT_MS, + env: gitEnv(options), + }); + await configureGithubAuth(targetPath, options); + await fetch(targetPath, options); + return; + } + + mkdirSync(dirname(targetPath), { recursive: true }); + await execFileAsync("git", ["clone", remote, targetPath], { + maxBuffer: 1024 * 1024 * 8, + timeout: DEFAULT_GIT_CLONE_TIMEOUT_MS, + env: gitEnv(options), + }); + await configureGithubAuth(targetPath, options); + await fetch(targetPath, options); + await ensureLocalBaseBranch(targetPath); +} + +async function hasLocalBranches(repoPath: string): Promise<boolean> { + try { + const { stdout } = await execFileAsync("git", ["-C", repoPath, "for-each-ref", "--format=%(refname:short)", "refs/heads"], { + env: gitEnv(), + }); + return stdout + .split("\n") + .map((line) => line.trim()) + .some(Boolean); + } catch { + return false; + } +} + +async function ensureLocalBaseBranch(repoPath: string): Promise<void> { + if (await hasLocalBranches(repoPath)) { + return; + } + + const baseRef = await remoteDefaultBaseRef(repoPath); + const localBranch = baseRef.replace(/^origin\//, ""); + + await execFileAsync("git", ["-C", repoPath, "checkout", "-B", localBranch, baseRef], { + maxBuffer: 1024 * 1024, + timeout: DEFAULT_GIT_FETCH_TIMEOUT_MS, + env: gitEnv(), + }); +} + +export async function remoteDefaultBaseRef(repoPath: string): Promise<string> { + try { + const { stdout } = await execFileAsync("git", ["-C", repoPath, "symbolic-ref", "refs/remotes/origin/HEAD"], { env: gitEnv() }); + const ref = stdout.trim(); // refs/remotes/origin/main + const match = ref.match(/^refs\/remotes\/(.+)$/); + if (match?.[1]) { + return match[1]; + } + } catch { + // fall through + } + + const candidates = ["origin/main", "origin/master", "main", "master"]; + for (const ref of candidates) { + try { + await execFileAsync("git", ["-C", repoPath, "rev-parse", "--verify", ref], { env:
gitEnv() }); + return ref; + } catch { + continue; + } + } + return "origin/main"; +} + +export async function listRemoteBranches(repoPath: string, options?: GitAuthOptions): Promise<BranchSnapshot[]> { + await fetch(repoPath, options); + const { stdout } = await execFileAsync("git", ["-C", repoPath, "for-each-ref", "--format=%(refname:short) %(objectname)", "refs/remotes/origin"], { + maxBuffer: 1024 * 1024, + env: gitEnv(options), + }); + + return stdout + .trim() + .split("\n") + .filter((line) => line.trim().length > 0) + .map((line) => { + const [refName, commitSha] = line.trim().split(/\s+/, 2); + const short = (refName ?? "").trim(); + const branchName = short.replace(/^origin\//, ""); + return { branchName, commitSha: commitSha ?? "" }; + }) + .filter((row) => row.branchName.length > 0 && row.branchName !== "HEAD" && row.branchName !== "origin" && row.commitSha.length > 0); +} + +async function remoteBranchExists(repoPath: string, branchName: string): Promise<boolean> { + try { + await execFileAsync("git", ["-C", repoPath, "show-ref", "--verify", `refs/remotes/origin/${branchName}`], { env: gitEnv() }); + return true; + } catch { + return false; + } +} + +export async function ensureRemoteBranch(repoPath: string, branchName: string, options?: GitAuthOptions): Promise<void> { + await fetch(repoPath, options); + await ensureLocalBaseBranch(repoPath); + if (await remoteBranchExists(repoPath, branchName)) { + return; + } + + const baseRef = await remoteDefaultBaseRef(repoPath); + await execFileAsync("git", ["-C", repoPath, "push", "origin", `${baseRef}:refs/heads/${branchName}`], { + maxBuffer: 1024 * 1024 * 2, + env: gitEnv(options), + }); + await fetch(repoPath, options); +} + +export async function diffStatForBranch(repoPath: string, branchName: string): Promise<string> { + try { + const baseRef = await remoteDefaultBaseRef(repoPath); + const headRef = `origin/${branchName}`; + const { stdout } = await execFileAsync("git", ["-C", repoPath, "diff", "--shortstat", `${baseRef}...${headRef}`], {
maxBuffer: 1024 * 1024, + env: gitEnv(), + }); + const trimmed = stdout.trim(); + if (!trimmed) { + return "+0/-0"; + } + const insertMatch = trimmed.match(/(\d+)\s+insertion/); + const deleteMatch = trimmed.match(/(\d+)\s+deletion/); + const insertions = insertMatch ? insertMatch[1] : "0"; + const deletions = deleteMatch ? deleteMatch[1] : "0"; + return `+${insertions}/-${deletions}`; + } catch { + return "+0/-0"; + } +} + +export async function conflictsWithMain(repoPath: string, branchName: string): Promise<boolean> { + try { + const baseRef = await remoteDefaultBaseRef(repoPath); + const headRef = `origin/${branchName}`; + // Use merge-tree (git 2.38+) for a clean conflict check. + try { + await execFileAsync("git", ["-C", repoPath, "merge-tree", "--write-tree", "--no-messages", baseRef, headRef], { env: gitEnv() }); + // If merge-tree exits 0, no conflicts. Non-zero exit means conflicts. + return false; + } catch { + // merge-tree exits non-zero when there are conflicts + return true; + } + } catch { + return false; + } +} + +export async function getOriginOwner(repoPath: string): Promise<string> { + try { + const { stdout } = await execFileAsync("git", ["-C", repoPath, "remote", "get-url", "origin"], { env: gitEnv() }); + const url = stdout.trim(); + // Handle SSH: git@github.com:owner/repo.git + const sshMatch = url.match(/[:\/]([^\/]+)\/[^\/]+(?:\.git)?$/); + if (sshMatch) { + return sshMatch[1] ?? ""; + } + // Handle HTTPS: https://github.com/owner/repo.git + const httpsMatch = url.match(/\/\/[^\/]+\/([^\/]+)\//); + if (httpsMatch) { + return httpsMatch[1] ??
""; + } + return ""; + } catch { + return ""; + } +} diff --git a/foundry/packages/backend/src/integrations/github/index.ts b/foundry/packages/backend/src/integrations/github/index.ts index 87fc996..536c9db 100644 --- a/foundry/packages/backend/src/integrations/github/index.ts +++ b/foundry/packages/backend/src/integrations/github/index.ts @@ -1,80 +1,262 @@ +import { execFile } from "node:child_process"; +import { promisify } from "node:util"; + +const execFileAsync = promisify(execFile); + interface GithubAuthOptions { githubToken?: string | null; - baseBranch?: string | null; } -function authHeaders(options?: GithubAuthOptions): HeadersInit { +function ghEnv(options?: GithubAuthOptions): Record<string, string> { + const env: Record<string, string> = { ...(process.env as Record<string, string>) }; const token = options?.githubToken?.trim(); - if (!token) { - throw new Error("GitHub token is required for this operation"); + if (token) { + env.GH_TOKEN = token; + env.GITHUB_TOKEN = token; } + return env; +} + +export interface PullRequestSnapshot { + number: number; + headRefName: string; + state: string; + title: string; + url: string; + author: string; + isDraft: boolean; + ciStatus: string | null; + reviewStatus: string | null; + reviewer: string | null; +} + +interface GhPrListItem { + number: number; + headRefName: string; + state: string; + title: string; + url?: string; + author?: { login?: string }; + isDraft?: boolean; + statusCheckRollup?: Array<{ + state?: string; + status?: string; + conclusion?: string; + __typename?: string; + }>; + reviews?: Array<{ + state?: string; + author?: { login?: string }; + }>; +} + +function parseCiStatus(checks: GhPrListItem["statusCheckRollup"]): string | null { + if (!checks || checks.length === 0) return null; + + let total = 0; + let successes = 0; + let hasRunning = false; + + for (const check of checks) { + total++; + const conclusion = check.conclusion?.toUpperCase(); + const state = check.state?.toUpperCase(); + const status = check.status?.toUpperCase(); + + if
(conclusion === "SUCCESS" || state === "SUCCESS") { + successes++; + } else if (status === "IN_PROGRESS" || status === "QUEUED" || status === "PENDING" || state === "PENDING") { + hasRunning = true; + } + } + + if (hasRunning && successes < total) { + return "running"; + } + + return `${successes}/${total}`; +} + +function parseReviewStatus(reviews: GhPrListItem["reviews"]): { status: string | null; reviewer: string | null } { + if (!reviews || reviews.length === 0) { + return { status: null, reviewer: null }; + } + + // Build a map of latest review per author + const latestByAuthor = new Map<string, { state: string; login: string }>(); + for (const review of reviews) { + const login = review.author?.login ?? "unknown"; + const state = review.state?.toUpperCase() ?? ""; + if (state === "COMMENTED") continue; // Skip comments, only track actionable reviews + latestByAuthor.set(login, { state, login }); + } + + // Check for CHANGES_REQUESTED first (takes priority), then APPROVED + for (const [, entry] of latestByAuthor) { + if (entry.state === "CHANGES_REQUESTED") { + return { status: "CHANGES_REQUESTED", reviewer: entry.login }; + } + } + + for (const [, entry] of latestByAuthor) { + if (entry.state === "APPROVED") { + return { status: "APPROVED", reviewer: entry.login }; + } + } + + // If there are reviews but none are APPROVED or CHANGES_REQUESTED + if (latestByAuthor.size > 0) { + const first = latestByAuthor.values().next().value; + return { status: "PENDING", reviewer: first?.login ?? null }; + } + + return { status: null, reviewer: null }; +} + +function snapshotFromGhItem(item: GhPrListItem): PullRequestSnapshot { + const { status: reviewStatus, reviewer } = parseReviewStatus(item.reviews); return { - Accept: "application/vnd.github+json", - Authorization: `Bearer ${token}`, - "X-GitHub-Api-Version": "2022-11-28", + number: item.number, + headRefName: item.headRefName, + state: item.state, + title: item.title, + url: item.url ?? "", + author: item.author?.login ?? "", + isDraft: item.isDraft ??
false, + ciStatus: parseCiStatus(item.statusCheckRollup), + reviewStatus, + reviewer, }; } -async function githubRequest(path: string, init: RequestInit, options?: GithubAuthOptions): Promise<Response> { - return await fetch(`https://api.github.com${path}`, { - ...init, - headers: { - ...authHeaders(options), - ...(init.headers ?? {}), - }, - }); +const PR_JSON_FIELDS = "number,headRefName,state,title,url,author,isDraft,statusCheckRollup,reviews"; + +export async function listPullRequests(repoPath: string, options?: GithubAuthOptions): Promise<PullRequestSnapshot[]> { + try { + const { stdout } = await execFileAsync("gh", ["pr", "list", "--json", PR_JSON_FIELDS, "--limit", "200"], { + maxBuffer: 1024 * 1024 * 4, + cwd: repoPath, + env: ghEnv(options), + }); + + const parsed = JSON.parse(stdout) as GhPrListItem[]; + + return parsed.map((item) => { + // Handle fork PRs where headRefName may contain "owner:branch" + const headRefName = item.headRefName.includes(":") ? (item.headRefName.split(":").pop() ?? item.headRefName) : item.headRefName; + + return snapshotFromGhItem({ ...item, headRefName }); + }); + } catch { + return []; + } +} + +export async function getPrInfo(repoPath: string, branchName: string, options?: GithubAuthOptions): Promise<PullRequestSnapshot | null> { + try { + const { stdout } = await execFileAsync("gh", ["pr", "view", branchName, "--json", PR_JSON_FIELDS], { + maxBuffer: 1024 * 1024 * 4, + cwd: repoPath, + env: ghEnv(options), + }); + + const item = JSON.parse(stdout) as GhPrListItem; + return snapshotFromGhItem(item); + } catch { + return null; + } } export async function createPr( - repoFullName: string, + repoPath: string, headBranch: string, title: string, body?: string, options?: GithubAuthOptions, ): Promise<{ number: number; url: string }> { - const baseBranch = options?.baseBranch?.trim() || "main"; - const response = await githubRequest( - `/repos/${repoFullName}/pulls`, - { - method: "POST", - headers: { - "Content-Type": "application/json", - }, - body: JSON.stringify({ - title, - head:
headBranch, - base: baseBranch, - body: body ?? "", - }), - }, - options, - ); - - const payload = (await response.json()) as { number?: number; html_url?: string; message?: string }; - if (!response.ok || !payload.number || !payload.html_url) { - throw new Error(payload.message ?? `Failed to create pull request for ${repoFullName}`); + const args = ["pr", "create", "--title", title, "--head", headBranch]; + if (body) { + args.push("--body", body); + } else { + args.push("--body", ""); } - return { - number: payload.number, - url: payload.html_url, - }; + const { stdout } = await execFileAsync("gh", args, { + maxBuffer: 1024 * 1024, + cwd: repoPath, + env: ghEnv(options), + }); + + // gh pr create outputs the PR URL on success + const url = stdout.trim(); + // Extract PR number from URL: https://github.com/owner/repo/pull/123 + const numberMatch = url.match(/\/pull\/(\d+)/); + const number = numberMatch ? parseInt(numberMatch[1]!, 10) : 0; + + return { number, url }; } export async function starRepository(repoFullName: string, options?: GithubAuthOptions): Promise<void> { - const response = await githubRequest( - `/user/starred/${repoFullName}`, - { - method: "PUT", - headers: { - "Content-Length": "0", - }, - }, - options, - ); - - if (!response.ok) { - const payload = (await response.json().catch(() => null)) as { message?: string } | null; - throw new Error(payload?.message ?? `Failed to star GitHub repository ${repoFullName}`); + try { + await execFileAsync("gh", ["api", "--method", "PUT", `user/starred/${repoFullName}`], { + maxBuffer: 1024 * 1024, + env: ghEnv(options), + }); + } catch (error) { + const message = + error instanceof Error ? error.message : `Failed to star GitHub repository ${repoFullName}.
Ensure GitHub auth is configured for the backend.`;
+    throw new Error(message);
+  }
+}
+
+export async function getAllowedMergeMethod(repoPath: string, options?: GithubAuthOptions): Promise<"squash" | "rebase" | "merge"> {
+  try {
+    // Get the repo owner/name from gh
+    const { stdout: repoJson } = await execFileAsync("gh", ["repo", "view", "--json", "owner,name"], { cwd: repoPath, env: ghEnv(options) });
+    const repo = JSON.parse(repoJson) as { owner: { login: string }; name: string };
+    const repoFullName = `${repo.owner.login}/${repo.name}`;
+
+    const { stdout } = await execFileAsync("gh", ["api", `repos/${repoFullName}`, "--jq", ".allow_squash_merge, .allow_rebase_merge, .allow_merge_commit"], {
+      maxBuffer: 1024 * 1024,
+      cwd: repoPath,
+      env: ghEnv(options),
+    });
+
+    const lines = stdout.trim().split("\n");
+    const allowSquash = lines[0]?.trim() === "true";
+    const allowRebase = lines[1]?.trim() === "true";
+    const allowMerge = lines[2]?.trim() === "true";
+
+    if (allowSquash) return "squash";
+    if (allowRebase) return "rebase";
+    if (allowMerge) return "merge";
+    return "squash";
+  } catch {
+    return "squash";
+  }
+}
+
+export async function mergePr(repoPath: string, prNumber: number, options?: GithubAuthOptions): Promise<void> {
+  const method = await getAllowedMergeMethod(repoPath, options);
+  await execFileAsync("gh", ["pr", "merge", String(prNumber), `--${method}`, "--delete-branch"], { cwd: repoPath, env: ghEnv(options) });
+}
+
+export async function isPrMerged(repoPath: string, branchName: string, options?: GithubAuthOptions): Promise<boolean> {
+  try {
+    const { stdout } = await execFileAsync("gh", ["pr", "view", branchName, "--json", "state"], { cwd: repoPath, env: ghEnv(options) });
+    const parsed = JSON.parse(stdout) as { state: string };
+    return parsed.state.toUpperCase() === "MERGED";
+  } catch {
+    return false;
+  }
+}
+
+export async function getPrTitle(repoPath: string, branchName: string): Promise<string | null> {
+  try {
+    const { stdout } = await execFileAsync("gh", ["pr", "view", branchName, "--json", "title"], { cwd: repoPath });
+    const parsed = JSON.parse(stdout) as { title: string };
+    return parsed.title;
+  } catch {
+    return null;
+  }
+}
diff --git a/foundry/packages/backend/src/integrations/graphite/index.ts b/foundry/packages/backend/src/integrations/graphite/index.ts
new file mode 100644
index 0000000..4c708b0
--- /dev/null
+++ b/foundry/packages/backend/src/integrations/graphite/index.ts
@@ -0,0 +1,140 @@
+import { execFile } from "node:child_process";
+import { promisify } from "node:util";
+
+const execFileAsync = promisify(execFile);
+
+export async function graphiteAvailable(repoPath: string): Promise<boolean> {
+  try {
+    await execFileAsync("gt", ["trunk"], { cwd: repoPath });
+    return true;
+  } catch {
+    return false;
+  }
+}
+
+export async function graphiteGet(repoPath: string, branchName: string): Promise<boolean> {
+  try {
+    await execFileAsync("gt", ["get", branchName], { cwd: repoPath });
+    return true;
+  } catch {
+    return false;
+  }
+}
+
+export async function graphiteCreateBranch(repoPath: string, branchName: string): Promise<void> {
+  await execFileAsync("gt", ["create", branchName], { cwd: repoPath });
+}
+
+export async function graphiteCheckout(repoPath: string, branchName: string): Promise<void> {
+  await execFileAsync("gt", ["checkout", branchName], { cwd: repoPath });
+}
+
+export async function graphiteSubmit(repoPath: string): Promise<void> {
+  await execFileAsync("gt", ["submit", "--no-edit"], { cwd: repoPath });
+}
+
+export async function graphiteMergeBranch(repoPath: string, branchName: string): Promise<void> {
+  await execFileAsync("gt", ["merge", branchName], { cwd: repoPath });
+}
+
+export async function graphiteAbandon(repoPath: string, branchName: string): Promise<void> {
+  await execFileAsync("gt", ["abandon", branchName], { cwd: repoPath });
+}
+
+export interface GraphiteStackEntry {
+  branchName: string;
+  parentBranch: string | null;
+}
+
+export async function graphiteGetStack(repoPath: string): Promise<GraphiteStackEntry[]> {
+  try {
+    // Try JSON
output first + const { stdout } = await execFileAsync("gt", ["log", "--json"], { + cwd: repoPath, + maxBuffer: 1024 * 1024, + }); + + const parsed = JSON.parse(stdout) as Array<{ + branch?: string; + name?: string; + parent?: string; + parentBranch?: string; + }>; + + return parsed.map((entry) => ({ + branchName: entry.branch ?? entry.name ?? "", + parentBranch: entry.parent ?? entry.parentBranch ?? null, + })); + } catch { + // Fall back to text parsing of `gt log` + try { + const { stdout } = await execFileAsync("gt", ["log"], { + cwd: repoPath, + maxBuffer: 1024 * 1024, + }); + + const entries: GraphiteStackEntry[] = []; + const lines = stdout.split("\n").filter((l) => l.trim().length > 0); + + // Parse indented tree output: each line has tree chars (|, /, \, -, etc.) + // followed by branch names. Build parent-child from indentation level. + const branchStack: string[] = []; + + for (const line of lines) { + // Strip ANSI color codes + const clean = line.replace(/\x1b\[[0-9;]*m/g, ""); + // Extract branch name: skip tree characters and whitespace + const branchMatch = clean.match(/[│├└─|/\\*\s]*(?:◉|○|●)?\s*(.+)/); + if (!branchMatch) continue; + + const branchName = branchMatch[1]!.trim(); + if (!branchName || branchName.startsWith("(") || branchName === "") continue; + + // Determine indentation level by counting leading whitespace/tree chars + const indent = clean.search(/[a-zA-Z0-9]/); + const level = Math.max(0, Math.floor(indent / 2)); + + // Trim stack to current level + while (branchStack.length > level) { + branchStack.pop(); + } + + const parentBranch = branchStack.length > 0 ? (branchStack[branchStack.length - 1] ?? 
null) : null; + + entries.push({ branchName, parentBranch }); + branchStack.push(branchName); + } + + return entries; + } catch { + return []; + } + } +} + +export async function graphiteGetParent(repoPath: string, branchName: string): Promise { + try { + // Try `gt get ` to see parent info + const { stdout } = await execFileAsync("gt", ["get", branchName], { + cwd: repoPath, + maxBuffer: 1024 * 1024, + }); + + // Parse output for parent branch reference + const parentMatch = stdout.match(/parent:\s*(\S+)/i); + if (parentMatch) { + return parentMatch[1] ?? null; + } + } catch { + // Fall through to stack-based lookup + } + + // Fall back to stack info + try { + const stack = await graphiteGetStack(repoPath); + const entry = stack.find((e) => e.branchName === branchName); + return entry?.parentBranch ?? null; + } catch { + return null; + } +} diff --git a/foundry/packages/backend/src/providers/daytona/index.ts b/foundry/packages/backend/src/providers/daytona/index.ts new file mode 100644 index 0000000..8166668 --- /dev/null +++ b/foundry/packages/backend/src/providers/daytona/index.ts @@ -0,0 +1,485 @@ +import type { + AgentEndpoint, + AttachTarget, + AttachTargetRequest, + CreateSandboxRequest, + DestroySandboxRequest, + EnsureAgentRequest, + ExecuteSandboxCommandRequest, + ExecuteSandboxCommandResult, + ProviderCapabilities, + ReleaseSandboxRequest, + ResumeSandboxRequest, + SandboxHandle, + SandboxHealth, + SandboxHealthRequest, + SandboxProvider, +} from "../provider-api/index.js"; +import type { DaytonaDriver } from "../../driver.js"; +import { Image } from "@daytonaio/sdk"; + +export interface DaytonaProviderConfig { + endpoint?: string; + apiKey?: string; + image: string; + target?: string; + /** + * Auto-stop interval in minutes. If omitted, Daytona's default applies. + * Set to `0` to disable auto-stop. 
+   */
+  autoStopInterval?: number;
+}
+
+export class DaytonaProvider implements SandboxProvider {
+  constructor(
+    private readonly config: DaytonaProviderConfig,
+    private readonly daytona?: DaytonaDriver,
+  ) {}
+
+  private static readonly SANDBOX_AGENT_PORT = 2468;
+  private static readonly SANDBOX_AGENT_VERSION = "0.3.0";
+  private static readonly DEFAULT_ACP_REQUEST_TIMEOUT_MS = 120_000;
+  private static readonly AGENT_IDS = ["codex", "claude"] as const;
+  private static readonly PASSTHROUGH_ENV_KEYS = [
+    "ANTHROPIC_API_KEY",
+    "CLAUDE_API_KEY",
+    "OPENAI_API_KEY",
+    "CODEX_API_KEY",
+    "OPENCODE_API_KEY",
+    "CEREBRAS_API_KEY",
+    "GH_TOKEN",
+    "GITHUB_TOKEN",
+  ] as const;
+
+  private getRequestTimeoutMs(): number {
+    const parsed = Number(process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS ?? "120000");
+    if (!Number.isFinite(parsed) || parsed <= 0) {
+      return 120_000;
+    }
+    return Math.floor(parsed);
+  }
+
+  private getAcpRequestTimeoutMs(): number {
+    const parsed = Number(process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS ?? DaytonaProvider.DEFAULT_ACP_REQUEST_TIMEOUT_MS.toString());
+    if (!Number.isFinite(parsed) || parsed <= 0) {
+      return DaytonaProvider.DEFAULT_ACP_REQUEST_TIMEOUT_MS;
+    }
+    return Math.floor(parsed);
+  }
+
+  private async withTimeout<T>(label: string, fn: () => Promise<T>): Promise<T> {
+    const timeoutMs = this.getRequestTimeoutMs();
+    let timer: ReturnType<typeof setTimeout> | null = null;
+
+    try {
+      return await Promise.race([
+        fn(),
+        new Promise<never>((_, reject) => {
+          timer = setTimeout(() => {
+            reject(new Error(`daytona ${label} timed out after ${timeoutMs}ms`));
+          }, timeoutMs);
+        }),
+      ]);
+    } finally {
+      if (timer) {
+        clearTimeout(timer);
+      }
+    }
+  }
+
+  private getClient() {
+    const apiKey = this.config.apiKey?.trim();
+    if (!apiKey) {
+      return undefined;
+    }
+    const endpoint = this.config.endpoint?.trim();
+
+    return this.daytona?.createClient({
+      ...(endpoint ?
{ apiUrl: endpoint } : {}),
+      apiKey,
+      target: this.config.target,
+    });
+  }
+
+  private requireClient() {
+    const client = this.getClient();
+    if (client) {
+      return client;
+    }
+
+    if (!this.daytona) {
+      throw new Error("daytona provider requires backend daytona driver");
+    }
+
+    throw new Error(
+      "daytona provider is not configured: missing apiKey. " +
+        "Set HF_DAYTONA_API_KEY (or DAYTONA_API_KEY). " +
+        "Optionally set HF_DAYTONA_ENDPOINT (or DAYTONA_ENDPOINT).",
+    );
+  }
+
+  private async ensureStarted(sandboxId: string): Promise<void> {
+    const client = this.requireClient();
+
+    const sandbox = await this.withTimeout("get sandbox", () => client.getSandbox(sandboxId));
+    const state = String(sandbox.state ?? "unknown").toLowerCase();
+    if (state === "started" || state === "running") {
+      return;
+    }
+
+    // If the sandbox is stopped (or any non-started state), try starting it.
+    // Daytona preserves the filesystem across stop/start, which is what we rely on for faster git setup.
+    await this.withTimeout("start sandbox", () => client.startSandbox(sandboxId, 60));
+  }
+
+  private buildEnvVars(): Record<string, string> {
+    const envVars: Record<string, string> = {};
+
+    for (const key of DaytonaProvider.PASSTHROUGH_ENV_KEYS) {
+      const value = process.env[key];
+      if (value) {
+        envVars[key] = value;
+      }
+    }
+
+    return envVars;
+  }
+
+  private buildShellExports(extra: Record<string, string> = {}): string[] {
+    const merged = {
+      ...this.buildEnvVars(),
+      ...extra,
+    };
+
+    return Object.entries(merged).map(([key, value]) => {
+      const encoded = Buffer.from(value, "utf8").toString("base64");
+      return `export ${key}="$(printf %s ${JSON.stringify(encoded)} | base64 -d)"`;
+    });
+  }
+
+  private buildSnapshotImage() {
+    // Use Daytona image build + snapshot caching so base tooling (git + sandbox-agent)
+    // is prepared once and reused for subsequent sandboxes.
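The `buildShellExports` helper round-trips every env value through base64 so arbitrary characters (quotes, `$`, newlines) survive shell quoting. A standalone sketch of that pattern — the `shellExport` name is illustrative, not part of the provider:

```typescript
// Mirror of the buildShellExports encoding: base64-encode the value, then emit a
// shell line that decodes it back at startup, so no raw value touches shell syntax.
function shellExport(key: string, value: string): string {
  const encoded = Buffer.from(value, "utf8").toString("base64");
  return `export ${key}="$(printf %s ${JSON.stringify(encoded)} | base64 -d)"`;
}

const line = shellExport("GH_TOKEN", 'abc"$123');
console.log(line);
```

`JSON.stringify(encoded)` double-quotes the payload for the shell; since base64 output is alphanumeric plus `+/=`, no further escaping is needed inside those quotes.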
+    return Image.base(this.config.image).runCommands(
+      "apt-get update && apt-get install -y curl ca-certificates git openssh-client nodejs npm",
+      `curl -fsSL https://releases.rivet.dev/sandbox-agent/${DaytonaProvider.SANDBOX_AGENT_VERSION}/install.sh | sh`,
+      `bash -lc 'export PATH="$HOME/.local/bin:$PATH"; sandbox-agent install-agent codex || true; sandbox-agent install-agent claude || true'`,
+    );
+  }
+
+  private async runCheckedCommand(sandboxId: string, command: string, label: string): Promise<void> {
+    const client = this.requireClient();
+
+    const result = await this.withTimeout(`execute command (${label})`, () => client.executeCommand(sandboxId, command));
+    if (result.exitCode !== 0) {
+      throw new Error(`daytona ${label} failed (${result.exitCode}): ${result.result}`);
+    }
+  }
+
+  id() {
+    return "daytona" as const;
+  }
+
+  capabilities(): ProviderCapabilities {
+    return {
+      remote: true,
+      supportsSessionReuse: true,
+    };
+  }
+
+  async validateConfig(input: unknown): Promise<Record<string, unknown>> {
+    return (input as Record<string, unknown> | undefined) ?? {};
+  }
+
+  async createSandbox(req: CreateSandboxRequest): Promise<SandboxHandle> {
+    const client = this.requireClient();
+    const emitDebug = req.debug ?? (() => {});
+
+    emitDebug("daytona.createSandbox.start", {
+      workspaceId: req.workspaceId,
+      repoId: req.repoId,
+      taskId: req.taskId,
+      branchName: req.branchName,
+    });
+
+    const createStartedAt = Date.now();
+    const sandbox = await this.withTimeout("create sandbox", () =>
+      client.createSandbox({
+        image: this.buildSnapshotImage(),
+        envVars: this.buildEnvVars(),
+        labels: {
+          "foundry.workspace": req.workspaceId,
+          "foundry.task": req.taskId,
+          "foundry.repo_id": req.repoId,
+          "foundry.repo_remote": req.repoRemote,
+          "foundry.branch": req.branchName,
+        },
+        autoStopInterval: this.config.autoStopInterval,
+      }),
+    );
+    emitDebug("daytona.createSandbox.created", {
+      sandboxId: sandbox.id,
+      durationMs: Date.now() - createStartedAt,
+      state: sandbox.state ??
null, + }); + + const repoDir = `/home/daytona/foundry/${req.workspaceId}/${req.repoId}/${req.taskId}/repo`; + + // Prepare a working directory for the agent. This must succeed for the task to work. + const installStartedAt = Date.now(); + await this.runCheckedCommand( + sandbox.id, + [ + "bash", + "-lc", + `'set -euo pipefail; export DEBIAN_FRONTEND=noninteractive; if command -v git >/dev/null 2>&1 && command -v npx >/dev/null 2>&1; then exit 0; fi; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y git openssh-client ca-certificates nodejs npm >/tmp/apt-install.log 2>&1'`, + ].join(" "), + "install git + node toolchain", + ); + emitDebug("daytona.createSandbox.install_toolchain.done", { + sandboxId: sandbox.id, + durationMs: Date.now() - installStartedAt, + }); + + const cloneStartedAt = Date.now(); + await this.runCheckedCommand( + sandbox.id, + [ + "bash", + "-lc", + `${JSON.stringify( + [ + "set -euo pipefail", + "export GIT_TERMINAL_PROMPT=0", + "export GIT_ASKPASS=/bin/echo", + `TOKEN=${JSON.stringify(req.githubToken ?? "")}`, + 'if [ -z "$TOKEN" ]; then TOKEN="${GH_TOKEN:-${GITHUB_TOKEN:-}}"; fi', + "GIT_AUTH_ARGS=()", + `if [ -n "$TOKEN" ] && [[ "${req.repoRemote}" == https://github.com/* ]]; then AUTH_HEADER="$(printf 'x-access-token:%s' "$TOKEN" | base64 | tr -d '\\n')"; GIT_AUTH_ARGS=(-c "http.https://github.com/.extraheader=AUTHORIZATION: basic $AUTH_HEADER"); fi`, + `rm -rf "${repoDir}"`, + `mkdir -p "${repoDir}"`, + `rmdir "${repoDir}"`, + // Foundry test repos can be private, so clone/fetch must use the sandbox's GitHub token when available. 
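The clone script authenticates by injecting an HTTP extra header rather than embedding the token in the remote URL, which keeps it out of `.git/config` remotes and process listings. The header value is `basic base64("x-access-token:<token>")`; a minimal sketch of building it in TypeScript (`githubExtraHeader` is a hypothetical helper name, not part of this diff):

```typescript
// Build the same AUTHORIZATION header the clone script assembles in shell with
// printf | base64, suitable for git's http.https://github.com/.extraheader config.
function githubExtraHeader(token: string): string {
  const basic = Buffer.from(`x-access-token:${token}`, "utf8").toString("base64");
  return `AUTHORIZATION: basic ${basic}`;
}

const header = githubExtraHeader("tok123");
console.log(header);
```

Passed as `git -c "http.https://github.com/.extraheader=<header>" clone …`, this scopes the credential to github.com requests only.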
+ `git "\${GIT_AUTH_ARGS[@]}" clone "${req.repoRemote}" "${repoDir}"`, + `cd "${repoDir}"`, + `if [ -n "$TOKEN" ] && [[ "${req.repoRemote}" == https://github.com/* ]]; then git config --local credential.helper ""; git config --local http.https://github.com/.extraheader "AUTHORIZATION: basic $AUTH_HEADER"; fi`, + `git "\${GIT_AUTH_ARGS[@]}" fetch origin --prune`, + // The task branch may not exist remotely yet (agent push creates it). Base off current branch (default branch). + `if git show-ref --verify --quiet "refs/remotes/origin/${req.branchName}"; then git checkout -B "${req.branchName}" "origin/${req.branchName}"; else git checkout -B "${req.branchName}" "$(git branch --show-current 2>/dev/null || echo main)"; fi`, + `git config user.email "foundry@local" >/dev/null 2>&1 || true`, + `git config user.name "Foundry" >/dev/null 2>&1 || true`, + ].join("; "), + )}`, + ].join(" "), + "clone repo", + ); + emitDebug("daytona.createSandbox.clone_repo.done", { + sandboxId: sandbox.id, + durationMs: Date.now() - cloneStartedAt, + }); + + return { + sandboxId: sandbox.id, + switchTarget: `daytona://${sandbox.id}`, + metadata: { + endpoint: this.config.endpoint ?? null, + image: this.config.image, + snapshot: sandbox.snapshot ?? null, + remote: true, + state: sandbox.state ?? null, + cwd: repoDir, + }, + }; + } + + async resumeSandbox(req: ResumeSandboxRequest): Promise { + const client = this.requireClient(); + + await this.ensureStarted(req.sandboxId); + + // Reconstruct cwd from sandbox labels written at create time. + const info = await this.withTimeout("resume get sandbox", () => client.getSandbox(req.sandboxId)); + const labels = info.labels ?? {}; + const workspaceId = labels["foundry.workspace"] ?? req.workspaceId; + const repoId = labels["foundry.repo_id"] ?? ""; + const taskId = labels["foundry.task"] ?? ""; + const cwd = repoId && taskId ? 
`/home/daytona/foundry/${workspaceId}/${repoId}/${taskId}/repo` : null; + + return { + sandboxId: req.sandboxId, + switchTarget: `daytona://${req.sandboxId}`, + metadata: { + resumed: true, + endpoint: this.config.endpoint ?? null, + ...(cwd ? { cwd } : {}), + }, + }; + } + + async destroySandbox(_req: DestroySandboxRequest): Promise { + const client = this.getClient(); + if (!client) { + return; + } + + try { + await this.withTimeout("delete sandbox", () => client.deleteSandbox(_req.sandboxId)); + } catch (error) { + // Ignore not-found style cleanup failures. + const text = error instanceof Error ? error.message : String(error); + if (text.toLowerCase().includes("not found")) { + return; + } + throw error; + } + } + + async releaseSandbox(req: ReleaseSandboxRequest): Promise { + const client = this.getClient(); + if (!client) { + return; + } + + try { + await this.withTimeout("stop sandbox", () => client.stopSandbox(req.sandboxId, 60)); + } catch (error) { + const text = error instanceof Error ? 
error.message : String(error); + if (text.toLowerCase().includes("not found")) { + return; + } + throw error; + } + } + + async ensureSandboxAgent(req: EnsureAgentRequest): Promise { + const client = this.requireClient(); + const acpRequestTimeoutMs = this.getAcpRequestTimeoutMs(); + const sandboxAgentExports = this.buildShellExports({ + SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS: acpRequestTimeoutMs.toString(), + }); + + await this.ensureStarted(req.sandboxId); + + await this.runCheckedCommand( + req.sandboxId, + [ + "bash", + "-lc", + `'set -euo pipefail; if command -v curl >/dev/null 2>&1; then exit 0; fi; export DEBIAN_FRONTEND=noninteractive; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y curl ca-certificates >/tmp/apt-install.log 2>&1'`, + ].join(" "), + "install curl", + ); + + await this.runCheckedCommand( + req.sandboxId, + [ + "bash", + "-lc", + `'set -euo pipefail; if command -v npx >/dev/null 2>&1; then exit 0; fi; export DEBIAN_FRONTEND=noninteractive; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y nodejs npm >/tmp/apt-install.log 2>&1'`, + ].join(" "), + "install node toolchain", + ); + + await this.runCheckedCommand( + req.sandboxId, + [ + "bash", + "-lc", + `'set -euo pipefail; export PATH="$HOME/.local/bin:$PATH"; if sandbox-agent --version 2>/dev/null | grep -q "${DaytonaProvider.SANDBOX_AGENT_VERSION}"; then exit 0; fi; curl -fsSL https://releases.rivet.dev/sandbox-agent/${DaytonaProvider.SANDBOX_AGENT_VERSION}/install.sh | sh'`, + ].join(" "), + "install sandbox-agent", + ); + + for (const agentId of DaytonaProvider.AGENT_IDS) { + try { + await this.runCheckedCommand( + req.sandboxId, + ["bash", "-lc", `'export PATH="$HOME/.local/bin:$PATH"; sandbox-agent install-agent ${agentId}'`].join(" "), + `install agent ${agentId}`, + ); + } catch { + // Some sandbox-agent builds may not ship every agent plugin; treat this as best-effort. 
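Agent installs are best-effort, but the subsequent server start is verified with a bounded poll against `/v1/health` before the provider returns an endpoint. A sketch of that retry shape in TypeScript — names and defaults are illustrative; the provider itself implements it as a shell `for` loop around `curl`:

```typescript
// Poll a health probe up to `attempts` times, sleeping `delayMs` between tries,
// and fail loudly if it never reports healthy (mirrors the 45 x 1s curl loop).
async function waitForHealthy(
  probe: () => Promise<boolean>,
  attempts = 45,
  delayMs = 1000,
): Promise<void> {
  for (let i = 0; i < attempts; i++) {
    if (await probe()) {
      return;
    }
    await new Promise<void>((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error("sandbox-agent failed to become healthy");
}

// Demo: a probe that becomes healthy on its third check.
let tries = 0;
waitForHealthy(async () => ++tries >= 3, 10, 1).then(() => {
  console.log(`healthy after ${tries} checks`);
});
```

Bounding both the attempt count and the per-try delay keeps a broken sandbox from hanging provisioning indefinitely.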
+ } + } + + await this.runCheckedCommand( + req.sandboxId, + [ + "bash", + "-lc", + JSON.stringify( + [ + "set -euo pipefail", + 'export PATH="$HOME/.local/bin:$PATH"', + ...sandboxAgentExports, + "command -v sandbox-agent >/dev/null 2>&1", + "if pgrep -x sandbox-agent >/dev/null; then exit 0; fi", + 'rm -f "$HOME/.codex/auth.json" "$HOME/.config/codex/auth.json"', + `nohup sandbox-agent server --no-token --host 0.0.0.0 --port ${DaytonaProvider.SANDBOX_AGENT_PORT} >/tmp/sandbox-agent.log 2>&1 &`, + ].join("; "), + ), + ].join(" "), + "start sandbox-agent", + ); + + await this.runCheckedCommand( + req.sandboxId, + [ + "bash", + "-lc", + `'for i in $(seq 1 45); do curl -fsS "http://127.0.0.1:${DaytonaProvider.SANDBOX_AGENT_PORT}/v1/health" >/dev/null && exit 0; sleep 1; done; echo "sandbox-agent failed to become healthy" >&2; tail -n 80 /tmp/sandbox-agent.log >&2; exit 1'`, + ].join(" "), + "wait for sandbox-agent health", + ); + + const preview = await this.withTimeout("get preview endpoint", () => client.getPreviewEndpoint(req.sandboxId, DaytonaProvider.SANDBOX_AGENT_PORT)); + + return { + endpoint: preview.url, + token: preview.token, + }; + } + + async health(req: SandboxHealthRequest): Promise { + const client = this.getClient(); + if (!client) { + return { + status: "degraded", + message: "daytona driver not configured", + }; + } + + try { + const sandbox = await this.withTimeout("health get sandbox", () => client.getSandbox(req.sandboxId)); + const state = String(sandbox.state ?? "unknown"); + if (state.toLowerCase().includes("error")) { + return { + status: "down", + message: `daytona sandbox in error state: ${state}`, + }; + } + return { + status: "healthy", + message: `daytona sandbox state: ${state}`, + }; + } catch (error) { + const text = error instanceof Error ? 
error.message : String(error); + return { + status: "down", + message: `daytona sandbox health check failed: ${text}`, + }; + } + } + + async attachTarget(req: AttachTargetRequest): Promise { + return { + target: `daytona://${req.sandboxId}`, + }; + } + + async executeCommand(req: ExecuteSandboxCommandRequest): Promise { + const client = this.requireClient(); + await this.ensureStarted(req.sandboxId); + return await this.withTimeout(`execute command (${req.label ?? "command"})`, () => client.executeCommand(req.sandboxId, req.command)); + } +} diff --git a/foundry/packages/backend/src/providers/index.ts b/foundry/packages/backend/src/providers/index.ts new file mode 100644 index 0000000..1f3af94 --- /dev/null +++ b/foundry/packages/backend/src/providers/index.ts @@ -0,0 +1,77 @@ +import type { ProviderId } from "@sandbox-agent/foundry-shared"; +import type { AppConfig } from "@sandbox-agent/foundry-shared"; +import type { BackendDriver } from "../driver.js"; +import { DaytonaProvider } from "./daytona/index.js"; +import { LocalProvider } from "./local/index.js"; +import type { SandboxProvider } from "./provider-api/index.js"; + +export interface ProviderRegistry { + get(providerId: ProviderId): SandboxProvider; + availableProviderIds(): ProviderId[]; + defaultProviderId(): ProviderId; +} + +export function createProviderRegistry(config: AppConfig, driver?: BackendDriver): ProviderRegistry { + const gitDriver = driver?.git ?? 
{ + validateRemote: async () => { + throw new Error("local provider requires backend git driver"); + }, + ensureCloned: async () => { + throw new Error("local provider requires backend git driver"); + }, + fetch: async () => { + throw new Error("local provider requires backend git driver"); + }, + listRemoteBranches: async () => { + throw new Error("local provider requires backend git driver"); + }, + remoteDefaultBaseRef: async () => { + throw new Error("local provider requires backend git driver"); + }, + revParse: async () => { + throw new Error("local provider requires backend git driver"); + }, + ensureRemoteBranch: async () => { + throw new Error("local provider requires backend git driver"); + }, + diffStatForBranch: async () => { + throw new Error("local provider requires backend git driver"); + }, + conflictsWithMain: async () => { + throw new Error("local provider requires backend git driver"); + }, + }; + + const local = new LocalProvider( + { + rootDir: config.providers.local.rootDir, + sandboxAgentPort: config.providers.local.sandboxAgentPort, + }, + gitDriver, + ); + const daytona = new DaytonaProvider( + { + endpoint: config.providers.daytona.endpoint, + apiKey: config.providers.daytona.apiKey, + image: config.providers.daytona.image, + }, + driver?.daytona, + ); + + const map: Record = { + local, + daytona, + }; + + return { + get(providerId: ProviderId): SandboxProvider { + return map[providerId]; + }, + availableProviderIds(): ProviderId[] { + return Object.keys(map) as ProviderId[]; + }, + defaultProviderId(): ProviderId { + return config.providers.daytona.apiKey ? 
"daytona" : "local"; + }, + }; +} diff --git a/foundry/packages/backend/src/providers/local/index.ts b/foundry/packages/backend/src/providers/local/index.ts new file mode 100644 index 0000000..f18313a --- /dev/null +++ b/foundry/packages/backend/src/providers/local/index.ts @@ -0,0 +1,235 @@ +import { randomUUID } from "node:crypto"; +import { execFile } from "node:child_process"; +import { existsSync, mkdirSync, rmSync } from "node:fs"; +import { homedir } from "node:os"; +import { dirname, resolve } from "node:path"; +import { promisify } from "node:util"; +import { InMemorySessionPersistDriver, SandboxAgent } from "sandbox-agent"; +import type { + AgentEndpoint, + AttachTarget, + AttachTargetRequest, + CreateSandboxRequest, + DestroySandboxRequest, + EnsureAgentRequest, + ExecuteSandboxCommandRequest, + ExecuteSandboxCommandResult, + ProviderCapabilities, + ReleaseSandboxRequest, + ResumeSandboxRequest, + SandboxHandle, + SandboxHealth, + SandboxHealthRequest, + SandboxProvider, +} from "../provider-api/index.js"; +import type { GitDriver } from "../../driver.js"; + +const execFileAsync = promisify(execFile); +const DEFAULT_SANDBOX_AGENT_PORT = 2468; + +export interface LocalProviderConfig { + rootDir?: string; + sandboxAgentPort?: number; +} + +function expandHome(value: string): string { + if (value === "~") { + return homedir(); + } + if (value.startsWith("~/")) { + return resolve(homedir(), value.slice(2)); + } + return value; +} + +async function branchExists(repoPath: string, branchName: string): Promise { + try { + await execFileAsync("git", ["-C", repoPath, "show-ref", "--verify", `refs/remotes/origin/${branchName}`]); + return true; + } catch { + return false; + } +} + +async function checkoutBranch(repoPath: string, branchName: string, git: GitDriver): Promise { + await git.fetch(repoPath); + const targetRef = (await branchExists(repoPath, branchName)) ? 
`origin/${branchName}` : await git.remoteDefaultBaseRef(repoPath); + await execFileAsync("git", ["-C", repoPath, "checkout", "-B", branchName, targetRef], { + env: process.env as Record, + }); +} + +export class LocalProvider implements SandboxProvider { + private sdkPromise: Promise | null = null; + + constructor( + private readonly config: LocalProviderConfig, + private readonly git: GitDriver, + ) {} + + private rootDir(): string { + return expandHome(this.config.rootDir?.trim() || "~/.local/share/foundry/local-sandboxes"); + } + + private sandboxRoot(workspaceId: string, sandboxId: string): string { + return resolve(this.rootDir(), workspaceId, sandboxId); + } + + private repoDir(workspaceId: string, sandboxId: string): string { + return resolve(this.sandboxRoot(workspaceId, sandboxId), "repo"); + } + + private sandboxHandle(workspaceId: string, sandboxId: string, repoDir: string): SandboxHandle { + return { + sandboxId, + switchTarget: `local://${repoDir}`, + metadata: { + cwd: repoDir, + repoDir, + }, + }; + } + + private async sandboxAgent(): Promise { + if (!this.sdkPromise) { + const sandboxAgentHome = resolve(this.rootDir(), ".sandbox-agent-home"); + mkdirSync(sandboxAgentHome, { recursive: true }); + const spawnHome = process.env.HOME?.trim() || sandboxAgentHome; + this.sdkPromise = SandboxAgent.start({ + persist: new InMemorySessionPersistDriver(), + spawn: { + enabled: true, + host: "127.0.0.1", + port: this.config.sandboxAgentPort ?? DEFAULT_SANDBOX_AGENT_PORT, + log: "silent", + env: { + HOME: spawnHome, + ...(process.env.ANTHROPIC_API_KEY ? { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY } : {}), + ...(process.env.CLAUDE_API_KEY ? { CLAUDE_API_KEY: process.env.CLAUDE_API_KEY } : {}), + ...(process.env.OPENAI_API_KEY ? { OPENAI_API_KEY: process.env.OPENAI_API_KEY } : {}), + ...(process.env.CODEX_API_KEY ? { CODEX_API_KEY: process.env.CODEX_API_KEY } : {}), + ...(process.env.GH_TOKEN ? 
{ GH_TOKEN: process.env.GH_TOKEN } : {}),
+          ...(process.env.GITHUB_TOKEN ? { GITHUB_TOKEN: process.env.GITHUB_TOKEN } : {}),
+        },
+      },
+    }).then(async (sdk) => {
+      for (const agentName of ["claude", "codex"] as const) {
+        try {
+          const agent = await sdk.getAgent(agentName, { config: true });
+          if (!agent.installed) {
+            await sdk.installAgent(agentName);
+          }
+        } catch {
+          // The local provider can still function if the agent is already available
+          // through the user's PATH or the install check is unsupported.
+        }
+      }
+      return sdk;
+    });
+    }
+    return this.sdkPromise;
+  }
+
+  id() {
+    return "local" as const;
+  }
+
+  capabilities(): ProviderCapabilities {
+    return {
+      remote: false,
+      supportsSessionReuse: true,
+    };
+  }
+
+  async validateConfig(input: unknown): Promise<Record<string, unknown>> {
+    return (input as Record<string, unknown> | undefined) ?? {};
+  }
+
+  async createSandbox(req: CreateSandboxRequest): Promise<SandboxHandle> {
+    const sandboxId = req.taskId || `local-${randomUUID()}`;
+    const repoDir = this.repoDir(req.workspaceId, sandboxId);
+    mkdirSync(dirname(repoDir), { recursive: true });
+    await this.git.ensureCloned(req.repoRemote, repoDir, { githubToken: req.githubToken });
+    await checkoutBranch(repoDir, req.branchName, this.git);
+    return this.sandboxHandle(req.workspaceId, sandboxId, repoDir);
+  }
+
+  async resumeSandbox(req: ResumeSandboxRequest): Promise<SandboxHandle> {
+    const repoDir = this.repoDir(req.workspaceId, req.sandboxId);
+    if (!existsSync(repoDir)) {
+      throw new Error(`local sandbox repo is missing: ${repoDir}`);
+    }
+    return this.sandboxHandle(req.workspaceId, req.sandboxId, repoDir);
+  }
+
+  async destroySandbox(req: DestroySandboxRequest): Promise<void> {
+    rmSync(this.sandboxRoot(req.workspaceId, req.sandboxId), {
+      force: true,
+      recursive: true,
+    });
+  }
+
+  async releaseSandbox(_req: ReleaseSandboxRequest): Promise<void> {
+    // Local sandboxes stay warm on disk to preserve session state and repo context.
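The conditional spreads above (`...(process.env.GH_TOKEN ? { GH_TOKEN: … } : {})`) implement the same env allowlisting as the Daytona provider's `PASSTHROUGH_ENV_KEYS`/`buildEnvVars`: only named keys with truthy values pass through to the sandbox. A small sketch of that shape — `pickEnv` is a hypothetical helper, not in this diff:

```typescript
// Copy only allowlisted, non-empty env entries; everything else is dropped so
// unrelated host environment never leaks into the spawned agent process.
function pickEnv(
  keys: readonly string[],
  env: Record<string, string | undefined>,
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const key of keys) {
    const value = env[key];
    if (value) {
      out[key] = value;
    }
  }
  return out;
}

const picked = pickEnv(["GH_TOKEN", "MISSING_KEY"], { GH_TOKEN: "tok", OTHER: "x" });
console.log(picked);
```

A single helper like this would also collapse the repeated ternary spreads into one call against the allowlist.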
+	}
+
+	async ensureSandboxAgent(_req: EnsureAgentRequest): Promise<AgentEndpoint> {
+		const sdk = await this.sandboxAgent();
+		const { baseUrl, token } = sdk as unknown as {
+			baseUrl?: string;
+			token?: string;
+		};
+		if (!baseUrl) {
+			throw new Error("sandbox-agent baseUrl is unavailable");
+		}
+		return token ? { endpoint: baseUrl, token } : { endpoint: baseUrl };
+	}
+
+	async health(req: SandboxHealthRequest): Promise<SandboxHealth> {
+		try {
+			const repoDir = this.repoDir(req.workspaceId, req.sandboxId);
+			if (!existsSync(repoDir)) {
+				return {
+					status: "down",
+					message: "local sandbox repo is missing",
+				};
+			}
+			const sdk = await this.sandboxAgent();
+			const health = await sdk.getHealth();
+			return {
+				status: health.status === "ok" ? "healthy" : "degraded",
+				message: health.status,
+			};
+		} catch (error) {
+			return {
+				status: "down",
+				message: error instanceof Error ? error.message : String(error),
+			};
+		}
+	}
+
+	async attachTarget(req: AttachTargetRequest): Promise<AttachTarget> {
+		return { target: this.repoDir(req.workspaceId, req.sandboxId) };
+	}
+
+	async executeCommand(req: ExecuteSandboxCommandRequest): Promise<ExecuteSandboxCommandResult> {
+		const cwd = this.repoDir(req.workspaceId, req.sandboxId);
+		try {
+			const { stdout, stderr } = await execFileAsync("bash", ["-lc", req.command], {
+				cwd,
+				env: process.env as Record<string, string>,
+				maxBuffer: 1024 * 1024 * 16,
+			});
+			return {
+				exitCode: 0,
+				result: [stdout, stderr].filter(Boolean).join(""),
+			};
+		} catch (error) {
+			const detail = error as { stdout?: string; stderr?: string; code?: number };
+			return {
+				exitCode: typeof detail.code === "number" ? detail.code : 1,
+				result: [detail.stdout, detail.stderr, error instanceof Error ? error.message : String(error)].filter(Boolean).join(""),
+			};
+		}
+	}
+}
diff --git a/foundry/packages/backend/src/providers/provider-api/index.ts b/foundry/packages/backend/src/providers/provider-api/index.ts
new file mode 100644
index 0000000..a15109d
--- /dev/null
+++ b/foundry/packages/backend/src/providers/provider-api/index.ts
@@ -0,0 +1,100 @@
+import type { ProviderId } from "@sandbox-agent/foundry-shared";
+
+export interface ProviderCapabilities {
+	remote: boolean;
+	supportsSessionReuse: boolean;
+}
+
+export interface CreateSandboxRequest {
+	workspaceId: string;
+	repoId: string;
+	repoRemote: string;
+	branchName: string;
+	taskId: string;
+	githubToken?: string | null;
+	debug?: (message: string, context?: Record<string, unknown>) => void;
+	options?: Record<string, unknown>;
+}
+
+export interface ResumeSandboxRequest {
+	workspaceId: string;
+	sandboxId: string;
+	options?: Record<string, unknown>;
+}
+
+export interface DestroySandboxRequest {
+	workspaceId: string;
+	sandboxId: string;
+}
+
+export interface ReleaseSandboxRequest {
+	workspaceId: string;
+	sandboxId: string;
+}
+
+export interface EnsureAgentRequest {
+	workspaceId: string;
+	sandboxId: string;
+}
+
+export interface SandboxHealthRequest {
+	workspaceId: string;
+	sandboxId: string;
+}
+
+export interface AttachTargetRequest {
+	workspaceId: string;
+	sandboxId: string;
+}
+
+export interface ExecuteSandboxCommandRequest {
+	workspaceId: string;
+	sandboxId: string;
+	command: string;
+	label?: string;
+}
+
+export interface SandboxHandle {
+	sandboxId: string;
+	switchTarget: string;
+	metadata: Record<string, unknown>;
+}
+
+export interface AgentEndpoint {
+	endpoint: string;
+	token?: string;
+}
+
+export interface SandboxHealth {
+	status: "healthy" | "degraded" | "down";
+	message: string;
+}
+
+export interface AttachTarget {
+	target: string;
+}
+
+export interface ExecuteSandboxCommandResult {
+	exitCode: number;
+	result: string;
+}
+
+export interface SandboxProvider {
+	id(): ProviderId;
+	capabilities(): ProviderCapabilities;
+	validateConfig(input: unknown): Promise<Record<string, unknown>>;
+
+	createSandbox(req: CreateSandboxRequest): Promise<SandboxHandle>;
+	resumeSandbox(req: ResumeSandboxRequest): Promise<SandboxHandle>;
+	destroySandbox(req: DestroySandboxRequest): Promise<void>;
+	/**
+	 * Release resources for a sandbox without deleting its filesystem/state.
+	 * For remote providers, this typically maps to "stop"/"suspend".
+	 */
+	releaseSandbox(req: ReleaseSandboxRequest): Promise<void>;
+
+	ensureSandboxAgent(req: EnsureAgentRequest): Promise<AgentEndpoint>;
+	health(req: SandboxHealthRequest): Promise<SandboxHealth>;
+	attachTarget(req: AttachTargetRequest): Promise<AttachTarget>;
+	executeCommand(req: ExecuteSandboxCommandRequest): Promise<ExecuteSandboxCommandResult>;
+}
diff --git a/foundry/packages/backend/src/sandbox-config.ts b/foundry/packages/backend/src/sandbox-config.ts
deleted file mode 100644
index 9d85f51..0000000
--- a/foundry/packages/backend/src/sandbox-config.ts
+++ /dev/null
@@ -1,39 +0,0 @@
-import type { AppConfig, SandboxProviderId } from "@sandbox-agent/foundry-shared";
-
-function hasE2BApiKey(config: AppConfig): boolean {
-	return Boolean(config.sandboxProviders.e2b.apiKey?.trim());
-}
-
-function forcedSandboxProviderId(): SandboxProviderId | null {
-	const raw = process.env.FOUNDRY_SANDBOX_PROVIDER?.trim() ?? process.env.HF_SANDBOX_PROVIDER?.trim() ?? null;
-	if (raw === "local" || raw === "e2b") {
-		return raw;
-	}
-	return null;
-}
-
-export function defaultSandboxProviderId(config: AppConfig): SandboxProviderId {
-	const forced = forcedSandboxProviderId();
-	if (forced === "local") {
-		return "local";
-	}
-	if (forced === "e2b") {
-		if (!hasE2BApiKey(config)) {
-			throw new Error("FOUNDRY_SANDBOX_PROVIDER=e2b requires E2B_API_KEY to be configured.");
-		}
-		return "e2b";
-	}
-	return hasE2BApiKey(config) ? "e2b" : "local";
-}
-
-export function availableSandboxProviderIds(config: AppConfig): SandboxProviderId[] {
-	return hasE2BApiKey(config) ?
["e2b", "local"] : ["local"]; -} - -export function resolveSandboxProviderId(config: AppConfig, requested?: SandboxProviderId | null): SandboxProviderId { - if (requested === "e2b" && !hasE2BApiKey(config)) { - throw new Error("E2B provider is not configured. Set E2B_API_KEY before selecting the e2b provider."); - } - - return requested ?? defaultSandboxProviderId(config); -} diff --git a/foundry/packages/backend/src/services/app-github.ts b/foundry/packages/backend/src/services/app-github.ts index 52e5308..1f04fe3 100644 --- a/foundry/packages/backend/src/services/app-github.ts +++ b/foundry/packages/backend/src/services/app-github.ts @@ -38,31 +38,6 @@ export interface GitHubRepositoryRecord { fullName: string; cloneUrl: string; private: boolean; - defaultBranch: string; -} - -export interface GitHubMemberRecord { - id: string; - login: string; - name: string; - email: string | null; - role: string | null; - state: string; -} - -export interface GitHubPullRequestRecord { - repoFullName: string; - cloneUrl: string; - number: number; - title: string; - body: string | null; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin: string | null; - isDraft: boolean; - merged: boolean; } interface GitHubTokenResponse { @@ -83,23 +58,11 @@ const githubOAuthLogger = logger.child({ export interface GitHubWebhookEvent { action?: string; - organization?: { login?: string; id?: number }; installation?: { id: number; account?: { login?: string; type?: string; id?: number } | null }; repositories_added?: Array<{ id: number; full_name: string; private: boolean }>; repositories_removed?: Array<{ id: number; full_name: string }>; repository?: { id: number; full_name: string; clone_url?: string; private?: boolean; owner?: { login?: string } }; - pull_request?: { - number: number; - title?: string; - body?: string | null; - state?: string; - html_url?: string; - draft?: boolean; - merged?: boolean; - user?: { login?: string } | null; - head?: { 
ref?: string }; - base?: { ref?: string }; - }; + pull_request?: { number: number; title?: string; state?: string; head?: { ref?: string }; base?: { ref?: string } }; sender?: { login?: string; id?: number }; [key: string]: unknown; } @@ -342,14 +305,12 @@ export class GitHubAppClient { full_name: string; clone_url: string; private: boolean; - default_branch: string; }>("/user/repos?per_page=100&affiliation=owner,collaborator,organization_member&sort=updated", accessToken); return repositories.map((repository) => ({ fullName: repository.full_name, cloneUrl: repository.clone_url, private: repository.private, - defaultBranch: repository.default_branch, })); } @@ -359,143 +320,15 @@ export class GitHubAppClient { full_name: string; clone_url: string; private: boolean; - default_branch: string; }>("/installation/repositories?per_page=100", accessToken); return repositories.map((repository) => ({ fullName: repository.full_name, cloneUrl: repository.clone_url, private: repository.private, - defaultBranch: repository.default_branch, })); } - async getUserRepository(accessToken: string, fullName: string): Promise { - try { - const repository = await this.requestJson<{ - full_name: string; - clone_url: string; - private: boolean; - default_branch: string; - }>(`/repos/${fullName}`, accessToken); - return { - fullName: repository.full_name, - cloneUrl: repository.clone_url, - private: repository.private, - defaultBranch: repository.default_branch, - }; - } catch (error) { - if (error instanceof GitHubAppError && error.status === 404) { - return null; - } - throw error; - } - } - - async getInstallationRepository(installationId: number, fullName: string): Promise { - const accessToken = await this.createInstallationAccessToken(installationId); - return await this.getUserRepository(accessToken, fullName); - } - - async listOrganizationMembers(accessToken: string, organizationLogin: string): Promise { - const members = await this.paginate<{ - id: number; - login: string; - 
role?: string | null; - }>(`/orgs/${organizationLogin}/members?per_page=100&role=all`, accessToken); - - const detailedMembers = await Promise.all( - members.map(async (member) => { - try { - const detail = await this.requestJson<{ - id: number; - login: string; - name?: string | null; - email?: string | null; - }>(`/users/${member.login}`, accessToken); - return { - id: String(detail.id), - login: detail.login, - name: detail.name?.trim() || detail.login, - email: detail.email ?? null, - role: member.role ?? null, - state: "active", - }; - } catch { - return { - id: String(member.id), - login: member.login, - name: member.login, - email: null, - role: member.role ?? null, - state: "active", - }; - } - }), - ); - - return detailedMembers; - } - - async listInstallationMembers(installationId: number, organizationLogin: string): Promise { - const accessToken = await this.createInstallationAccessToken(installationId); - return await this.listOrganizationMembers(accessToken, organizationLogin); - } - - async listPullRequestsForUserRepositories(accessToken: string, repositories: GitHubRepositoryRecord[]): Promise { - return (await Promise.all(repositories.map((repository) => this.listRepositoryPullRequests(accessToken, repository.fullName, repository.cloneUrl)))).flat(); - } - - async listInstallationPullRequestsForRepositories(installationId: number, repositories: GitHubRepositoryRecord[]): Promise { - const accessToken = await this.createInstallationAccessToken(installationId); - return await this.listPullRequestsForUserRepositories(accessToken, repositories); - } - - async getUserPullRequest(accessToken: string, fullName: string, prNumber: number): Promise { - try { - const pullRequest = await this.requestJson<{ - number: number; - title: string; - body?: string | null; - state: string; - html_url: string; - draft?: boolean; - merged?: boolean; - user?: { login?: string } | null; - head?: { ref?: string } | null; - base?: { ref?: string } | null; - 
}>(`/repos/${fullName}/pulls/${prNumber}`, accessToken); - const repository = await this.getUserRepository(accessToken, fullName); - if (!repository) { - return null; - } - return { - repoFullName: fullName, - cloneUrl: repository.cloneUrl, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? null, - state: pullRequest.state, - url: pullRequest.html_url, - headRefName: pullRequest.head?.ref?.trim() ?? "", - baseRefName: pullRequest.base?.ref?.trim() ?? "", - authorLogin: pullRequest.user?.login?.trim() ?? null, - isDraft: Boolean(pullRequest.draft), - merged: Boolean(pullRequest.merged), - }; - } catch (error) { - if (error instanceof GitHubAppError && error.status === 404) { - return null; - } - throw error; - } - } - - async getInstallationPullRequest(installationId: number, fullName: string, prNumber: number): Promise { - const accessToken = await this.createInstallationAccessToken(installationId); - return await this.getUserPullRequest(accessToken, fullName, prNumber); - } - async buildInstallationUrl(organizationLogin: string, state: string): Promise { if (!this.isAppConfigured()) { throw new GitHubAppError("GitHub App is not configured", 500); @@ -604,36 +437,6 @@ export class GitHubAppClient { return payload as T; } - private async listRepositoryPullRequests(accessToken: string, fullName: string, cloneUrl: string): Promise { - const pullRequests = await this.paginate<{ - number: number; - title: string; - body?: string | null; - state: string; - html_url: string; - draft?: boolean; - merged?: boolean; - user?: { login?: string } | null; - head?: { ref?: string } | null; - base?: { ref?: string } | null; - }>(`/repos/${fullName}/pulls?state=open&per_page=100&sort=updated&direction=desc`, accessToken); - - return pullRequests.map((pullRequest) => ({ - repoFullName: fullName, - cloneUrl, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? 
null, - state: pullRequest.state, - url: pullRequest.html_url, - headRefName: pullRequest.head?.ref?.trim() ?? "", - baseRefName: pullRequest.base?.ref?.trim() ?? "", - authorLogin: pullRequest.user?.login?.trim() ?? null, - isDraft: Boolean(pullRequest.draft), - merged: Boolean(pullRequest.merged), - })); - } - private async paginate(path: string, accessToken: string): Promise { let nextUrl = `${this.apiBaseUrl}${path.startsWith("/") ? path : `/${path}`}`; const items: T[] = []; diff --git a/foundry/packages/backend/src/services/better-auth.ts b/foundry/packages/backend/src/services/better-auth.ts index 23d227f..325ea59 100644 --- a/foundry/packages/backend/src/services/better-auth.ts +++ b/foundry/packages/backend/src/services/better-auth.ts @@ -1,7 +1,7 @@ import { betterAuth } from "better-auth"; import { createAdapterFactory } from "better-auth/adapters"; -import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/constants.js"; -import { organizationKey, userKey } from "../actors/keys.js"; +import { APP_SHELL_WORKSPACE_ID } from "../actors/workspace/app-shell.js"; +import { authUserKey, workspaceKey } from "../actors/keys.js"; import { logger } from "../logging.js"; const AUTH_BASE_PATH = "/v1/auth"; @@ -43,7 +43,7 @@ async function callAuthEndpoint(auth: any, url: string, init?: RequestInit): Pro return await auth.handler(new Request(url, init)); } -function resolveRouteUserId(organization: any, resolved: any): string | null { +function resolveRouteUserId(workspace: any, resolved: any): string | null { if (!resolved) { return null; } @@ -75,17 +75,17 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } // getOrCreate is intentional here: the adapter runs during Better Auth callbacks - // which can fire before any explicit create path. The app organization and user + // which can fire before any explicit create path. The app workspace and auth user // actors must exist by the time the adapter needs them. 
- const appOrganization = () => - actorClient.organization.getOrCreate(organizationKey(APP_SHELL_ORGANIZATION_ID), { - createWithInput: APP_SHELL_ORGANIZATION_ID, + const appWorkspace = () => + actorClient.workspace.getOrCreate(workspaceKey(APP_SHELL_WORKSPACE_ID), { + createWithInput: APP_SHELL_WORKSPACE_ID, }); // getOrCreate is intentional: Better Auth creates user records during OAuth - // callbacks, so the user actor must be lazily provisioned on first access. - const getUser = async (userId: string) => - await actorClient.user.getOrCreate(userKey(userId), { + // callbacks, so the auth-user actor must be lazily provisioned on first access. + const getAuthUser = async (userId: string) => + await actorClient.authUser.getOrCreate(authUserKey(userId), { createWithInput: { userId }, }); @@ -109,9 +109,9 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } const email = direct("email"); if (typeof email === "string" && email.length > 0) { - const organization = await appOrganization(); - const resolved = await organization.betterAuthFindEmailIndex({ email: email.toLowerCase() }); - return resolveRouteUserId(organization, resolved); + const workspace = await appWorkspace(); + const resolved = await workspace.authFindEmailIndex({ email: email.toLowerCase() }); + return resolveRouteUserId(workspace, resolved); } return null; } @@ -124,12 +124,12 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const sessionId = direct("id") ?? data?.id; const sessionToken = direct("token") ?? data?.token; if (typeof sessionId === "string" || typeof sessionToken === "string") { - const organization = await appOrganization(); - const resolved = await organization.betterAuthFindSessionIndex({ + const workspace = await appWorkspace(); + const resolved = await workspace.authFindSessionIndex({ ...(typeof sessionId === "string" ? { sessionId } : {}), ...(typeof sessionToken === "string" ? 
{ sessionToken } : {}),
 				});
-				return resolveRouteUserId(organization, resolved);
+				return resolveRouteUserId(workspace, resolved);
 			}
 			return null;
 		}
@@ -142,14 +142,14 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin
 			const accountRecordId = direct("id") ?? data?.id;
 			const providerId = direct("providerId") ?? data?.providerId;
 			const accountId = direct("accountId") ?? data?.accountId;
-			const organization = await appOrganization();
+			const workspace = await appWorkspace();
 			if (typeof accountRecordId === "string" && accountRecordId.length > 0) {
-				const resolved = await organization.betterAuthFindAccountIndex({ id: accountRecordId });
-				return resolveRouteUserId(organization, resolved);
+				const resolved = await workspace.authFindAccountIndex({ id: accountRecordId });
+				return resolveRouteUserId(workspace, resolved);
 			}
 			if (typeof providerId === "string" && providerId.length > 0 && typeof accountId === "string" && accountId.length > 0) {
-				const resolved = await organization.betterAuthFindAccountIndex({ providerId, accountId });
-				return resolveRouteUserId(organization, resolved);
+				const resolved = await workspace.authFindAccountIndex({ providerId, accountId });
+				return resolveRouteUserId(workspace, resolved);
 			}
 			return null;
 		}
@@ -157,6 +157,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin
 		return null;
 	};
 
+	const ensureWorkspaceVerification = async (method: string, payload: Record<string, unknown>) => {
+		const workspace = await appWorkspace();
+		return await workspace[method](payload);
+	};
+
 	return {
 		options: {
 			useDatabaseGeneratedIds: false,
@@ -165,8 +170,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin
 				create: async ({ model, data }) => {
 					const transformed = await transformInput(data, model, "create", true);
 					if (model === "verification") {
-						const organization = await appOrganization();
-						return await organization.betterAuthCreateVerification({ data: transformed });
+
return await ensureWorkspaceVerification("authCreateVerification", { data: transformed }); } const userId = await resolveUserIdForQuery(model, undefined, transformed); @@ -174,19 +178,19 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin throw new Error(`Unable to resolve auth actor for create(${model})`); } - const userActor = await getUser(userId); - const created = await userActor.betterAuthCreateRecord({ model, data: transformed }); - const organization = await appOrganization(); + const userActor = await getAuthUser(userId); + const created = await userActor.createAuthRecord({ model, data: transformed }); + const workspace = await appWorkspace(); if (model === "user" && typeof transformed.email === "string" && transformed.email.length > 0) { - await organization.betterAuthUpsertEmailIndex({ + await workspace.authUpsertEmailIndex({ email: transformed.email.toLowerCase(), userId, }); } if (model === "session") { - await organization.betterAuthUpsertSessionIndex({ + await workspace.authUpsertSessionIndex({ sessionId: String(created.id), sessionToken: String(created.token), userId, @@ -194,7 +198,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "account") { - await organization.betterAuthUpsertAccountIndex({ + await workspace.authUpsertAccountIndex({ id: String(created.id), providerId: String(created.providerId), accountId: String(created.accountId), @@ -208,8 +212,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin findOne: async ({ model, where, join }) => { const transformedWhere = transformWhereClause({ model, where, action: "findOne" }); if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthFindOneVerification({ where: transformedWhere, join }); + return await ensureWorkspaceVerification("authFindOneVerification", { where: transformedWhere, join }); } const userId = await 
resolveUserIdForQuery(model, transformedWhere); @@ -217,16 +220,15 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return null; } - const userActor = await getUser(userId); - const found = await userActor.betterAuthFindOneRecord({ model, where: transformedWhere, join }); + const userActor = await getAuthUser(userId); + const found = await userActor.findOneAuthRecord({ model, where: transformedWhere, join }); return found ? ((await transformOutput(found, model, undefined, join)) as any) : null; }, findMany: async ({ model, where, limit, sortBy, offset, join }) => { const transformedWhere = transformWhereClause({ model, where, action: "findMany" }); if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthFindManyVerification({ + return await ensureWorkspaceVerification("authFindManyVerification", { where: transformedWhere, limit, sortBy, @@ -238,11 +240,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin if (model === "session") { const tokenClause = transformedWhere?.find((entry: any) => entry.field === "token" && entry.operator === "in"); if (tokenClause && Array.isArray(tokenClause.value)) { - const organization = await appOrganization(); + const workspace = await appWorkspace(); const resolved = await Promise.all( (tokenClause.value as string[]).map(async (sessionToken: string) => ({ sessionToken, - route: await organization.betterAuthFindSessionIndex({ sessionToken }), + route: await workspace.authFindSessionIndex({ sessionToken }), })), ); const byUser = new Map(); @@ -257,11 +259,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const rows = []; for (const [userId, tokens] of byUser) { - const userActor = await getUser(userId); + const userActor = await getAuthUser(userId); const scopedWhere = transformedWhere.map((entry: any) => entry.field === "token" && entry.operator === "in" ? 
{ ...entry, value: tokens } : entry, ); - const found = await userActor.betterAuthFindManyRecords({ model, where: scopedWhere, limit, sortBy, offset, join }); + const found = await userActor.findManyAuthRecords({ model, where: scopedWhere, limit, sortBy, offset, join }); rows.push(...found); } return await Promise.all(rows.map(async (row: any) => await transformOutput(row, model, undefined, join))); @@ -273,8 +275,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return []; } - const userActor = await getUser(userId); - const found = await userActor.betterAuthFindManyRecords({ model, where: transformedWhere, limit, sortBy, offset, join }); + const userActor = await getAuthUser(userId); + const found = await userActor.findManyAuthRecords({ model, where: transformedWhere, limit, sortBy, offset, join }); return await Promise.all(found.map(async (row: any) => await transformOutput(row, model, undefined, join))); }, @@ -282,11 +284,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const transformedWhere = transformWhereClause({ model, where, action: "update" }); const transformedUpdate = (await transformInput(update as Record, model, "update", true)) as Record; if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthUpdateVerification({ - where: transformedWhere, - update: transformedUpdate, - }); + return await ensureWorkspaceVerification("authUpdateVerification", { where: transformedWhere, update: transformedUpdate }); } const userId = await resolveUserIdForQuery(model, transformedWhere, transformedUpdate); @@ -294,38 +292,29 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return null; } - const userActor = await getUser(userId); + const userActor = await getAuthUser(userId); const before = model === "user" - ? await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) + ? 
await userActor.findOneAuthRecord({ model, where: transformedWhere }) : model === "account" - ? await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) + ? await userActor.findOneAuthRecord({ model, where: transformedWhere }) : model === "session" - ? await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) + ? await userActor.findOneAuthRecord({ model, where: transformedWhere }) : null; - const updated = await userActor.betterAuthUpdateRecord({ - model, - where: transformedWhere, - update: transformedUpdate, - }); - const organization = await appOrganization(); + const updated = await userActor.updateAuthRecord({ model, where: transformedWhere, update: transformedUpdate }); + const workspace = await appWorkspace(); if (model === "user" && updated) { if (before?.email && before.email !== updated.email) { - await organization.betterAuthDeleteEmailIndex({ - email: before.email.toLowerCase(), - }); + await workspace.authDeleteEmailIndex({ email: before.email.toLowerCase() }); } if (updated.email) { - await organization.betterAuthUpsertEmailIndex({ - email: updated.email.toLowerCase(), - userId, - }); + await workspace.authUpsertEmailIndex({ email: updated.email.toLowerCase(), userId }); } } if (model === "session" && updated) { - await organization.betterAuthUpsertSessionIndex({ + await workspace.authUpsertSessionIndex({ sessionId: String(updated.id), sessionToken: String(updated.token), userId, @@ -333,7 +322,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "account" && updated) { - await organization.betterAuthUpsertAccountIndex({ + await workspace.authUpsertAccountIndex({ id: String(updated.id), providerId: String(updated.providerId), accountId: String(updated.accountId), @@ -348,11 +337,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const transformedWhere = transformWhereClause({ model, where, action: "updateMany" }); const 
transformedUpdate = (await transformInput(update as Record, model, "update", true)) as Record; if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthUpdateManyVerification({ - where: transformedWhere, - update: transformedUpdate, - }); + return await ensureWorkspaceVerification("authUpdateManyVerification", { where: transformedWhere, update: transformedUpdate }); } const userId = await resolveUserIdForQuery(model, transformedWhere, transformedUpdate); @@ -360,19 +345,14 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getUser(userId); - return await userActor.betterAuthUpdateManyRecords({ - model, - where: transformedWhere, - update: transformedUpdate, - }); + const userActor = await getAuthUser(userId); + return await userActor.updateManyAuthRecords({ model, where: transformedWhere, update: transformedUpdate }); }, delete: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "delete" }); if (model === "verification") { - const organization = await appOrganization(); - await organization.betterAuthDeleteVerification({ where: transformedWhere }); + await ensureWorkspaceVerification("authDeleteVerification", { where: transformedWhere }); return; } @@ -381,20 +361,20 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return; } - const userActor = await getUser(userId); - const organization = await appOrganization(); - const before = await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }); - await userActor.betterAuthDeleteRecord({ model, where: transformedWhere }); + const userActor = await getAuthUser(userId); + const workspace = await appWorkspace(); + const before = await userActor.findOneAuthRecord({ model, where: transformedWhere }); + await userActor.deleteAuthRecord({ model, where: transformedWhere }); if (model === "session" && 
before) { - await organization.betterAuthDeleteSessionIndex({ + await workspace.authDeleteSessionIndex({ sessionId: before.id, sessionToken: before.token, }); } if (model === "account" && before) { - await organization.betterAuthDeleteAccountIndex({ + await workspace.authDeleteAccountIndex({ id: before.id, providerId: before.providerId, accountId: before.accountId, @@ -402,17 +382,14 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "user" && before?.email) { - await organization.betterAuthDeleteEmailIndex({ - email: before.email.toLowerCase(), - }); + await workspace.authDeleteEmailIndex({ email: before.email.toLowerCase() }); } }, deleteMany: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "deleteMany" }); if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthDeleteManyVerification({ where: transformedWhere }); + return await ensureWorkspaceVerification("authDeleteManyVerification", { where: transformedWhere }); } if (model === "session") { @@ -420,12 +397,12 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin if (!userId) { return 0; } - const userActor = await getUser(userId); - const organization = await appOrganization(); - const sessions = await userActor.betterAuthFindManyRecords({ model, where: transformedWhere, limit: 5000 }); - const deleted = await userActor.betterAuthDeleteManyRecords({ model, where: transformedWhere }); + const userActor = await getAuthUser(userId); + const workspace = await appWorkspace(); + const sessions = await userActor.findManyAuthRecords({ model, where: transformedWhere, limit: 5000 }); + const deleted = await userActor.deleteManyAuthRecords({ model, where: transformedWhere }); for (const session of sessions) { - await organization.betterAuthDeleteSessionIndex({ + await workspace.authDeleteSessionIndex({ sessionId: session.id, 
sessionToken: session.token, }); @@ -438,15 +415,15 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getUser(userId); - return await userActor.betterAuthDeleteManyRecords({ model, where: transformedWhere }); + const userActor = await getAuthUser(userId); + const deleted = await userActor.deleteManyAuthRecords({ model, where: transformedWhere }); + return deleted; }, count: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "count" }); if (model === "verification") { - const organization = await appOrganization(); - return await organization.betterAuthCountVerification({ where: transformedWhere }); + return await ensureWorkspaceVerification("authCountVerification", { where: transformedWhere }); } const userId = await resolveUserIdForQuery(model, transformedWhere); @@ -454,8 +431,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getUser(userId); - return await userActor.betterAuthCountRecords({ model, where: transformedWhere }); + const userActor = await getAuthUser(userId); + return await userActor.countAuthRecords({ model, where: transformedWhere }); }, }; }, @@ -474,9 +451,6 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin strategy: "compact", }, }, - onAPIError: { - errorURL: stripTrailingSlash(options.appUrl) + "/signin", - }, socialProviders: { github: { clientId: requireEnv("GITHUB_CLIENT_ID"), @@ -502,18 +476,18 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin }, async getAuthState(sessionId: string) { - const organization = await appOrganization(); - const route = await organization.betterAuthFindSessionIndex({ sessionId }); + const workspace = await appWorkspace(); + const route = await workspace.authFindSessionIndex({ sessionId }); if (!route?.userId) { return null; } - const userActor = await 
getUser(route.userId);
+		const userActor = await getAuthUser(route.userId);
 		return await userActor.getAppAuthState({ sessionId });
 	},
 
 	async upsertUserProfile(userId: string, patch: Record<string, unknown>) {
-		const userActor = await getUser(userId);
-		return await userActor.upsertProfile({ userId, patch });
+		const userActor = await getAuthUser(userId);
+		return await userActor.upsertUserProfile({ userId, patch });
 	},
 
 	async setActiveOrganization(sessionId: string, activeOrganizationId: string | null) {
@@ -521,7 +495,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin
 		if (!authState?.user?.id) {
 			throw new Error(`Unknown auth session ${sessionId}`);
 		}
-		const userActor = await getUser(authState.user.id);
+		const userActor = await getAuthUser(authState.user.id);
 		return await userActor.upsertSessionState({ sessionId, activeOrganizationId });
 	},
diff --git a/foundry/packages/backend/src/services/branch-name-prefixes.ts b/foundry/packages/backend/src/services/branch-name-prefixes.ts
deleted file mode 100644
index aaccaee..0000000
--- a/foundry/packages/backend/src/services/branch-name-prefixes.ts
+++ /dev/null
@@ -1,584 +0,0 @@
-// Auto-generated list of branch name prefixes.
-// Source: McMaster-Carr product catalog.
-export const BRANCH_NAME_PREFIXES: readonly string[] = [ - "abrasive-blasters", - "ac-motors", - "access-doors", - "adjustable-handles", - "aerosol-paint", - "air-cleaners", - "air-cylinders", - "air-filters", - "air-hose", - "air-knives", - "air-nozzles", - "air-regulators", - "air-ride-wheels", - "air-slides", - "alligator-clips", - "alloy-steel", - "aluminum-honeycomb", - "angle-indicators", - "antiseize-lubricants", - "antislip-fluid", - "backlight-panel-kits", - "ball-bearings", - "ball-end-mills", - "ball-joint-linkages", - "ball-transfers", - "band-clamps", - "band-saw-blades", - "bar-clamps", - "bar-grating", - "barbed-hose-fittings", - "barbed-tube-fittings", - "basket-strainers", - "batch-cans", - "battery-chargers", - "battery-holders", - "bead-chain", - "beam-clamps", - "belt-conveyors", - "bench-scales", - "bench-vises", - "bin-boxes", - "bin-storage", - "binding-posts", - "blank-tags", - "blasting-cabinets", - "blind-rivets", - "bluetooth-padlocks", - "boring-lathe-tools", - "box-reducers", - "box-wrenches", - "braided-hose", - "brass-pipe-fittings", - "breather-vents", - "butt-splices", - "c-clamps", - "cable-cutters", - "cable-holders", - "cable-tie-mounts", - "cable-ties", - "cam-handles", - "cam-latches", - "cam-locks", - "cap-nuts", - "captive-panel-screws", - "carbide-burs", - "carbide-inserts", - "carbon-fiber", - "carbon-steel", - "cardstock-tags", - "carriage-bolts", - "cast-acrylic", - "cast-iron", - "cast-nylon", - "casting-compounds", - "ceiling-lights", - "ceramic-adhesives", - "chain-slings", - "check-valves", - "chemical-hose", - "chemistry-meters", - "chemistry-testing", - "chip-clearing-tools", - "chucking-reamers", - "cinching-straps", - "circuit-breakers", - "circular-saw-blades", - "circular-saws", - "clamping-hangers", - "clevis-pins", - "clevis-rod-ends", - "clip-on-nuts", - "coaxial-connectors", - "coaxial-cords", - "coiled-spring-pins", - "compact-connectors", - "computer-adapters", - "concrete-adhesives", - "concrete-repair", 
- "contour-transfers", - "conveyor-belt-lacing", - "conveyor-belting", - "conveyor-brushes", - "conveyor-rollers", - "coolant-hose", - "copper-tube-fittings", - "copper-tubing", - "cord-grips", - "cord-reels", - "cotter-pins", - "coupling-nuts", - "cpvc-pipe-fittings", - "cup-brushes", - "cutoff-wheels", - "cylinder-hones", - "cylinder-racks", - "cylinder-trucks", - "data-cable", - "data-connectors", - "dc-motors", - "dead-blow-hammers", - "delrin-acetal-resin", - "desiccant-air-dryers", - "desktop-cranes", - "dial-calipers", - "dial-indicators", - "die-springs", - "direct-heaters", - "disconnect-switches", - "dispensing-needles", - "dispensing-pumps", - "disposable-clothing", - "disposable-gloves", - "document-protectors", - "door-closers", - "door-handles", - "door-holders", - "dowel-pins", - "drafting-equipment", - "drain-cleaners", - "drainage-mats", - "draw-latches", - "drawer-cabinets", - "drawer-slides", - "drill-bit-sets", - "drill-bits", - "drill-bushings", - "drill-chucks", - "drill-presses", - "drilling-screws", - "drinking-fountains", - "drive-anchors", - "drive-rollers", - "drive-shafts", - "drum-faucets", - "drum-pumps", - "drum-top-vacuums", - "drum-trucks", - "dry-box-gloves", - "dry-erase-boards", - "dry-film-lubricants", - "duct-fans", - "duct-hose", - "duct-tape", - "dust-collectors", - "dustless-chalk", - "edge-trim", - "electric-actuators", - "electric-drills", - "electric-drum-pumps", - "electric-mixers", - "electrical-switches", - "electrical-tape", - "electronic-calipers", - "enclosure-heaters", - "enclosure-panels", - "ethernet-cords", - "exhaust-fans", - "exit-lights", - "expansion-joints", - "expansion-plugs", - "extension-cords", - "extension-springs", - "fabric-snaps", - "fan-blades", - "fep-tubing", - "fiberglass-grating", - "file-holders", - "filter-bag-housings", - "filter-bags", - "filter-cartridges", - "fire-fighting-hose", - "first-aid-supplies", - "fixture-clamps", - "flange-locknuts", - "flange-mount-seals", - 
"flap-sanding-discs", - "flap-sanding-wheels", - "flared-tube-fittings", - "flashing-lights", - "flat-washers", - "flexible-shafts", - "flexible-shank-burs", - "flexible-trays", - "float-valves", - "floor-locks", - "floor-marking-tape", - "floor-scales", - "floor-squeegees", - "flow-sights", - "flow-switches", - "flowmeter-totalizers", - "foot-switches", - "force-gauges", - "fume-exhausters", - "garbage-bags", - "garden-hose", - "gas-hose", - "gas-regulators", - "gas-springs", - "gauge-blocks", - "glass-sights", - "gold-wire", - "grab-latches", - "grease-fittings", - "grinding-bits", - "grinding-wheels", - "hand-brushes", - "hand-chain-hoists", - "hand-reamers", - "hand-trucks", - "hand-wheels", - "hand-winches", - "hanging-scales", - "hard-hats", - "hardened-shafts", - "hardness-testers", - "heat-exchangers", - "heat-guns", - "heat-lamps", - "heat-sealable-bags", - "heat-set-inserts", - "heat-shrink-tubing", - "heat-sinks", - "heated-scrapers", - "helical-inserts", - "hex-bit-sockets", - "hex-head-screws", - "hex-nuts", - "high-accuracy-rulers", - "high-amp-relays", - "high-vacuum-filters", - "high-vacuum-sights", - "hinge-adjusters", - "hoist-rings", - "hole-saws", - "hose-couplings", - "hose-reels", - "hot-melt-glue", - "hydraulic-cylinders", - "hydraulic-hose", - "hydraulic-jacks", - "iec-connectors", - "immersion-heaters", - "impression-foam", - "indicating-lights", - "inflatable-wedges", - "ink-markers", - "insertion-heaters", - "inspection-mirrors", - "instrument-carts", - "insulation-jacketing", - "jam-removers", - "jigsaw-blades", - "key-cabinets", - "key-locking-inserts", - "key-stock", - "keyed-drive-shafts", - "keyseat-end-mills", - "l-key-sets", - "l-keys", - "label-holders", - "latching-connectors", - "lathe-tools", - "lavatory-partitions", - "lead-screws", - "leveling-lasers", - "leveling-mounts", - "lid-supports", - "lift-off-hinges", - "lift-trucks", - "light-bulbs", - "limit-switches", - "linear-ball-bearings", - "liquid-level-gauges", - 
"lock-washers", - "lockout-devices", - "loop-clamps", - "loop-hangers", - "machine-brackets", - "machine-handles", - "machine-keys", - "magnetic-base-drills", - "magnetic-bumpers", - "masking-tape", - "masonry-drill-bits", - "medium-amp-relays", - "metal-cable-ties", - "metal-panels", - "metal-plates", - "metal-tags", - "metering-pumps", - "metric-o-rings", - "mil-spec-connectors", - "mobile-lift-tables", - "motor-controls", - "motor-starters", - "mountable-cable-ties", - "mounting-tape", - "neoprene-foam", - "nickel-titanium", - "nonmarring-hammers", - "nonslip-bumpers", - "nylon-rivets", - "nylon-tubing", - "o-rings", - "oil-level-indicators", - "oil-reservoirs", - "oil-skimmers", - "on-off-valves", - "open-end-wrenches", - "outlet-boxes", - "outlet-strips", - "packaging-tape", - "paint-brushes", - "paint-markers", - "paint-sprayers", - "pallet-racks", - "pallet-trucks", - "panel-air-filters", - "parts-baskets", - "pendant-switches", - "perforated-sheets", - "pest-control", - "petroleum-hose", - "piano-hinges", - "pipe-couplings", - "pipe-gaskets", - "pipe-markers", - "pipe-wrenches", - "plank-grating", - "plastic-clamps", - "plastic-mesh", - "plate-lifting-clamps", - "platinum-wire", - "plier-clamps", - "plug-gauges", - "portable-lights", - "power-cords", - "power-supplied", - "power-supplies", - "precision-knives", - "press-fit-nuts", - "press-in-nuts", - "protecting-tape", - "protective-coatings", - "protective-curtains", - "protective-panels", - "protective-wrap", - "proximity-switches", - "pull-handles", - "push-brooms", - "push-nuts", - "push-on-seals", - "pvc-pipe-fittings", - "pvc-tubing", - "quick-release-pins", - "ratchet-pullers", - "recycled-plastics", - "repair-adhesives", - "repair-clamps", - "reusable-cable-ties", - "ring-terminals", - "rivet-nuts", - "robot-base-mounts", - "robot-bases", - "rocker-switches", - "rod-wipers", - "roller-bearings", - "roller-chain", - "roller-conveyors", - "roof-exhaust-fans", - "roof-repair", - "rotary-broaches", - 
"rotary-hammers", - "rotary-shaft-seals", - "rotating-cranes", - "rotating-joints", - "router-bits", - "rtd-probes", - "rubber-edge-seals", - "rubber-tread-wheels", - "rubber-tubing", - "safety-cabinets", - "safety-glasses", - "safety-mirrors", - "sanding-belts", - "sanding-discs", - "sanding-guides", - "sanding-rolls", - "sanding-sheets", - "screw-extractors", - "screw-jacks", - "scrub-brushes", - "sealing-washers", - "security-lights", - "sensor-connectors", - "set-screws", - "setup-clamps", - "shaft-collars", - "shaft-couplings", - "shaft-repair-sleeves", - "shaft-supports", - "sharpening-stones", - "sheet-metal-cutters", - "shelf-cabinets", - "shim-stock", - "shim-tape", - "shipping-pails", - "shock-absorbers", - "shoulder-screws", - "shower-stations", - "silicone-foam", - "sleeve-bearings", - "slide-bolts", - "slitting-saws", - "slotted-spring-pins", - "sludge-samplers", - "small-parts-storage", - "snap-acting-switches", - "soap-dispensers", - "socket-head-screws", - "socket-organizers", - "socket-wrenches", - "soldering-irons", - "solid-rivets", - "solid-rod-ends", - "sound-insulation", - "space-heaters", - "spacing-beads", - "spanner-wrenches", - "specialty-pliers", - "specialty-vises", - "specialty-washers", - "speed-reducers", - "splicing-connectors", - "spray-bottles", - "spray-nozzles", - "spring-clamps", - "spring-plungers", - "spring-steel", - "square-drive-sockets", - "square-end-mills", - "square-nuts", - "squeeze-bottles", - "stack-lights", - "stainless-steel", - "stair-treads", - "static-control-mats", - "steel-carts", - "steel-pipe-fittings", - "steel-pipe-flanges", - "steel-stamps", - "steel-tubing", - "step-ladders", - "stepper-motors", - "storage-bags", - "storage-boxes", - "storage-chests", - "straight-ladders", - "strap-hinges", - "stretch-wrap", - "strip-doors", - "strip-springs", - "strobe-lights", - "structural-adhesives", - "strut-channel", - "strut-channel-nuts", - "strut-mount-clamps", - "suction-cup-lifters", - "suction-strainers", - 
"super-absorbent-foam", - "super-flexible-glass", - "surface-fillers", - "surface-mount-hinges", - "t-handle-keys", - "t-slotted-framing", - "tamper-seals", - "tank-level-measurers", - "tape-dispensers", - "tape-measures", - "taper-pins", - "tapping-screws", - "teflon-ptfe", - "terminal-blocks", - "test-indicators", - "test-leads", - "test-weights", - "tethered-knobs", - "thermal-insulation", - "thread-adapters", - "thread-sealant-tape", - "thread-sealants", - "threaded-inserts", - "threaded-standoffs", - "threaded-studs", - "thrust-ball-bearings", - "thrust-bearings", - "thumb-nuts", - "thumb-screws", - "tie-down-rings", - "time-clocks", - "timer-relays", - "timer-switches", - "toggle-clamps", - "toggle-switches", - "tool-holders", - "tool-sets", - "tool-steel", - "torque-wrenches", - "torsion-springs", - "tote-boxes", - "touch-bars", - "track-casters", - "track-rollers", - "track-wheels", - "traction-mats", - "trolley-systems", - "tube-brushes", - "tube-fittings", - "tubular-light-bulbs", - "turn-lock-connectors", - "twist-ties", - "u-bolts", - "u-joints", - "ul-class-fuses", - "unthreaded-spacers", - "usb-adapters", - "usb-cords", - "utility-knives", - "v-belts", - "vacuum-cups", - "vacuum-pumps", - "wall-louvers", - "wash-fountains", - "wash-guns", - "waste-containers", - "water-deionizers", - "water-filters", - "water-hose", - "water-removal-pumps", - "weather-stations", - "web-slings", - "weld-nuts", - "welding-clothing", - "welding-helmets", - "wet-dry-vacuums", - "wet-mops", - "wheel-brushes", - "wing-nuts", - "wire-cloth", - "wire-connectors", - "wire-cutting-pliers", - "wire-partitions", - "wire-rope", - "wire-rope-clamps", - "wire-wrap", - "wool-felt", - "work-platforms", - "workbench-legs", - "woven-wire-cloth", -] as const; diff --git a/foundry/packages/backend/src/services/create-flow.ts b/foundry/packages/backend/src/services/create-flow.ts index eb9e53f..8341399 100644 --- a/foundry/packages/backend/src/services/create-flow.ts +++ 
b/foundry/packages/backend/src/services/create-flow.ts @@ -1,5 +1,3 @@ -import { BRANCH_NAME_PREFIXES } from "./branch-name-prefixes.js"; - export interface ResolveCreateFlowDecisionInput { task: string; explicitTitle?: string; @@ -91,42 +89,30 @@ export function sanitizeBranchName(input: string): string { return trimmed.slice(0, 50).replace(/-+$/g, ""); } -function generateRandomSuffix(length: number): string { - const chars = "abcdefghijklmnopqrstuvwxyz0123456789"; - let result = ""; - for (let i = 0; i < length; i++) { - result += chars[Math.floor(Math.random() * chars.length)]; - } - return result; -} - -function generateBranchName(): string { - const prefix = BRANCH_NAME_PREFIXES[Math.floor(Math.random() * BRANCH_NAME_PREFIXES.length)]!; - const suffix = generateRandomSuffix(4); - return `${prefix}-${suffix}`; -} - export function resolveCreateFlowDecision(input: ResolveCreateFlowDecisionInput): ResolveCreateFlowDecisionResult { const explicitBranch = input.explicitBranchName?.trim(); const title = deriveFallbackTitle(input.task, input.explicitTitle); + const generatedBase = sanitizeBranchName(title) || "task"; + + const branchBase = explicitBranch && explicitBranch.length > 0 ? explicitBranch : generatedBase; const existingBranches = new Set(input.localBranches.map((value) => value.trim()).filter((value) => value.length > 0)); const existingTaskBranches = new Set(input.taskBranches.map((value) => value.trim()).filter((value) => value.length > 0)); const conflicts = (name: string): boolean => existingBranches.has(name) || existingTaskBranches.has(name); - if (explicitBranch && explicitBranch.length > 0) { - if (conflicts(explicitBranch)) { - throw new Error(`Branch '${explicitBranch}' already exists. Choose a different --name/--branch value.`); - } - return { title, branchName: explicitBranch }; + if (explicitBranch && conflicts(branchBase)) { + throw new Error(`Branch '${branchBase}' already exists. 
Choose a different --name/--branch value.`); } - // Generate a random McMaster-Carr-style branch name, retrying on conflicts - let candidate = generateBranchName(); - let attempts = 0; - while (conflicts(candidate) && attempts < 100) { - candidate = generateBranchName(); - attempts += 1; + if (explicitBranch) { + return { title, branchName: branchBase }; + } + + let candidate = branchBase; + let index = 2; + while (conflicts(candidate)) { + candidate = `${branchBase}-${index}`; + index += 1; } return { diff --git a/foundry/packages/backend/src/services/foundry-paths.ts b/foundry/packages/backend/src/services/foundry-paths.ts new file mode 100644 index 0000000..d56c38d --- /dev/null +++ b/foundry/packages/backend/src/services/foundry-paths.ts @@ -0,0 +1,20 @@ +import type { AppConfig } from "@sandbox-agent/foundry-shared"; +import { homedir } from "node:os"; +import { dirname, join, resolve } from "node:path"; + +function expandPath(input: string): string { + if (input.startsWith("~/")) { + return `${homedir()}/${input.slice(2)}`; + } + return input; +} + +export function foundryDataDir(config: AppConfig): string { + // Keep data collocated with the backend DB by default. 
+ const dbPath = expandPath(config.backend.dbPath); + return resolve(dirname(dbPath)); +} + +export function foundryRepoClonePath(config: AppConfig, workspaceId: string, repoId: string): string { + return resolve(join(foundryDataDir(config), "repos", workspaceId, repoId)); +} diff --git a/foundry/packages/backend/src/services/github-auth.ts b/foundry/packages/backend/src/services/github-auth.ts index aa475b0..8249927 100644 --- a/foundry/packages/backend/src/services/github-auth.ts +++ b/foundry/packages/backend/src/services/github-auth.ts @@ -1,20 +1,20 @@ -import { getOrCreateOrganization } from "../actors/handles.js"; -import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/constants.js"; +import { getOrCreateWorkspace } from "../actors/handles.js"; +import { APP_SHELL_WORKSPACE_ID } from "../actors/workspace/app-shell.js"; export interface ResolvedGithubAuth { githubToken: string; scopes: string[]; } -export async function resolveOrganizationGithubAuth(c: any, organizationId: string): Promise { - if (!organizationId || organizationId === APP_SHELL_ORGANIZATION_ID) { +export async function resolveWorkspaceGithubAuth(c: any, workspaceId: string): Promise { + if (!workspaceId || workspaceId === APP_SHELL_WORKSPACE_ID) { return null; } try { - const appOrganization = await getOrCreateOrganization(c, APP_SHELL_ORGANIZATION_ID); - const resolved = await appOrganization.resolveAppGithubToken({ - organizationId: organizationId, + const appWorkspace = await getOrCreateWorkspace(c, APP_SHELL_WORKSPACE_ID); + const resolved = await appWorkspace.resolveAppGithubToken({ + organizationId: workspaceId, requireRepoScope: true, }); if (!resolved?.accessToken) { diff --git a/foundry/packages/backend/src/services/queue.ts b/foundry/packages/backend/src/services/queue.ts index 34e697c..b366375 100644 --- a/foundry/packages/backend/src/services/queue.ts +++ b/foundry/packages/backend/src/services/queue.ts @@ -7,14 +7,6 @@ export function expectQueueResponse(result: 
QueueSendResult | void): T { if (!result || result.status === "timedOut") { throw new Error("Queue command timed out"); } - if ( - result.response && - typeof result.response === "object" && - "error" in result.response && - typeof (result.response as { error?: unknown }).error === "string" - ) { - throw new Error((result.response as { error: string }).error); - } return result.response as T; } diff --git a/foundry/packages/backend/src/services/repo-git-lock.ts b/foundry/packages/backend/src/services/repo-git-lock.ts new file mode 100644 index 0000000..971b95c --- /dev/null +++ b/foundry/packages/backend/src/services/repo-git-lock.ts @@ -0,0 +1,45 @@ +interface RepoLockState { + locked: boolean; + waiters: Array<() => void>; +} + +const repoLocks = new Map(); + +async function acquireRepoLock(repoPath: string): Promise<() => void> { + let state = repoLocks.get(repoPath); + if (!state) { + state = { locked: false, waiters: [] }; + repoLocks.set(repoPath, state); + } + + if (!state.locked) { + state.locked = true; + return () => releaseRepoLock(repoPath, state); + } + + await new Promise((resolve) => { + state!.waiters.push(resolve); + }); + + return () => releaseRepoLock(repoPath, state!); +} + +function releaseRepoLock(repoPath: string, state: RepoLockState): void { + const next = state.waiters.shift(); + if (next) { + next(); + return; + } + + state.locked = false; + repoLocks.delete(repoPath); +} + +export async function withRepoGitLock(repoPath: string, fn: () => Promise): Promise { + const release = await acquireRepoLock(repoPath); + try { + return await fn(); + } finally { + release(); + } +} diff --git a/foundry/packages/backend/src/services/repo.ts b/foundry/packages/backend/src/services/repo.ts index fb673cc..910f4e8 100644 --- a/foundry/packages/backend/src/services/repo.ts +++ b/foundry/packages/backend/src/services/repo.ts @@ -82,30 +82,3 @@ export function repoLabelFromRemote(remoteUrl: string): string { return basename(trimmed.replace(/\.git$/i, "")); 
} - -export function githubRepoFullNameFromRemote(remoteUrl: string): string | null { - const normalized = normalizeRemoteUrl(remoteUrl); - if (!normalized) { - return null; - } - - try { - const url = new URL(normalized); - const hostname = url.hostname.replace(/^www\./i, "").toLowerCase(); - if (hostname !== "github.com") { - return null; - } - const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); - if (parts.length < 2) { - return null; - } - const owner = parts[0]?.trim(); - const repo = (parts[1] ?? "").replace(/\.git$/i, "").trim(); - if (!owner || !repo) { - return null; - } - return `${owner}/${repo}`; - } catch { - return null; - } -} diff --git a/foundry/packages/backend/test/create-flow.test.ts b/foundry/packages/backend/test/create-flow.test.ts index 8c66cb4..498c4dc 100644 --- a/foundry/packages/backend/test/create-flow.test.ts +++ b/foundry/packages/backend/test/create-flow.test.ts @@ -1,6 +1,5 @@ import { describe, expect, it } from "vitest"; import { deriveFallbackTitle, resolveCreateFlowDecision, sanitizeBranchName } from "../src/services/create-flow.js"; -import { BRANCH_NAME_PREFIXES } from "../src/services/branch-name-prefixes.js"; describe("create flow decision", () => { it("derives a conventional-style fallback title from task text", () => { @@ -18,49 +17,15 @@ describe("create flow decision", () => { expect(sanitizeBranchName(" spaces everywhere ")).toBe("spaces-everywhere"); }); - it("generates a McMaster-Carr-style branch name with random suffix", () => { + it("auto-increments generated branch names for conflicts", () => { const resolved = resolveCreateFlowDecision({ task: "Add auth", - localBranches: [], - taskBranches: [], + localBranches: ["feat-add-auth"], + taskBranches: ["feat-add-auth-2"], }); expect(resolved.title).toBe("feat: Add auth"); - // Branch name should be "-<4-char-suffix>" where prefix is from BRANCH_NAME_PREFIXES - const lastDash = resolved.branchName.lastIndexOf("-"); - const prefix = 
resolved.branchName.slice(0, lastDash); - const suffix = resolved.branchName.slice(lastDash + 1); - expect(BRANCH_NAME_PREFIXES).toContain(prefix); - expect(suffix).toMatch(/^[a-z0-9]{4}$/); - }); - - it("avoids conflicts by generating a different random name", () => { - // Even with a conflicting branch, it should produce something different - const resolved = resolveCreateFlowDecision({ - task: "Add auth", - localBranches: [], - taskBranches: [], - }); - - // Running again with the first result as a conflict should produce a different name - const resolved2 = resolveCreateFlowDecision({ - task: "Add auth", - localBranches: [resolved.branchName], - taskBranches: [], - }); - - expect(resolved2.branchName).not.toBe(resolved.branchName); - }); - - it("uses explicit branch name when provided", () => { - const resolved = resolveCreateFlowDecision({ - task: "new task", - explicitBranchName: "my-branch", - localBranches: [], - taskBranches: [], - }); - - expect(resolved.branchName).toBe("my-branch"); + expect(resolved.branchName).toBe("feat-add-auth-3"); }); it("fails when explicit branch already exists", () => { diff --git a/foundry/packages/backend/test/daytona-provider.test.ts b/foundry/packages/backend/test/daytona-provider.test.ts new file mode 100644 index 0000000..363b405 --- /dev/null +++ b/foundry/packages/backend/test/daytona-provider.test.ts @@ -0,0 +1,184 @@ +import { describe, expect, it } from "vitest"; +import type { DaytonaClientLike, DaytonaDriver } from "../src/driver.js"; +import type { DaytonaCreateSandboxOptions } from "../src/integrations/daytona/client.js"; +import { DaytonaProvider } from "../src/providers/daytona/index.js"; + +class RecordingDaytonaClient implements DaytonaClientLike { + createSandboxCalls: DaytonaCreateSandboxOptions[] = []; + executedCommands: string[] = []; + + async createSandbox(options: DaytonaCreateSandboxOptions) { + this.createSandboxCalls.push(options); + return { + id: "sandbox-1", + state: "started", + snapshot: 
"snapshot-foundry", + labels: {}, + }; + } + + async getSandbox(sandboxId: string) { + return { + id: sandboxId, + state: "started", + snapshot: "snapshot-foundry", + labels: {}, + }; + } + + async startSandbox(_sandboxId: string, _timeoutSeconds?: number) {} + + async stopSandbox(_sandboxId: string, _timeoutSeconds?: number) {} + + async deleteSandbox(_sandboxId: string) {} + + async executeCommand(_sandboxId: string, command: string) { + this.executedCommands.push(command); + return { exitCode: 0, result: "" }; + } + + async getPreviewEndpoint(sandboxId: string, port: number) { + return { + url: `https://preview.example/sandbox/${sandboxId}/port/${port}`, + token: "preview-token", + }; + } +} + +function createProviderWithClient(client: DaytonaClientLike): DaytonaProvider { + const daytonaDriver: DaytonaDriver = { + createClient: () => client, + }; + + return new DaytonaProvider( + { + apiKey: "test-key", + image: "ubuntu:24.04", + }, + daytonaDriver, + ); +} + +describe("daytona provider snapshot image behavior", () => { + it("creates sandboxes using a snapshot-capable image recipe", async () => { + const client = new RecordingDaytonaClient(); + const provider = createProviderWithClient(client); + + const handle = await provider.createSandbox({ + workspaceId: "default", + repoId: "repo-1", + repoRemote: "https://github.com/acme/repo.git", + branchName: "feature/test", + taskId: "task-1", + }); + + expect(client.createSandboxCalls).toHaveLength(1); + const createCall = client.createSandboxCalls[0]; + if (!createCall) { + throw new Error("expected create sandbox call"); + } + + expect(typeof createCall.image).not.toBe("string"); + if (typeof createCall.image === "string") { + throw new Error("expected daytona image recipe object"); + } + + const dockerfile = createCall.image.dockerfile; + expect(dockerfile).toContain("apt-get install -y curl ca-certificates git openssh-client nodejs npm"); + expect(dockerfile).toContain("sandbox-agent/0.3.0/install.sh"); + const 
installAgentLines = dockerfile.match(/sandbox-agent install-agent [a-z0-9-]+/gi) ?? []; + expect(installAgentLines.length).toBeGreaterThanOrEqual(2); + const commands = client.executedCommands.join("\n"); + expect(commands).toContain("GIT_TERMINAL_PROMPT=0"); + expect(commands).toContain("GIT_ASKPASS=/bin/echo"); + + expect(handle.metadata.snapshot).toBe("snapshot-foundry"); + expect(handle.metadata.image).toBe("ubuntu:24.04"); + expect(handle.metadata.cwd).toBe("/home/daytona/foundry/default/repo-1/task-1/repo"); + expect(client.executedCommands.length).toBeGreaterThan(0); + }); + + it("starts sandbox-agent with ACP timeout env override", async () => { + const previous = process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS; + process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS = "240000"; + + try { + const client = new RecordingDaytonaClient(); + const provider = createProviderWithClient(client); + + await provider.ensureSandboxAgent({ + workspaceId: "default", + sandboxId: "sandbox-1", + }); + + const startCommand = client.executedCommands.find((command) => + command.includes("nohup env SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS=240000 sandbox-agent server"), + ); + + const joined = client.executedCommands.join("\n"); + expect(joined).toContain("sandbox-agent/0.3.0/install.sh"); + expect(joined).toContain("SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS=240000"); + expect(joined).toContain("apt-get install -y nodejs npm"); + expect(joined).toContain("sandbox-agent server --no-token --host 0.0.0.0 --port 2468"); + expect(startCommand).toBeTruthy(); + } finally { + if (previous === undefined) { + delete process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS; + } else { + process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS = previous; + } + } + }); + + it("fails with explicit timeout when daytona createSandbox hangs", async () => { + const previous = process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS; + process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS = "120"; + + const hangingClient: DaytonaClientLike 
= { + createSandbox: async () => await new Promise(() => {}), + getSandbox: async (sandboxId) => ({ id: sandboxId, state: "started" }), + startSandbox: async () => {}, + stopSandbox: async () => {}, + deleteSandbox: async () => {}, + executeCommand: async () => ({ exitCode: 0, result: "" }), + getPreviewEndpoint: async (sandboxId, port) => ({ + url: `https://preview.example/sandbox/${sandboxId}/port/${port}`, + token: "preview-token", + }), + }; + + try { + const provider = createProviderWithClient(hangingClient); + await expect( + provider.createSandbox({ + workspaceId: "default", + repoId: "repo-1", + repoRemote: "https://github.com/acme/repo.git", + branchName: "feature/test", + taskId: "task-timeout", + }), + ).rejects.toThrow("daytona create sandbox timed out after 120ms"); + } finally { + if (previous === undefined) { + delete process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS; + } else { + process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS = previous; + } + } + }); + + it("executes backend-managed sandbox commands through provider API", async () => { + const client = new RecordingDaytonaClient(); + const provider = createProviderWithClient(client); + + const result = await provider.executeCommand({ + workspaceId: "default", + sandboxId: "sandbox-1", + command: "echo backend-push", + label: "manual push", + }); + + expect(result.exitCode).toBe(0); + expect(client.executedCommands).toContain("echo backend-push"); + }); +}); diff --git a/foundry/packages/backend/test/git-spice.test.ts b/foundry/packages/backend/test/git-spice.test.ts new file mode 100644 index 0000000..d0b0455 --- /dev/null +++ b/foundry/packages/backend/test/git-spice.test.ts @@ -0,0 +1,129 @@ +import { chmodSync, mkdtempSync, writeFileSync, readFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import { join } from "node:path"; +import { describe, expect, it } from "vitest"; +import { gitSpiceAvailable, gitSpiceListStack, gitSpiceRestackSubtree } from "../src/integrations/git-spice/index.js"; + 
+function makeTempDir(prefix: string): string { + return mkdtempSync(join(tmpdir(), prefix)); +} + +function writeScript(path: string, body: string): void { + writeFileSync(path, body, "utf8"); + chmodSync(path, 0o755); +} + +async function withEnv(updates: Record, fn: () => Promise): Promise { + const previous = new Map(); + for (const [key, value] of Object.entries(updates)) { + previous.set(key, process.env[key]); + if (value == null) { + delete process.env[key]; + } else { + process.env[key] = value; + } + } + + try { + return await fn(); + } finally { + for (const [key, value] of previous) { + if (value == null) { + delete process.env[key]; + } else { + process.env[key] = value; + } + } + } +} + +describe("git-spice integration", () => { + it("parses stack rows from mixed/malformed json output", async () => { + const repoPath = makeTempDir("hf-git-spice-parse-"); + const scriptPath = join(repoPath, "fake-git-spice.sh"); + writeScript( + scriptPath, + [ + "#!/bin/sh", + 'if [ \"$1\" = \"--help\" ]; then', + " exit 0", + "fi", + 'if [ \"$1\" = \"log\" ]; then', + " echo 'noise line'", + ' echo \'{"branch":"feature/a","parent":"main"}\'', + " echo '{bad json'", + ' echo \'{"name":"feature/b","parentBranch":"feature/a"}\'', + ' echo \'{"name":"feature/a","parent":"main"}\'', + " exit 0", + "fi", + "exit 1", + ].join("\n"), + ); + + await withEnv({ HF_GIT_SPICE_BIN: scriptPath }, async () => { + const rows = await gitSpiceListStack(repoPath); + expect(rows).toEqual([ + { branchName: "feature/a", parentBranch: "main" }, + { branchName: "feature/b", parentBranch: "feature/a" }, + ]); + }); + }); + + it("falls back across versioned subtree restack command variants", async () => { + const repoPath = makeTempDir("hf-git-spice-fallback-"); + const scriptPath = join(repoPath, "fake-git-spice.sh"); + const logPath = join(repoPath, "calls.log"); + writeScript( + scriptPath, + [ + "#!/bin/sh", + 'echo \"$*\" >> \"$SPICE_LOG_PATH\"', + 'if [ \"$1\" = \"--help\" ]; then', + " 
exit 0", + "fi", + 'if [ \"$1\" = \"upstack\" ] && [ \"$2\" = \"restack\" ]; then', + " exit 1", + "fi", + 'if [ \"$1\" = \"branch\" ] && [ \"$2\" = \"restack\" ] && [ \"$5\" = \"--no-prompt\" ]; then', + " exit 0", + "fi", + "exit 1", + ].join("\n"), + ); + + await withEnv( + { + HF_GIT_SPICE_BIN: scriptPath, + SPICE_LOG_PATH: logPath, + }, + async () => { + await gitSpiceRestackSubtree(repoPath, "feature/a"); + }, + ); + + const lines = readFileSync(logPath, "utf8") + .trim() + .split("\n") + .filter((line) => line.trim().length > 0); + + expect(lines).toContain("upstack restack --branch feature/a --no-prompt"); + expect(lines).toContain("upstack restack --branch feature/a"); + expect(lines).toContain("branch restack --branch feature/a --no-prompt"); + expect(lines).not.toContain("branch restack --branch feature/a"); + }); + + it("reports unavailable when explicit binary and PATH are missing", async () => { + const repoPath = makeTempDir("hf-git-spice-missing-"); + + await withEnv( + { + HF_GIT_SPICE_BIN: "/non-existent/hf-git-spice-binary", + PATH: "/non-existent/bin", + }, + async () => { + const available = await gitSpiceAvailable(repoPath); + expect(available).toBe(false); + }, + ); + }); +}); diff --git a/foundry/packages/backend/test/git-validate-remote.test.ts b/foundry/packages/backend/test/git-validate-remote.test.ts new file mode 100644 index 0000000..47849a2 --- /dev/null +++ b/foundry/packages/backend/test/git-validate-remote.test.ts @@ -0,0 +1,40 @@ +import { afterEach, beforeEach, describe, expect, test } from "vitest"; +import { mkdtempSync, mkdirSync, writeFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import { join, resolve } from "node:path"; +import { promisify } from "node:util"; +import { execFile } from "node:child_process"; +import { validateRemote } from "../src/integrations/git/index.js"; + +const execFileAsync = promisify(execFile); + +describe("validateRemote", () => { + const originalCwd = process.cwd(); + + 
beforeEach(() => { + process.chdir(originalCwd); + }); + + afterEach(() => { + process.chdir(originalCwd); + }); + + test("ignores broken worktree gitdir in current directory", async () => { + const sandboxDir = mkdtempSync(join(tmpdir(), "validate-remote-cwd-")); + const brokenRepoDir = resolve(sandboxDir, "broken-worktree"); + const remoteRepoDir = resolve(sandboxDir, "remote"); + + mkdirSync(brokenRepoDir, { recursive: true }); + writeFileSync(resolve(brokenRepoDir, ".git"), "gitdir: /definitely/missing/worktree\n", "utf8"); + await execFileAsync("git", ["init", remoteRepoDir]); + await execFileAsync("git", ["-C", remoteRepoDir, "config", "user.name", "Foundry Test"]); + await execFileAsync("git", ["-C", remoteRepoDir, "config", "user.email", "test@example.com"]); + writeFileSync(resolve(remoteRepoDir, "README.md"), "# test\n", "utf8"); + await execFileAsync("git", ["-C", remoteRepoDir, "add", "README.md"]); + await execFileAsync("git", ["-C", remoteRepoDir, "commit", "-m", "init"]); + + process.chdir(brokenRepoDir); + + await expect(validateRemote(remoteRepoDir)).resolves.toBeUndefined(); + }); +}); diff --git a/foundry/packages/backend/test/helpers/test-context.ts b/foundry/packages/backend/test/helpers/test-context.ts index be169a8..07107ac 100644 --- a/foundry/packages/backend/test/helpers/test-context.ts +++ b/foundry/packages/backend/test/helpers/test-context.ts @@ -3,13 +3,14 @@ import { join } from "node:path"; import { ConfigSchema, type AppConfig } from "@sandbox-agent/foundry-shared"; import type { BackendDriver } from "../../src/driver.js"; import { initActorRuntimeContext } from "../../src/actors/context.js"; +import { createProviderRegistry } from "../../src/providers/index.js"; import { createDefaultAppShellServices } from "../../src/services/app-shell-runtime.js"; export function createTestConfig(overrides?: Partial): AppConfig { return ConfigSchema.parse({ auto_submit: true, notify: ["terminal" as const], - organization: { default: "default" }, 
+ workspace: { default: "default" }, backend: { host: "127.0.0.1", port: 7741, @@ -19,9 +20,8 @@ export function createTestConfig(overrides?: Partial<AppConfig>): AppConfig { backup_interval_secs: 3600, backup_retention_days: 7, }, - sandboxProviders: { - local: {}, - e2b: {}, + providers: { + daytona: { image: "ubuntu:24.04" }, }, ...overrides, }); @@ -29,6 +29,7 @@ export function createTestRuntimeContext(driver: BackendDriver, configOverrides?: Partial<AppConfig>): { config: AppConfig } { const config = createTestConfig(configOverrides); - initActorRuntimeContext(config, undefined, driver, createDefaultAppShellServices()); + const providers = createProviderRegistry(config, driver); + initActorRuntimeContext(config, providers, undefined, driver, createDefaultAppShellServices()); return { config }; } diff --git a/foundry/packages/backend/test/helpers/test-driver.ts b/foundry/packages/backend/test/helpers/test-driver.ts index 39975e5..c5b8bc4 100644 --- a/foundry/packages/backend/test/helpers/test-driver.ts +++ b/foundry/packages/backend/test/helpers/test-driver.ts @@ -1,15 +1,60 @@ -import type { BackendDriver, GithubDriver, TmuxDriver } from "../../src/driver.js"; +import type { + BackendDriver, + DaytonaClientLike, + DaytonaDriver, + GitDriver, + GithubDriver, + StackDriver, + SandboxAgentDriver, + SandboxAgentClientLike, + TmuxDriver, +} from "../../src/driver.js"; +import type { ListEventsRequest, ListPage, ListPageRequest, ProcessInfo, ProcessLogsResponse, SessionEvent, SessionRecord } from "sandbox-agent"; export function createTestDriver(overrides?: Partial<BackendDriver>): BackendDriver { return { + git: overrides?.git ?? createTestGitDriver(), + stack: overrides?.stack ?? createTestStackDriver(), github: overrides?.github ?? createTestGithubDriver(), + sandboxAgent: overrides?.sandboxAgent ?? createTestSandboxAgentDriver(), + daytona: overrides?.daytona ?? createTestDaytonaDriver(), tmux: overrides?.tmux ??
createTestTmuxDriver(), }; } +export function createTestGitDriver(overrides?: Partial<GitDriver>): GitDriver { + return { + validateRemote: async () => {}, + ensureCloned: async () => {}, + fetch: async () => {}, + listRemoteBranches: async () => [], + remoteDefaultBaseRef: async () => "origin/main", + revParse: async () => "abc1234567890", + ensureRemoteBranch: async () => {}, + diffStatForBranch: async () => "+0/-0", + conflictsWithMain: async () => false, + ...overrides, + }; +} + +export function createTestStackDriver(overrides?: Partial<StackDriver>): StackDriver { + return { + available: async () => false, + listStack: async () => [], + syncRepo: async () => {}, + restackRepo: async () => {}, + restackSubtree: async () => {}, + rebaseBranch: async () => {}, + reparentBranch: async () => {}, + trackBranch: async () => {}, + ...overrides, + }; +} + export function createTestGithubDriver(overrides?: Partial<GithubDriver>): GithubDriver { return { - createPr: async (_repoFullName, _headBranch, _title) => ({ + listPullRequests: async () => [], + createPr: async (_repoPath, _headBranch, _title) => ({ number: 1, url: `https://github.com/test/repo/pull/1`, }), @@ -18,6 +63,79 @@ export function createTestGithubDriver(overrides?: Partial<GithubDriver>): Githu }; } +export function createTestSandboxAgentDriver(overrides?: Partial<SandboxAgentDriver>): SandboxAgentDriver { + return { + createClient: (_opts) => createTestSandboxAgentClient(), + ...overrides, + }; +} + +export function createTestSandboxAgentClient(overrides?: Partial<SandboxAgentClientLike>): SandboxAgentClientLike { + const defaultProcess: ProcessInfo = { + id: "process-1", + command: "bash", + args: ["-lc", "echo test"], + createdAtMs: Date.now(), + cwd: "/workspace", + exitCode: null, + exitedAtMs: null, + interactive: true, + pid: 123, + status: "running", + tty: true, + }; + const defaultLogs: ProcessLogsResponse = { + processId: defaultProcess.id, + stream: "combined", + entries: [], + }; + return { + createSession: async (_prompt) => ({ id: "test-session-1", status: "running" }), +
sessionStatus: async (sessionId) => ({ id: sessionId, status: "running" }), + listSessions: async (_request?: ListPageRequest): Promise<ListPage<SessionRecord>> => ({ + items: [], + nextCursor: undefined, + }), + listEvents: async (_request: ListEventsRequest): Promise<ListPage<SessionEvent>> => ({ + items: [], + nextCursor: undefined, + }), + createProcess: async () => defaultProcess, + listProcesses: async () => ({ processes: [defaultProcess] }), + getProcessLogs: async () => defaultLogs, + stopProcess: async () => ({ ...defaultProcess, status: "exited", exitCode: 0, exitedAtMs: Date.now() }), + killProcess: async () => ({ ...defaultProcess, status: "exited", exitCode: 137, exitedAtMs: Date.now() }), + deleteProcess: async () => {}, + sendPrompt: async (_request) => {}, + cancelSession: async (_sessionId) => {}, + destroySession: async (_sessionId) => {}, + ...overrides, + }; +} + +export function createTestDaytonaDriver(overrides?: Partial<DaytonaDriver>): DaytonaDriver { + return { + createClient: (_opts) => createTestDaytonaClient(), + ...overrides, + }; +} + +export function createTestDaytonaClient(overrides?: Partial<DaytonaClientLike>): DaytonaClientLike { + return { + createSandbox: async () => ({ id: "sandbox-test-1", state: "started" }), + getSandbox: async (sandboxId) => ({ id: sandboxId, state: "started" }), + startSandbox: async () => {}, + stopSandbox: async () => {}, + deleteSandbox: async () => {}, + executeCommand: async () => ({ exitCode: 0, result: "" }), + getPreviewEndpoint: async (sandboxId, port) => ({ + url: `https://preview.example/sandbox/${sandboxId}/port/${port}`, + token: "preview-token", + }), + ...overrides, + }; +} + export function createTestTmuxDriver(overrides?: Partial<TmuxDriver>): TmuxDriver { return { setWindowStatus: () => 0, diff --git a/foundry/packages/backend/test/keys.test.ts b/foundry/packages/backend/test/keys.test.ts index c3b2a10..b00a54d 100644 --- a/foundry/packages/backend/test/keys.test.ts +++ b/foundry/packages/backend/test/keys.test.ts @@ -1,18 +1,30 @@ import { describe, expect, it } from "vitest";
-import { auditLogKey, githubDataKey, organizationKey, taskKey, taskSandboxKey } from "../src/actors/keys.js"; +import { + taskKey, + taskStatusSyncKey, + historyKey, + projectBranchSyncKey, + projectKey, + projectPrSyncKey, + sandboxInstanceKey, + workspaceKey, +} from "../src/actors/keys.js"; describe("actor keys", () => { - it("prefixes every key with organization namespace", () => { + it("prefixes every key with workspace namespace", () => { const keys = [ - organizationKey("default"), + workspaceKey("default"), + projectKey("default", "repo"), taskKey("default", "repo", "task"), - taskSandboxKey("default", "sbx"), - auditLogKey("default"), - githubDataKey("default"), + sandboxInstanceKey("default", "daytona", "sbx"), + historyKey("default", "repo"), + projectPrSyncKey("default", "repo"), + projectBranchSyncKey("default", "repo"), + taskStatusSyncKey("default", "repo", "task", "sandbox-1", "session-1"), ]; for (const key of keys) { - expect(key[0]).toBe("org"); + expect(key[0]).toBe("ws"); expect(key[1]).toBe("default"); } }); diff --git a/foundry/packages/backend/test/providers.test.ts b/foundry/packages/backend/test/providers.test.ts new file mode 100644 index 0000000..f659e27 --- /dev/null +++ b/foundry/packages/backend/test/providers.test.ts @@ -0,0 +1,52 @@ +import { describe, expect, it } from "vitest"; +import { ConfigSchema, type AppConfig } from "@sandbox-agent/foundry-shared"; +import { createProviderRegistry } from "../src/providers/index.js"; + +function makeConfig(): AppConfig { + return ConfigSchema.parse({ + auto_submit: true, + notify: ["terminal"], + workspace: { default: "default" }, + backend: { + host: "127.0.0.1", + port: 7741, + dbPath: "~/.local/share/foundry/task.db", + opencode_poll_interval: 2, + github_poll_interval: 30, + backup_interval_secs: 3600, + backup_retention_days: 7, + }, + providers: { + local: {}, + daytona: { image: "ubuntu:24.04" }, + }, + }); +} + +describe("provider registry", () => { + it("defaults to local when 
daytona is not configured", () => { + const registry = createProviderRegistry(makeConfig()); + expect(registry.defaultProviderId()).toBe("local"); + }); + + it("prefers daytona when an api key is configured", () => { + const registry = createProviderRegistry( + ConfigSchema.parse({ + ...makeConfig(), + providers: { + ...makeConfig().providers, + daytona: { + ...makeConfig().providers.daytona, + apiKey: "test-token", + }, + }, + }), + ); + expect(registry.defaultProviderId()).toBe("daytona"); + }); + + it("returns the built-in provider", () => { + const registry = createProviderRegistry(makeConfig()); + expect(registry.get("daytona").id()).toBe("daytona"); + }); +}); diff --git a/foundry/packages/backend/test/sandbox-config.test.ts b/foundry/packages/backend/test/sandbox-config.test.ts deleted file mode 100644 index 354f794..0000000 --- a/foundry/packages/backend/test/sandbox-config.test.ts +++ /dev/null @@ -1,50 +0,0 @@ -import { describe, expect, it } from "vitest"; -import { ConfigSchema, type AppConfig } from "@sandbox-agent/foundry-shared"; -import { availableSandboxProviderIds, defaultSandboxProviderId, resolveSandboxProviderId } from "../src/sandbox-config.js"; - -function makeConfig(overrides?: Partial<AppConfig>): AppConfig { - return ConfigSchema.parse({ - auto_submit: true, - notify: ["terminal"], - organization: { default: "default" }, - backend: { - host: "127.0.0.1", - port: 7741, - dbPath: "~/.local/share/foundry/task.db", - opencode_poll_interval: 2, - github_poll_interval: 30, - backup_interval_secs: 3600, - backup_retention_days: 7, - }, - sandboxProviders: { - local: {}, - e2b: {}, - }, - ...overrides, - }); -} - -describe("sandbox config", () => { - it("defaults to local when e2b is not configured", () => { - const config = makeConfig(); - expect(defaultSandboxProviderId(config)).toBe("local"); - expect(availableSandboxProviderIds(config)).toEqual(["local"]); - }); - - it("prefers e2b when an api key is configured", () => { - const config = makeConfig({ -
sandboxProviders: { - local: {}, - e2b: { apiKey: "test-token" }, - }, - }); - expect(defaultSandboxProviderId(config)).toBe("e2b"); - expect(availableSandboxProviderIds(config)).toEqual(["e2b", "local"]); - expect(resolveSandboxProviderId(config, "e2b")).toBe("e2b"); - }); - - it("rejects selecting e2b without an api key", () => { - const config = makeConfig(); - expect(() => resolveSandboxProviderId(config, "e2b")).toThrow("E2B provider is not configured"); - }); -}); diff --git a/foundry/packages/backend/test/sandbox-instance-persist.test.ts b/foundry/packages/backend/test/sandbox-instance-persist.test.ts new file mode 100644 index 0000000..a3692ea --- /dev/null +++ b/foundry/packages/backend/test/sandbox-instance-persist.test.ts @@ -0,0 +1,21 @@ +import { describe, expect, it } from "vitest"; +import { resolveEventListOffset } from "../src/actors/sandbox-instance/persist.js"; + +describe("sandbox-instance persist event offset", () => { + it("returns newest tail when cursor is omitted", () => { + expect(resolveEventListOffset({ total: 180, limit: 50 })).toBe(130); + }); + + it("returns zero when total rows are below page size", () => { + expect(resolveEventListOffset({ total: 20, limit: 50 })).toBe(0); + }); + + it("uses explicit cursor when provided", () => { + expect(resolveEventListOffset({ cursor: "7", total: 180, limit: 50 })).toBe(7); + }); + + it("normalizes invalid cursors to zero", () => { + expect(resolveEventListOffset({ cursor: "-3", total: 180, limit: 50 })).toBe(0); + expect(resolveEventListOffset({ cursor: "not-a-number", total: 180, limit: 50 })).toBe(0); + }); +}); diff --git a/foundry/packages/backend/test/stack-model.test.ts b/foundry/packages/backend/test/stack-model.test.ts new file mode 100644 index 0000000..ca0a79f --- /dev/null +++ b/foundry/packages/backend/test/stack-model.test.ts @@ -0,0 +1,34 @@ +import { describe, expect, it } from "vitest"; +import { normalizeParentBranch, parentLookupFromStack, sortBranchesForOverview } from 
"../src/actors/project/stack-model.js"; + +describe("stack-model", () => { + it("normalizes self-parent references to null", () => { + expect(normalizeParentBranch("feature/a", "feature/a")).toBeNull(); + expect(normalizeParentBranch("feature/a", "main")).toBe("main"); + expect(normalizeParentBranch("feature/a", null)).toBeNull(); + }); + + it("builds parent lookup with sanitized entries", () => { + const lookup = parentLookupFromStack([ + { branchName: "feature/a", parentBranch: "main" }, + { branchName: "feature/b", parentBranch: "feature/b" }, + { branchName: " ", parentBranch: "main" }, + ]); + + expect(lookup.get("feature/a")).toBe("main"); + expect(lookup.get("feature/b")).toBeNull(); + expect(lookup.has(" ")).toBe(false); + }); + + it("orders branches by graph depth and handles cycles safely", () => { + const rows = sortBranchesForOverview([ + { branchName: "feature/b", parentBranch: "feature/a", updatedAt: 200 }, + { branchName: "feature/a", parentBranch: "main", updatedAt: 100 }, + { branchName: "main", parentBranch: null, updatedAt: 50 }, + { branchName: "cycle-a", parentBranch: "cycle-b", updatedAt: 300 }, + { branchName: "cycle-b", parentBranch: "cycle-a", updatedAt: 250 }, + ]); + + expect(rows.map((row) => row.branchName)).toEqual(["main", "feature/a", "feature/b", "cycle-a", "cycle-b"]); + }); +}); diff --git a/foundry/packages/backend/test/workbench-unread.test.ts b/foundry/packages/backend/test/workbench-unread.test.ts new file mode 100644 index 0000000..f7ed201 --- /dev/null +++ b/foundry/packages/backend/test/workbench-unread.test.ts @@ -0,0 +1,16 @@ +import { describe, expect, it } from "vitest"; +import { shouldMarkSessionUnreadForStatus } from "../src/actors/task/workbench.js"; + +describe("workbench unread status transitions", () => { + it("marks unread when a running session first becomes idle", () => { + expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: Date.now() - 1_000 }, "idle")).toBe(true); + }); + + it("does not re-mark 
unread on repeated idle polls after thinking has cleared", () => { + expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: null }, "idle")).toBe(false); + }); + + it("does not mark unread while the session is still running", () => { + expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: Date.now() - 1_000 }, "running")).toBe(false); + }); +}); diff --git a/foundry/packages/backend/test/organization-isolation.test.ts b/foundry/packages/backend/test/workspace-isolation.test.ts similarity index 59% rename from foundry/packages/backend/test/organization-isolation.test.ts rename to foundry/packages/backend/test/workspace-isolation.test.ts index f5d58f2..fd0689d 100644 --- a/foundry/packages/backend/test/organization-isolation.test.ts +++ b/foundry/packages/backend/test/workspace-isolation.test.ts @@ -6,10 +6,8 @@ import { execFileSync } from "node:child_process"; import { setTimeout as delay } from "node:timers/promises"; import { describe, expect, it } from "vitest"; import { setupTest } from "rivetkit/test"; -import { organizationKey } from "../src/actors/keys.js"; +import { workspaceKey } from "../src/actors/keys.js"; import { registry } from "../src/actors/index.js"; -import { organizationWorkflowQueueName } from "../src/actors/organization/queues.js"; -import { repoIdFromRemote } from "../src/services/repo.js"; import { createTestDriver } from "./helpers/test-driver.js"; import { createTestRuntimeContext } from "./helpers/test-context.js"; @@ -26,60 +24,59 @@ function createRepo(): { repoPath: string } { return { repoPath }; } -async function waitForOrganizationRows(ws: any, organizationId: string, expectedCount: number) { +async function waitForWorkspaceRows(ws: any, workspaceId: string, expectedCount: number) { for (let attempt = 0; attempt < 40; attempt += 1) { - const rows = await ws.listTasks({ organizationId }); + const rows = await ws.listTasks({ workspaceId }); if (rows.length >= expectedCount) { return rows; } await delay(50); } - return 
ws.listTasks({ organizationId }); + return ws.listTasks({ workspaceId }); } -describe("organization isolation", () => { - it.skipIf(!runActorIntegration)("keeps task lists isolated by organization", async (t) => { +describe("workspace isolation", () => { + it.skipIf(!runActorIntegration)("keeps task lists isolated by workspace", async (t) => { const testDriver = createTestDriver(); createTestRuntimeContext(testDriver); const { client } = await setupTest(t, registry); - const wsA = await client.organization.getOrCreate(organizationKey("alpha"), { + const wsA = await client.workspace.getOrCreate(workspaceKey("alpha"), { createWithInput: "alpha", }); - const wsB = await client.organization.getOrCreate(organizationKey("beta"), { + const wsB = await client.workspace.getOrCreate(workspaceKey("beta"), { createWithInput: "beta", }); const { repoPath } = createRepo(); - const repoId = repoIdFromRemote(repoPath); - await wsA.send(organizationWorkflowQueueName("organization.command.github.repository_projection.apply"), { repoId, remoteUrl: repoPath }, { wait: true }); - await wsB.send(organizationWorkflowQueueName("organization.command.github.repository_projection.apply"), { repoId, remoteUrl: repoPath }, { wait: true }); + const repoA = await wsA.addRepo({ workspaceId: "alpha", remoteUrl: repoPath }); + const repoB = await wsB.addRepo({ workspaceId: "beta", remoteUrl: repoPath }); await wsA.createTask({ - organizationId: "alpha", - repoId, + workspaceId: "alpha", + repoId: repoA.repoId, task: "task A", - sandboxProviderId: "local", + providerId: "daytona", explicitBranchName: "feature/a", explicitTitle: "A", }); await wsB.createTask({ - organizationId: "beta", - repoId, + workspaceId: "beta", + repoId: repoB.repoId, task: "task B", - sandboxProviderId: "local", + providerId: "daytona", explicitBranchName: "feature/b", explicitTitle: "B", }); - const aRows = await waitForOrganizationRows(wsA, "alpha", 1); - const bRows = await waitForOrganizationRows(wsB, "beta", 1); + const 
aRows = await waitForWorkspaceRows(wsA, "alpha", 1); + const bRows = await waitForWorkspaceRows(wsB, "beta", 1); expect(aRows.length).toBe(1); expect(bRows.length).toBe(1); - expect(aRows[0]?.organizationId).toBe("alpha"); - expect(bRows[0]?.organizationId).toBe("beta"); + expect(aRows[0]?.workspaceId).toBe("alpha"); + expect(bRows[0]?.workspaceId).toBe("beta"); expect(aRows[0]?.taskId).not.toBe(bRows[0]?.taskId); }); }); diff --git a/foundry/packages/backend/test/organization-star-sandbox-agent-repo.test.ts b/foundry/packages/backend/test/workspace-star-sandbox-agent-repo.test.ts similarity index 80% rename from foundry/packages/backend/test/organization-star-sandbox-agent-repo.test.ts rename to foundry/packages/backend/test/workspace-star-sandbox-agent-repo.test.ts index b3a2410..8eabb99 100644 --- a/foundry/packages/backend/test/organization-star-sandbox-agent-repo.test.ts +++ b/foundry/packages/backend/test/workspace-star-sandbox-agent-repo.test.ts @@ -1,14 +1,14 @@ // @ts-nocheck import { describe, expect, it } from "vitest"; import { setupTest } from "rivetkit/test"; -import { organizationKey } from "../src/actors/keys.js"; +import { workspaceKey } from "../src/actors/keys.js"; import { registry } from "../src/actors/index.js"; import { createTestDriver } from "./helpers/test-driver.js"; import { createTestRuntimeContext } from "./helpers/test-context.js"; const runActorIntegration = process.env.HF_ENABLE_ACTOR_INTEGRATION_TESTS === "1"; -describe("organization star sandbox agent repo", () => { +describe("workspace star sandbox agent repo", () => { it.skipIf(!runActorIntegration)("stars the sandbox agent repo through the github driver", async (t) => { const calls: string[] = []; const testDriver = createTestDriver({ @@ -26,11 +26,11 @@ describe("organization star sandbox agent repo", () => { createTestRuntimeContext(testDriver); const { client } = await setupTest(t, registry); - const ws = await client.organization.getOrCreate(organizationKey("alpha"), { + 
const ws = await client.workspace.getOrCreate(workspaceKey("alpha"), { createWithInput: "alpha", }); - const result = await ws.starSandboxAgentRepo({ organizationId: "alpha" }); + const result = await ws.starSandboxAgentRepo({ workspaceId: "alpha" }); expect(calls).toEqual(["rivet-dev/sandbox-agent"]); expect(result.repo).toBe("rivet-dev/sandbox-agent"); diff --git a/foundry/packages/backend/test/workspace-unread.test.ts b/foundry/packages/backend/test/workspace-unread.test.ts deleted file mode 100644 index 5f7221a..0000000 --- a/foundry/packages/backend/test/workspace-unread.test.ts +++ /dev/null @@ -1,86 +0,0 @@ -import { describe, expect, it } from "vitest"; -import { requireSendableSessionMeta, shouldMarkSessionUnreadForStatus, shouldRecreateSessionForModelChange } from "../src/actors/task/workspace.js"; - -describe("workspace unread status transitions", () => { - it("marks unread when a running session first becomes idle", () => { - expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: Date.now() - 1_000 }, "idle")).toBe(true); - }); - - it("does not re-mark unread on repeated idle polls after thinking has cleared", () => { - expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: null }, "idle")).toBe(false); - }); - - it("does not mark unread while the session is still running", () => { - expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: Date.now() - 1_000 }, "running")).toBe(false); - }); -}); - -describe("workspace model changes", () => { - it("recreates an unused ready session so the selected model takes effect", () => { - expect( - shouldRecreateSessionForModelChange({ - status: "ready", - sandboxSessionId: "session-1", - created: false, - transcript: [], - }), - ).toBe(true); - }); - - it("does not recreate a session once the conversation has started", () => { - expect( - shouldRecreateSessionForModelChange({ - status: "ready", - sandboxSessionId: "session-1", - created: true, - transcript: [], - }), - ).toBe(false); - }); - - it("does 
not recreate pending or anonymous sessions", () => { - expect( - shouldRecreateSessionForModelChange({ - status: "pending_session_create", - sandboxSessionId: "session-1", - created: false, - transcript: [], - }), - ).toBe(false); - expect( - shouldRecreateSessionForModelChange({ - status: "ready", - sandboxSessionId: null, - created: false, - transcript: [], - }), - ).toBe(false); - }); -}); - -describe("workspace send readiness", () => { - it("rejects unknown sessions", () => { - expect(() => requireSendableSessionMeta(null, "session-1")).toThrow("Unknown workspace session: session-1"); - }); - - it("rejects pending sessions", () => { - expect(() => - requireSendableSessionMeta( - { - status: "pending_session_create", - sandboxSessionId: null, - }, - "session-2", - ), - ).toThrow("Session is not ready (status: pending_session_create). Wait for session provisioning to complete."); - }); - - it("accepts ready sessions with a sandbox session id", () => { - const meta = { - status: "ready", - sandboxSessionId: "session-1", - }; - - expect(requireSendableSessionMeta(meta, "session-3")).toBe(meta); - }); -}); diff --git a/foundry/packages/cli/src/index.ts b/foundry/packages/cli/src/index.ts index fdf5a19..3e77291 100644 --- a/foundry/packages/cli/src/index.ts +++ b/foundry/packages/cli/src/index.ts @@ -8,7 +8,7 @@ import { ensureBackendRunning, getBackendStatus, parseBackendPort, stopBackend } import { writeStderr, writeStdout } from "./io.js"; import { openEditorForTask } from "./task-editor.js"; import { spawnCreateTmuxWindow } from "./tmux.js"; -import { loadConfig, resolveOrganization, saveConfig } from "./organization/config.js"; +import { loadConfig, resolveWorkspace, saveConfig } from "./workspace/config.js"; async function ensureBunRuntime(): Promise { if (typeof (globalThis as { Bun?: unknown }).Bun !== "undefined") { @@ -41,9 +41,9 @@ async function ensureBunRuntime(): Promise { throw new Error("hf requires Bun runtime. 
Set HF_BUN or install Bun at ~/.bun/bin/bun."); } -async function runTuiCommand(config: ReturnType<typeof loadConfig>, organizationId: string): Promise<void> { +async function runTuiCommand(config: ReturnType<typeof loadConfig>, workspaceId: string): Promise<void> { const mod = await import("./tui.js"); - await mod.runTui(config, organizationId); + await mod.runTui(config, workspaceId); } function readOption(args: string[], flag: string): string | undefined { @@ -87,92 +87,6 @@ function positionals(args: string[]): string[] { return out; } -function normalizeRepoSelector(value: string): string { - let normalized = value.trim(); - if (!normalized) { - return ""; - } - - normalized = normalized.replace(/\/+$/, ""); - if (/^[A-Za-z0-9_.-]+\/[A-Za-z0-9_.-]+$/.test(normalized)) { - return `https://github.com/${normalized}.git`; - } - - if (/^(?:www\.)?github\.com\/.+/i.test(normalized)) { - normalized = `https://${normalized.replace(/^www\./i, "")}`; - } - - try { - if (/^https?:\/\//i.test(normalized)) { - const url = new URL(normalized); - const hostname = url.hostname.replace(/^www\./i, ""); - if (hostname.toLowerCase() === "github.com") { - const parts = url.pathname.split("/").filter(Boolean); - if (parts.length >= 2) { - return `${url.protocol}//${hostname}/${parts[0]}/${(parts[1] ?? "").replace(/\.git$/i, "")}.git`; - } - } - url.search = ""; - url.hash = ""; - return url.toString().replace(/\/+$/, ""); - } - } catch { - // Keep the selector as-is for matching below. - } - - return normalized; -} - -function githubRepoFullNameFromSelector(value: string): string | null { - const normalized = normalizeRepoSelector(value); - try { - const url = new URL(normalized); - if (url.hostname.replace(/^www\./i, "").toLowerCase() !== "github.com") { - return null; - } - const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); - if (parts.length < 2) { - return null; - } - return `${parts[0]}/${(parts[1] ??
"").replace(/\.git$/i, "")}`; - } catch { - return null; - } -} - -async function resolveImportedRepo( - client: ReturnType, - organizationId: string, - repoSelector: string, -): Promise>[number]> { - const selector = repoSelector.trim(); - if (!selector) { - throw new Error("Missing required --repo "); - } - - const normalizedSelector = normalizeRepoSelector(selector); - const selectorFullName = githubRepoFullNameFromSelector(selector); - const repos = await client.listRepos(organizationId); - const match = repos.find((repo) => { - if (repo.repoId === selector) { - return true; - } - if (normalizeRepoSelector(repo.remoteUrl) === normalizedSelector) { - return true; - } - const repoFullName = githubRepoFullNameFromSelector(repo.remoteUrl); - return Boolean(selectorFullName && repoFullName && repoFullName === selectorFullName); - }); - - if (!match) { - throw new Error( - `Repo not available in organization ${organizationId}: ${repoSelector}. Create it in GitHub first, then sync repos in Foundry before running hf create.`, - ); - } - - return match; -} - function printUsage(): void { writeStdout(` Usage: @@ -180,22 +94,22 @@ Usage: hf backend stop [--host HOST] [--port PORT] hf backend status hf backend inspect - hf status [--organization ORG] [--json] - hf history [--organization ORG] [--limit N] [--branch NAME] [--task ID] [--json] - hf organization use - hf tui [--organization ORG] + hf status [--workspace WS] [--json] + hf history [--workspace WS] [--limit N] [--branch NAME] [--task ID] [--json] + hf workspace use + hf tui [--workspace WS] - hf create [task] [--organization ORG] --repo [--name NAME|--branch NAME] [--title TITLE] [--agent claude|codex] [--on BRANCH] - hf list [--organization ORG] [--format table|json] [--full] - hf switch [task-id | -] [--organization ORG] - hf attach [--organization ORG] - hf merge [--organization ORG] - hf archive [--organization ORG] - hf push [--organization ORG] - hf sync [--organization ORG] - hf kill [--organization ORG] 
[--delete-branch] [--abandon] - hf prune [--organization ORG] [--dry-run] [--yes] - hf statusline [--organization ORG] [--format table|claude-code] + hf create [task] [--workspace WS] --repo [--name NAME|--branch NAME] [--title TITLE] [--agent claude|codex] [--on BRANCH] + hf list [--workspace WS] [--format table|json] [--full] + hf switch [task-id | -] [--workspace WS] + hf attach [--workspace WS] + hf merge [--workspace WS] + hf archive [--workspace WS] + hf push [--workspace WS] + hf sync [--workspace WS] + hf kill [--workspace WS] [--delete-branch] [--abandon] + hf prune [--workspace WS] [--dry-run] [--yes] + hf statusline [--workspace WS] [--format table|claude-code] hf db path hf db nuke @@ -209,24 +123,24 @@ Tips: function printStatusUsage(): void { writeStdout(` Usage: - hf status [--organization ORG] [--json] + hf status [--workspace WS] [--json] Text Output: - organization= + workspace= backend running= pid= version= tasks total= status queued= running= idle= archived= killed= error= - sandboxProviders = ... - sandboxProviders - + providers = ... + providers - JSON Output: { - "organizationId": "default", + "workspaceId": "default", "backend": { ...backend status object... 
}, "tasks": { "total": 4, "byStatus": { "queued": 0, "running": 1, "idle": 2, "archived": 1, "killed": 0, "error": 0 }, - "byProvider": { "local": 4 } + "byProvider": { "daytona": 4 } } } `); @@ -235,7 +149,7 @@ JSON Output: function printHistoryUsage(): void { writeStdout(` Usage: - hf history [--organization ORG] [--limit N] [--branch NAME] [--task ID] [--json] + hf history [--workspace WS] [--limit N] [--branch NAME] [--task ID] [--json] Text Output: \t\t\t @@ -250,23 +164,18 @@ JSON Output: [ { "id": "...", - "organizationId": "default", + "workspaceId": "default", "kind": "task.created", "taskId": "...", "repoId": "...", "branchName": "feature/foo", - "payloadJson": "{\\"sandboxProviderId\\":\\"local\\"}", + "payloadJson": "{\\"providerId\\":\\"daytona\\"}", "createdAt": 1770607522229 } ] `); } -async function listDetailedTasks(client: ReturnType, organizationId: string): Promise { - const rows = await client.listTasks(organizationId); - return await Promise.all(rows.map(async (row) => await client.getTask(organizationId, row.taskId))); -} - async function handleBackend(args: string[]): Promise { const sub = args[0] ?? 
"start"; const config = loadConfig(); @@ -323,38 +232,38 @@ async function handleBackend(args: string[]): Promise<void> { throw new Error(`Unknown backend subcommand: ${sub}`); } -async function handleOrganization(args: string[]): Promise<void> { +async function handleWorkspace(args: string[]): Promise<void> { const sub = args[0]; if (sub !== "use") { - throw new Error("Usage: hf organization use <name>"); + throw new Error("Usage: hf workspace use <name>"); } const name = args[1]; if (!name) { - throw new Error("Missing organization name"); + throw new Error("Missing workspace name"); } const config = loadConfig(); - config.organization.default = name; + config.workspace.default = name; saveConfig(config); const client = createBackendClientFromConfig(config); try { - await client.useOrganization(name); + await client.useWorkspace(name); } catch { // Backend may not be running yet. Config is already updated. } - writeStdout(`organization=${name}`); + writeStdout(`workspace=${name}`); } async function handleList(args: string[]): Promise<void> { const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const format = readOption(args, "--format") ?? "table"; const full = hasFlag(args, "--full"); const client = createBackendClientFromConfig(config); - const rows = await listDetailedTasks(client, organizationId); + const rows = await client.listTasks(workspaceId); if (format === "json") { writeStdout(JSON.stringify(rows, null, 2)); @@ -368,10 +277,10 @@ async function handleList(args: string[]): Promise<void> { for (const row of rows) { const age = formatRelativeAge(row.updatedAt); - let line = `${row.taskId}\t${row.branchName}\t${row.status}\t${row.sandboxProviderId}\t${age}`; + let line = `${row.taskId}\t${row.branchName}\t${row.status}\t${row.providerId}\t${age}`; if (full) { - const preview = row.task.length > 60 ?
`${row.task.slice(0, 57)}...` : row.task; - line += `\t${row.title}\t${preview}\t${row.activeSessionId ?? "-"}\t${row.activeSandboxId ?? "-"}`; + const task = row.task.length > 60 ? `${row.task.slice(0, 57)}...` : row.task; + line += `\t${row.title}\t${task}\t${row.activeSessionId ?? "-"}\t${row.activeSandboxId ?? "-"}`; } writeStdout(line); } @@ -383,9 +292,9 @@ async function handlePush(args: string[]): Promise<void> { throw new Error("Missing task id for push"); } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const client = createBackendClientFromConfig(config); - await client.runAction(organizationId, taskId, "push"); + await client.runAction(workspaceId, taskId, "push"); writeStdout("ok"); } @@ -395,9 +304,9 @@ async function handleSync(args: string[]): Promise<void> { throw new Error("Missing task id for sync"); } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const client = createBackendClientFromConfig(config); - await client.runAction(organizationId, taskId, "sync"); + await client.runAction(workspaceId, taskId, "sync"); writeStdout("ok"); } @@ -407,7 +316,7 @@ async function handleKill(args: string[]): Promise<void> { throw new Error("Missing task id for kill"); } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const deleteBranch = hasFlag(args, "--delete-branch"); const abandon = hasFlag(args, "--abandon"); @@ -419,17 +328,17 @@ async function handleKill(args: string[]): Promise<void> { } const client = createBackendClientFromConfig(config); - await client.runAction(organizationId, taskId, "kill"); + await
client.runAction(workspaceId, taskId, "kill"); writeStdout("ok"); } async function handlePrune(args: string[]): Promise<void> { const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const dryRun = hasFlag(args, "--dry-run"); const yes = hasFlag(args, "--yes"); const client = createBackendClientFromConfig(config); - const rows = await listDetailedTasks(client, organizationId); + const rows = await client.listTasks(workspaceId); const prunable = rows.filter((r) => r.status === "archived" || r.status === "killed"); if (prunable.length === 0) { @@ -457,10 +366,10 @@ async function handleStatusline(args: string[]): Promise<void> { const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const format = readOption(args, "--format") ??
"table"; const client = createBackendClientFromConfig(config); - const rows = await listDetailedTasks(client, organizationId); + const rows = await client.listTasks(workspaceId); const summary = summarizeTasks(rows); const running = summary.byStatus.running; const idle = summary.byStatus.idle; @@ -493,7 +402,7 @@ async function handleDb(args: string[]): Promise { async function waitForTaskReady( client: ReturnType, - organizationId: string, + workspaceId: string, taskId: string, timeoutMs: number, ): Promise { @@ -501,7 +410,7 @@ async function waitForTaskReady( let delayMs = 250; for (;;) { - const record = await client.getTask(organizationId, taskId); + const record = await client.getTask(workspaceId, taskId); const hasName = Boolean(record.branchName && record.title); const hasSandbox = Boolean(record.activeSandboxId); @@ -523,11 +432,11 @@ async function waitForTaskReady( async function handleCreate(args: string[]): Promise { const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); - const repoSelector = readOption(args, "--repo"); - if (!repoSelector) { - throw new Error("Missing required --repo "); + const repoRemote = readOption(args, "--repo"); + if (!repoRemote) { + throw new Error("Missing required --repo "); } const explicitBranchName = readOption(args, "--name") ?? 
readOption(args, "--branch"); const explicitTitle = readOption(args, "--title"); @@ -537,15 +446,15 @@ async function handleCreate(args: string[]): Promise { const onBranch = readOption(args, "--on"); const taskFromArgs = positionals(args).join(" ").trim(); - const taskPrompt = taskFromArgs || openEditorForTask(); + const task = taskFromArgs || openEditorForTask(); const client = createBackendClientFromConfig(config); - const repo = await resolveImportedRepo(client, organizationId, repoSelector); + const repo = await client.addRepo(workspaceId, repoRemote); const payload = CreateTaskInputSchema.parse({ - organizationId, + workspaceId, repoId: repo.repoId, - task: taskPrompt, + task, explicitTitle: explicitTitle || undefined, explicitBranchName: explicitBranchName || undefined, agentType, @@ -553,30 +462,30 @@ async function handleCreate(args: string[]): Promise { }); const created = await client.createTask(payload); - const createdTask = await waitForTaskReady(client, organizationId, created.taskId, 180_000); - const switched = await client.switchTask(organizationId, createdTask.taskId); - const attached = await client.attachTask(organizationId, createdTask.taskId); + const task = await waitForTaskReady(client, workspaceId, created.taskId, 180_000); + const switched = await client.switchTask(workspaceId, task.taskId); + const attached = await client.attachTask(workspaceId, task.taskId); - writeStdout(`Branch: ${createdTask.branchName ?? "-"}`); - writeStdout(`Task: ${createdTask.taskId}`); - writeStdout(`Provider: ${createdTask.sandboxProviderId}`); + writeStdout(`Branch: ${task.branchName ?? "-"}`); + writeStdout(`Task: ${task.taskId}`); + writeStdout(`Provider: ${task.providerId}`); writeStdout(`Session: ${attached.sessionId ?? "none"}`); writeStdout(`Target: ${switched.switchTarget || attached.target}`); - writeStdout(`Title: ${createdTask.title ?? "-"}`); + writeStdout(`Title: ${task.title ?? 
"-"}`); const tmuxResult = spawnCreateTmuxWindow({ - branchName: createdTask.branchName ?? createdTask.taskId, + branchName: task.branchName ?? task.taskId, targetPath: switched.switchTarget || attached.target, sessionId: attached.sessionId, }); if (tmuxResult.created) { - writeStdout(`Window: created (${createdTask.branchName})`); + writeStdout(`Window: created (${task.branchName})`); return; } writeStdout(""); - writeStdout(`Run: hf switch ${createdTask.taskId}`); + writeStdout(`Run: hf switch ${task.taskId}`); if ((switched.switchTarget || attached.target).startsWith("/")) { writeStdout(`cd ${switched.switchTarget || attached.target}`); } @@ -584,8 +493,8 @@ async function handleCreate(args: string[]): Promise { async function handleTui(args: string[]): Promise { const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); - await runTuiCommand(config, organizationId); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); + await runTuiCommand(config, workspaceId); } async function handleStatus(args: string[]): Promise { @@ -595,17 +504,17 @@ async function handleStatus(args: string[]): Promise { } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const client = createBackendClientFromConfig(config); const backendStatus = await getBackendStatus(config.backend.host, config.backend.port); - const rows = await listDetailedTasks(client, organizationId); + const rows = await client.listTasks(workspaceId); const summary = summarizeTasks(rows); if (hasFlag(args, "--json")) { writeStdout( JSON.stringify( { - organizationId, + workspaceId, backend: backendStatus, tasks: { total: summary.total, @@ -620,7 +529,7 @@ async function handleStatus(args: string[]): Promise { return; } - writeStdout(`organization=${organizationId}`); + 
writeStdout(`workspace=${workspaceId}`); writeStdout(`backend running=${backendStatus.running} pid=${backendStatus.pid ?? "unknown"} version=${backendStatus.version ?? "unknown"}`); writeStdout(`tasks total=${summary.total}`); writeStdout( @@ -629,7 +538,7 @@ async function handleStatus(args: string[]): Promise<void> { const providerSummary = Object.entries(summary.byProvider) .map(([provider, count]) => `${provider}=${count}`) .join(" "); - writeStdout(`sandboxProviders ${providerSummary || "-"}`); + writeStdout(`providers ${providerSummary || "-"}`); } async function handleHistory(args: string[]): Promise<void> { @@ -639,13 +548,13 @@ async function handleHistory(args: string[]): Promise<void> { } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const limit = parseIntOption(readOption(args, "--limit"), 20, "limit"); const branch = readOption(args, "--branch"); const taskId = readOption(args, "--task"); const client = createBackendClientFromConfig(config); const rows = await client.listHistory({ - organizationId, + workspaceId, limit, branch: branch || undefined, taskId: taskId || undefined, @@ -684,11 +593,11 @@ async function handleSwitchLike(cmd: string, args: string[]): Promise<void> { } const config = loadConfig(); - const organizationId = resolveOrganization(readOption(args, "--organization"), config); + const workspaceId = resolveWorkspace(readOption(args, "--workspace"), config); const client = createBackendClientFromConfig(config); if (cmd === "switch" && taskId === "-") { - const rows = await listDetailedTasks(client, organizationId); + const rows = await client.listTasks(workspaceId); const active = rows.filter((r) => { const group = groupTaskStatus(r.status); return group === "running" || group === "idle" || group === "queued"; }); @@ -702,19 +611,19 @@ async function handleSwitchLike(cmd: string, args: string[]): Promise<void> { } if
(cmd === "switch") { - const result = await client.switchTask(organizationId, taskId); + const result = await client.switchTask(workspaceId, taskId); writeStdout(`cd ${result.switchTarget}`); return; } if (cmd === "attach") { - const result = await client.attachTask(organizationId, taskId); + const result = await client.attachTask(workspaceId, taskId); writeStdout(`target=${result.target} session=${result.sessionId ?? "none"}`); return; } if (cmd === "merge" || cmd === "archive") { - await client.runAction(organizationId, taskId, cmd); + await client.runAction(workspaceId, taskId, cmd); writeStdout("ok"); return; } @@ -747,8 +656,8 @@ async function main(): Promise { return; } - if (cmd === "organization") { - await handleOrganization(rest); + if (cmd === "workspace") { + await handleWorkspace(rest); return; } diff --git a/foundry/packages/cli/src/theme.ts b/foundry/packages/cli/src/theme.ts index 633c079..5315a44 100644 --- a/foundry/packages/cli/src/theme.ts +++ b/foundry/packages/cli/src/theme.ts @@ -588,7 +588,7 @@ function pointer(obj: JsonObject, parts: string[]): unknown { function opencodeConfigPaths(baseDir: string): string[] { const paths: string[] = []; - const rootish = opencodeRepositoryConfigPaths(baseDir); + const rootish = opencodeProjectConfigPaths(baseDir); paths.push(...rootish); const configDir = process.env.XDG_CONFIG_HOME || join(homedir(), ".config"); @@ -611,12 +611,12 @@ function opencodeThemeDirs(configDir: string | undefined, baseDir: string): stri dirs.push(join(xdgConfig, "opencode", "themes")); dirs.push(join(homedir(), ".opencode", "themes")); - dirs.push(...opencodeRepositoryThemeDirs(baseDir)); + dirs.push(...opencodeProjectThemeDirs(baseDir)); return dirs; } -function opencodeRepositoryConfigPaths(baseDir: string): string[] { +function opencodeProjectConfigPaths(baseDir: string): string[] { const dirs = ancestorDirs(baseDir); const out: string[] = []; for (const dir of dirs) { @@ -628,7 +628,7 @@ function 
opencodeRepositoryConfigPaths(baseDir: string): string[] { return out; } -function opencodeRepositoryThemeDirs(baseDir: string): string[] { +function opencodeProjectThemeDirs(baseDir: string): string[] { const dirs = ancestorDirs(baseDir); const out: string[] = []; for (const dir of dirs) { diff --git a/foundry/packages/cli/src/tui.ts b/foundry/packages/cli/src/tui.ts index 062bb95..d561565 100644 --- a/foundry/packages/cli/src/tui.ts +++ b/foundry/packages/cli/src/tui.ts @@ -1,4 +1,4 @@ -import type { AppConfig, TaskRecord, WorkspaceTaskDetail } from "@sandbox-agent/foundry-shared"; +import type { AppConfig, TaskRecord } from "@sandbox-agent/foundry-shared"; import { spawnSync } from "node:child_process"; import { createBackendClientFromConfig, filterTasks, formatRelativeAge, groupTaskStatus } from "@sandbox-agent/foundry-client"; import { CLI_BUILD_ID } from "./build-id.js"; @@ -51,30 +51,11 @@ interface DisplayRow { age: string; } -type TuiTaskRow = TaskRecord & Pick<WorkspaceTaskDetail, "pullRequest"> & { activeSessionId?: string | null }; - interface RenderOptions { width?: number; height?: number; } -async function listDetailedTasks(client: ReturnType<typeof createBackendClientFromConfig>, organizationId: string): Promise<TuiTaskRow[]> { - const rows = await client.listTasks(organizationId); - return await Promise.all( - rows.map(async (row) => { - const [task, detail] = await Promise.all([ - client.getTask(organizationId, row.repoId, row.taskId), - client.getTaskDetail(organizationId, row.repoId, row.taskId).catch(() => null), - ]); - return { - ...task, - pullRequest: detail?.pullRequest ?? null, - activeSessionId: detail?.activeSessionId ?? null, - }; - }), - ); -} - function pad(input: string, width: number): string { if (width <= 0) { return ""; } @@ -157,17 +138,29 @@ function agentSymbol(status: TaskRecord["status"]): string { return "-"; } -function toDisplayRow(row: TuiTaskRow): DisplayRow { - const prLabel = row.pullRequest ? `#${row.pullRequest.number}` : "-"; - const reviewLabel = row.pullRequest ? (row.pullRequest.isDraft ?
"draft" : row.pullRequest.state.toLowerCase()) : "-"; +function toDisplayRow(row: TaskRecord): DisplayRow { + const conflictPrefix = row.conflictsWithMain === "true" ? "\u26A0 " : ""; + + const prLabel = row.prUrl ? `#${row.prUrl.match(/\/pull\/(\d+)/)?.[1] ?? "?"}` : row.prSubmitted ? "sub" : "-"; + + const ciLabel = row.ciStatus ?? "-"; + const reviewLabel = row.reviewStatus + ? row.reviewStatus === "approved" + ? "ok" + : row.reviewStatus === "changes_requested" + ? "chg" + : row.reviewStatus === "pending" + ? "..." + : row.reviewStatus + : "-"; return { - name: row.title || row.branchName || row.taskId, - diff: "-", + name: `${conflictPrefix}${row.title || row.branchName}`, + diff: row.diffStat ?? "-", agent: agentSymbol(row.status), pr: prLabel, - author: row.pullRequest?.authorLogin ?? "-", - ci: "-", + author: row.prAuthor ?? "-", + ci: ciLabel, review: reviewLabel, age: formatRelativeAge(row.updatedAt), }; @@ -188,9 +181,9 @@ function helpLines(width: number): string[] { } export function formatRows( - rows: TuiTaskRow[], + rows: TaskRecord[], selected: number, - organizationId: string, + workspaceId: string, status: string, searchQuery = "", showHelp = false, @@ -219,7 +212,7 @@ export function formatRows( return `${marker}${pad(display.name, branchWidth)} ${pad(display.diff, COLUMN_WIDTHS.diff)} ${pad(display.agent, COLUMN_WIDTHS.agent)} ${pad(display.pr, COLUMN_WIDTHS.pr)} ${pad(display.author, COLUMN_WIDTHS.author)} ${pad(display.ci, COLUMN_WIDTHS.ci)} ${pad(display.review, COLUMN_WIDTHS.review)} ${pad(display.age, COLUMN_WIDTHS.age)}`; }); - const footer = fitLine(buildFooterLine(totalWidth, ["Ctrl-H:cheatsheet", `organization:${organizationId}`, status], `v${CLI_BUILD_ID}`), totalWidth); + const footer = fitLine(buildFooterLine(totalWidth, ["Ctrl-H:cheatsheet", `workspace:${workspaceId}`, status], `v${CLI_BUILD_ID}`), totalWidth); const contentHeight = totalHeight - 1; const lines = [...header, ...body].map((line) => fitLine(line, totalWidth)); @@ 
-316,7 +309,7 @@ function buildStyledContent(content: string, theme: TuiTheme, api: StyledTextApi return new api.StyledText(chunks); } -export async function runTui(config: AppConfig, organizationId: string): Promise { +export async function runTui(config: AppConfig, workspaceId: string): Promise { const core = (await import("@opentui/core")) as OpenTuiLike; const createCliRenderer = core.createCliRenderer; const TextRenderable = core.TextRenderable; @@ -338,8 +331,8 @@ export async function runTui(config: AppConfig, organizationId: string): Promise renderer.root.add(text); renderer.start(); - let allRows: TuiTaskRow[] = []; - let filteredRows: TuiTaskRow[] = []; + let allRows: TaskRecord[] = []; + let filteredRows: TaskRecord[] = []; let selected = 0; let searchQuery = ""; let showHelp = false; @@ -366,7 +359,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (closed) { return; } - const output = formatRows(filteredRows, selected, organizationId, status, searchQuery, showHelp, { + const output = formatRows(filteredRows, selected, workspaceId, status, searchQuery, showHelp, { width: renderer.width ?? process.stdout.columns, height: renderer.height ?? 
process.stdout.rows, }); @@ -379,7 +372,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise return; } try { - allRows = await listDetailedTasks(client, organizationId); + allRows = await client.listTasks(workspaceId); if (closed) { return; } @@ -395,7 +388,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); }; - const selectedRow = (): TuiTaskRow | null => { + const selectedRow = (): TaskRecord | null => { if (filteredRows.length === 0) { return null; } @@ -524,7 +517,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); void (async () => { try { - const result = await client.switchTask(organizationId, row.repoId, row.taskId); + const result = await client.switchTask(workspaceId, row.taskId); close(`cd ${result.switchTarget}`); } catch (err) { busy = false; @@ -545,7 +538,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); void (async () => { try { - const result = await client.attachTask(organizationId, row.repoId, row.taskId); + const result = await client.attachTask(workspaceId, row.taskId); close(`target=${result.target} session=${result.sessionId ?? 
"none"}`); } catch (err) { busy = false; @@ -561,11 +554,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (!row) { return; } - void runActionWithRefresh( - `archiving ${row.taskId}`, - async () => client.runAction(organizationId, row.repoId, row.taskId, "archive"), - `archived ${row.taskId}`, - ); + void runActionWithRefresh(`archiving ${row.taskId}`, async () => client.runAction(workspaceId, row.taskId, "archive"), `archived ${row.taskId}`); return; } @@ -574,11 +563,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (!row) { return; } - void runActionWithRefresh( - `syncing ${row.taskId}`, - async () => client.runAction(organizationId, row.repoId, row.taskId, "sync"), - `synced ${row.taskId}`, - ); + void runActionWithRefresh(`syncing ${row.taskId}`, async () => client.runAction(workspaceId, row.taskId, "sync"), `synced ${row.taskId}`); return; } @@ -590,8 +575,8 @@ export async function runTui(config: AppConfig, organizationId: string): Promise void runActionWithRefresh( `merging ${row.taskId}`, async () => { - await client.runAction(organizationId, row.repoId, row.taskId, "merge"); - await client.runAction(organizationId, row.repoId, row.taskId, "archive"); + await client.runAction(workspaceId, row.taskId, "merge"); + await client.runAction(workspaceId, row.taskId, "archive"); }, `merged+archived ${row.taskId}`, ); @@ -600,15 +585,14 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (ctrl && name === "o") { const row = selectedRow(); - const prUrl = row?.pullRequest?.url ?? null; - if (!prUrl) { + if (!row?.prUrl) { status = "no PR URL available for this task"; render(); return; } const openCmd = process.platform === "darwin" ? 
"open" : "xdg-open"; - spawnSync(openCmd, [prUrl], { stdio: "ignore" }); - status = `opened ${prUrl}`; + spawnSync(openCmd, [row.prUrl], { stdio: "ignore" }); + status = `opened ${row.prUrl}`; render(); return; } diff --git a/foundry/packages/cli/src/organization/config.ts b/foundry/packages/cli/src/workspace/config.ts similarity index 71% rename from foundry/packages/cli/src/organization/config.ts rename to foundry/packages/cli/src/workspace/config.ts index cfaebfe..5b05dc4 100644 --- a/foundry/packages/cli/src/organization/config.ts +++ b/foundry/packages/cli/src/workspace/config.ts @@ -2,7 +2,7 @@ import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs"; import { dirname } from "node:path"; import { homedir } from "node:os"; import * as toml from "@iarna/toml"; -import { ConfigSchema, resolveOrganizationId, type AppConfig } from "@sandbox-agent/foundry-shared"; +import { ConfigSchema, resolveWorkspaceId, type AppConfig } from "@sandbox-agent/foundry-shared"; export const CONFIG_PATH = `${homedir()}/.config/foundry/config.toml`; @@ -20,6 +20,6 @@ export function saveConfig(config: AppConfig, path = CONFIG_PATH): void { writeFileSync(path, toml.stringify(config), "utf8"); } -export function resolveOrganization(flagOrganization: string | undefined, config: AppConfig): string { - return resolveOrganizationId(flagOrganization, config); +export function resolveWorkspace(flagWorkspace: string | undefined, config: AppConfig): string { + return resolveWorkspaceId(flagWorkspace, config); } diff --git a/foundry/packages/cli/test/backend-manager.test.ts b/foundry/packages/cli/test/backend-manager.test.ts index a6089c5..ab1892e 100644 --- a/foundry/packages/cli/test/backend-manager.test.ts +++ b/foundry/packages/cli/test/backend-manager.test.ts @@ -37,7 +37,7 @@ function healthyMetadataResponse(): { ok: boolean; json: () => Promise json: async () => ({ runtime: "rivetkit", actorNames: { - organization: {}, + workspace: {}, }, }), }; @@ -58,7 +58,7 @@ 
describe("backend manager", () => { const config: AppConfig = ConfigSchema.parse({ auto_submit: true, notify: ["terminal"], - organization: { default: "default" }, + workspace: { default: "default" }, backend: { host: "127.0.0.1", port: 7741, @@ -68,9 +68,8 @@ describe("backend manager", () => { backup_interval_secs: 3600, backup_retention_days: 7, }, - sandboxProviders: { - local: {}, - e2b: {}, + providers: { + daytona: { image: "ubuntu:24.04" }, }, }); diff --git a/foundry/packages/cli/test/theme.test.ts b/foundry/packages/cli/test/theme.test.ts index 2a0d7e3..6b49c75 100644 --- a/foundry/packages/cli/test/theme.test.ts +++ b/foundry/packages/cli/test/theme.test.ts @@ -21,7 +21,7 @@ describe("resolveTuiTheme", () => { const baseConfig: AppConfig = ConfigSchema.parse({ auto_submit: true, notify: ["terminal"], - organization: { default: "default" }, + workspace: { default: "default" }, backend: { host: "127.0.0.1", port: 7741, @@ -31,9 +31,8 @@ describe("resolveTuiTheme", () => { backup_interval_secs: 3600, backup_retention_days: 7, }, - sandboxProviders: { - local: {}, - e2b: {}, + providers: { + daytona: { image: "ubuntu:24.04" }, }, }); diff --git a/foundry/packages/cli/test/tui-format.test.ts b/foundry/packages/cli/test/tui-format.test.ts index 15d3fe8..e60c839 100644 --- a/foundry/packages/cli/test/tui-format.test.ts +++ b/foundry/packages/cli/test/tui-format.test.ts @@ -3,32 +3,43 @@ import type { TaskRecord } from "@sandbox-agent/foundry-shared"; import { filterTasks, fuzzyMatch } from "@sandbox-agent/foundry-client"; import { formatRows } from "../src/tui.js"; -const sample = { - organizationId: "default", +const sample: TaskRecord = { + workspaceId: "default", repoId: "repo-a", repoRemote: "https://example.com/repo-a.git", taskId: "task-1", branchName: "feature/test", title: "Test Title", task: "Do test", - sandboxProviderId: "local", + providerId: "daytona", status: "running", + statusMessage: null, activeSandboxId: "sandbox-1", - pullRequest: null, + 
activeSessionId: "session-1", sandboxes: [ { sandboxId: "sandbox-1", - sandboxProviderId: "local", - sandboxActorId: null, - switchTarget: "sandbox://local/sandbox-1", + providerId: "daytona", + switchTarget: "daytona://sandbox-1", cwd: null, createdAt: 1, updatedAt: 1, }, ], + agentType: null, + prSubmitted: false, + diffStat: null, + prUrl: null, + prAuthor: null, + ciStatus: null, + reviewStatus: null, + reviewer: null, + conflictsWithMain: null, + hasUnpushed: null, + parentBranch: null, createdAt: 1, updatedAt: 1, -} satisfies TaskRecord & { pullRequest: null; activeSessionId?: null }; +}; describe("formatRows", () => { it("renders rust-style table header and empty state", () => { diff --git a/foundry/packages/cli/test/organization-config.test.ts b/foundry/packages/cli/test/workspace-config.test.ts similarity index 55% rename from foundry/packages/cli/test/organization-config.test.ts rename to foundry/packages/cli/test/workspace-config.test.ts index 5053ec2..1f2e33a 100644 --- a/foundry/packages/cli/test/organization-config.test.ts +++ b/foundry/packages/cli/test/workspace-config.test.ts @@ -1,13 +1,13 @@ import { describe, expect, it } from "vitest"; import { ConfigSchema } from "@sandbox-agent/foundry-shared"; -import { resolveOrganization } from "../src/organization/config.js"; +import { resolveWorkspace } from "../src/workspace/config.js"; -describe("cli organization resolution", () => { - it("uses default organization when no flag", () => { +describe("cli workspace resolution", () => { + it("uses default workspace when no flag", () => { const config = ConfigSchema.parse({ auto_submit: true as const, notify: ["terminal" as const], - organization: { default: "team" }, + workspace: { default: "team" }, backend: { host: "127.0.0.1", port: 7741, @@ -17,13 +17,12 @@ describe("cli organization resolution", () => { backup_interval_secs: 3600, backup_retention_days: 7, }, - sandboxProviders: { - local: {}, - e2b: {}, + providers: { + daytona: { image: 
"ubuntu:24.04" }, }, }); - expect(resolveOrganization(undefined, config)).toBe("team"); - expect(resolveOrganization("alpha", config)).toBe("alpha"); + expect(resolveWorkspace(undefined, config)).toBe("team"); + expect(resolveWorkspace("alpha", config)).toBe("alpha"); }); }); diff --git a/foundry/packages/client/package.json b/foundry/packages/client/package.json index fa73dab..98079d5 100644 --- a/foundry/packages/client/package.json +++ b/foundry/packages/client/package.json @@ -6,12 +6,12 @@ "main": "dist/index.js", "types": "dist/index.d.ts", "scripts": { - "build": "tsup src/index.ts --format esm --dts --tsconfig tsconfig.build.json", + "build": "tsup src/index.ts --format esm --dts", "typecheck": "tsc --noEmit", "test": "vitest run", "test:e2e:full": "HF_ENABLE_DAEMON_FULL_E2E=1 vitest run test/e2e/full-integration-e2e.test.ts", - "test:e2e:workspace": "HF_ENABLE_DAEMON_WORKBENCH_E2E=1 vitest run test/e2e/workspace-e2e.test.ts", - "test:e2e:workspace-load": "HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E=1 vitest run test/e2e/workspace-load-e2e.test.ts" + "test:e2e:workbench": "HF_ENABLE_DAEMON_WORKBENCH_E2E=1 vitest run test/e2e/workbench-e2e.test.ts", + "test:e2e:workbench-load": "HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E=1 vitest run test/e2e/workbench-load-e2e.test.ts" }, "dependencies": { "@sandbox-agent/foundry-shared": "workspace:*", diff --git a/foundry/packages/client/src/app-client.ts b/foundry/packages/client/src/app-client.ts index 0bf5526..1fb95d2 100644 --- a/foundry/packages/client/src/app-client.ts +++ b/foundry/packages/client/src/app-client.ts @@ -4,7 +4,6 @@ import type { FoundryOrganization, FoundryUser, UpdateFoundryOrganizationProfileInput, - WorkspaceModelId, } from "@sandbox-agent/foundry-shared"; import type { BackendClient } from "./backend-client.js"; import { getMockFoundryAppClient } from "./mock-app.js"; @@ -18,7 +17,6 @@ export interface FoundryAppClient { skipStarterRepo(): Promise; starStarterRepo(organizationId: string): Promise; 
selectOrganization(organizationId: string): Promise; - setDefaultModel(model: WorkspaceModelId): Promise; updateOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise; triggerGithubSync(organizationId: string): Promise; completeHostedCheckout(organizationId: string, planId: FoundryBillingPlanId): Promise; @@ -26,7 +24,7 @@ export interface FoundryAppClient { cancelScheduledRenewal(organizationId: string): Promise; resumeSubscription(organizationId: string): Promise; reconnectGithub(organizationId: string): Promise; - recordSeatUsage(organizationId: string): Promise; + recordSeatUsage(workspaceId: string): Promise; } export interface CreateFoundryAppClientOptions { diff --git a/foundry/packages/client/src/backend-client.ts b/foundry/packages/client/src/backend-client.ts index c2222cc..05047bb 100644 --- a/foundry/packages/client/src/backend-client.ts +++ b/foundry/packages/client/src/backend-client.ts @@ -1,50 +1,49 @@ import { createClient } from "rivetkit/client"; import type { AgentType, + AddRepoInput, AppConfig, FoundryAppSnapshot, FoundryBillingPlanId, CreateTaskInput, AppEvent, SessionEvent, - SandboxProcessSnapshot, SandboxProcessesEvent, TaskRecord, TaskSummary, - TaskWorkspaceChangeModelInput, - TaskWorkspaceChangeOwnerInput, - TaskWorkspaceCreateTaskInput, - TaskWorkspaceCreateTaskResponse, - TaskWorkspaceDiffInput, - TaskWorkspaceRenameInput, - TaskWorkspaceRenameSessionInput, - TaskWorkspaceSelectInput, - TaskWorkspaceSetSessionUnreadInput, - TaskWorkspaceSendMessageInput, - TaskWorkspaceSnapshot, - TaskWorkspaceSessionInput, - TaskWorkspaceUpdateDraftInput, + TaskWorkbenchChangeModelInput, + TaskWorkbenchCreateTaskInput, + TaskWorkbenchCreateTaskResponse, + TaskWorkbenchDiffInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + TaskWorkbenchSelectInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchSnapshot, + TaskWorkbenchTabInput, + TaskWorkbenchUpdateDraftInput, TaskEvent, 
- WorkspaceTaskDetail, - WorkspaceTaskSummary, - WorkspaceSessionDetail, - OrganizationEvent, - OrganizationSummarySnapshot, - AuditLogEvent as HistoryEvent, + WorkbenchTaskDetail, + WorkbenchTaskSummary, + WorkbenchSessionDetail, + WorkspaceEvent, + WorkspaceSummarySnapshot, + HistoryEvent, HistoryQueryInput, - SandboxProviderId, + ProviderId, RepoOverview, + RepoStackActionInput, + RepoStackActionResult, RepoRecord, StarSandboxAgentRepoInput, StarSandboxAgentRepoResult, SwitchResult, UpdateFoundryOrganizationProfileInput, - WorkspaceModelGroup, - WorkspaceModelId, } from "@sandbox-agent/foundry-shared"; -import type { ProcessCreateRequest, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent"; +import type { ProcessCreateRequest, ProcessInfo, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent"; import { createMockBackendClient } from "./mock/backend-client.js"; -import { taskKey, taskSandboxKey, organizationKey } from "./keys.js"; +import { sandboxInstanceKey, taskKey, workspaceKey } from "./keys.js"; export type TaskAction = "push" | "sync" | "merge" | "archive" | "kill"; @@ -55,7 +54,7 @@ export interface SandboxSessionRecord { lastConnectionId: string; createdAt: number; destroyedAt?: number; - status?: "pending_provision" | "pending_session_create" | "ready" | "running" | "idle" | "error"; + status?: "running" | "idle" | "error"; } export interface SandboxSessionEventRecord { @@ -68,7 +67,7 @@ export interface SandboxSessionEventRecord { payload: unknown; } -export type SandboxProcessRecord = SandboxProcessSnapshot; +export type SandboxProcessRecord = ProcessInfo; export interface ActorConn { on(event: string, listener: (payload: any) => void): () => void; @@ -76,54 +75,51 @@ export interface ActorConn { dispose(): Promise; } -interface AuthSessionScopedInput { - authSessionId?: string; -} - -interface OrganizationHandle { +interface WorkspaceHandle { connect(): ActorConn; - listRepos(input: { 
organizationId: string }): Promise; + addRepo(input: AddRepoInput): Promise; + listRepos(input: { workspaceId: string }): Promise; createTask(input: CreateTaskInput): Promise; - listTasks(input: { organizationId: string; repoId?: string }): Promise; - getRepoOverview(input: { organizationId: string; repoId: string }): Promise; - auditLog(input: HistoryQueryInput): Promise; - switchTask(input: { repoId: string; taskId: string }): Promise; - getTask(input: { organizationId: string; repoId: string; taskId: string }): Promise; - attachTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise<{ target: string; sessionId: string | null }>; - pushTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; - syncTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; - mergeTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; - archiveTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; - killTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; - useOrganization(input: { organizationId: string }): Promise<{ organizationId: string }>; + listTasks(input: { workspaceId: string; repoId?: string }): Promise; + getRepoOverview(input: { workspaceId: string; repoId: string }): Promise; + runRepoStackAction(input: RepoStackActionInput): Promise; + history(input: HistoryQueryInput): Promise; + switchTask(taskId: string): Promise; + getTask(input: { workspaceId: string; taskId: string }): Promise; + attachTask(input: { workspaceId: string; taskId: string; reason?: string }): Promise<{ target: string; sessionId: string | null }>; + pushTask(input: { workspaceId: string; taskId: string; reason?: string }): Promise; + syncTask(input: { workspaceId: string; taskId: string; reason?: string }): Promise; + mergeTask(input: { 
workspaceId: string; taskId: string; reason?: string }): Promise; + archiveTask(input: { workspaceId: string; taskId: string; reason?: string }): Promise; + killTask(input: { workspaceId: string; taskId: string; reason?: string }): Promise; + useWorkspace(input: { workspaceId: string }): Promise<{ workspaceId: string }>; starSandboxAgentRepo(input: StarSandboxAgentRepoInput): Promise; - getOrganizationSummary(input: { organizationId: string }): Promise; - createWorkspaceTask(input: TaskWorkspaceCreateTaskInput & AuthSessionScopedInput): Promise; - markWorkspaceUnread(input: TaskWorkspaceSelectInput & AuthSessionScopedInput): Promise; - renameWorkspaceTask(input: TaskWorkspaceRenameInput & AuthSessionScopedInput): Promise; - createWorkspaceSession(input: TaskWorkspaceSelectInput & { model?: string } & AuthSessionScopedInput): Promise<{ sessionId: string }>; - renameWorkspaceSession(input: TaskWorkspaceRenameSessionInput & AuthSessionScopedInput): Promise; - selectWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; - setWorkspaceSessionUnread(input: TaskWorkspaceSetSessionUnreadInput & AuthSessionScopedInput): Promise; - updateWorkspaceDraft(input: TaskWorkspaceUpdateDraftInput & AuthSessionScopedInput): Promise; - changeWorkspaceModel(input: TaskWorkspaceChangeModelInput & AuthSessionScopedInput): Promise; - sendWorkspaceMessage(input: TaskWorkspaceSendMessageInput & AuthSessionScopedInput): Promise; - stopWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; - closeWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; - publishWorkspacePr(input: TaskWorkspaceSelectInput & AuthSessionScopedInput): Promise; - changeWorkspaceTaskOwner(input: TaskWorkspaceChangeOwnerInput & AuthSessionScopedInput): Promise; - revertWorkspaceFile(input: TaskWorkspaceDiffInput & AuthSessionScopedInput): Promise; - adminReloadGithubOrganization(): Promise; - adminReloadGithubRepository(input: 
{ repoId: string }): Promise; + getWorkspaceSummary(input: { workspaceId: string }): Promise; + applyTaskSummaryUpdate(input: { taskSummary: WorkbenchTaskSummary }): Promise; + removeTaskSummary(input: { taskId: string }): Promise; + reconcileWorkbenchState(input: { workspaceId: string }): Promise; + createWorkbenchTask(input: TaskWorkbenchCreateTaskInput): Promise; + markWorkbenchUnread(input: TaskWorkbenchSelectInput): Promise; + renameWorkbenchTask(input: TaskWorkbenchRenameInput): Promise; + renameWorkbenchBranch(input: TaskWorkbenchRenameInput): Promise; + createWorkbenchSession(input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ tabId: string }>; + renameWorkbenchSession(input: TaskWorkbenchRenameSessionInput): Promise; + setWorkbenchSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise; + updateWorkbenchDraft(input: TaskWorkbenchUpdateDraftInput): Promise; + changeWorkbenchModel(input: TaskWorkbenchChangeModelInput): Promise; + sendWorkbenchMessage(input: TaskWorkbenchSendMessageInput): Promise; + stopWorkbenchSession(input: TaskWorkbenchTabInput): Promise; + closeWorkbenchSession(input: TaskWorkbenchTabInput): Promise; + publishWorkbenchPr(input: TaskWorkbenchSelectInput): Promise; + revertWorkbenchFile(input: TaskWorkbenchDiffInput): Promise; } -interface AppOrganizationHandle { +interface AppWorkspaceHandle { connect(): ActorConn; getAppSnapshot(input: { sessionId: string }): Promise; skipAppStarterRepo(input: { sessionId: string }): Promise; starAppStarterRepo(input: { sessionId: string; organizationId: string }): Promise; selectAppOrganization(input: { sessionId: string; organizationId: string }): Promise; - setAppDefaultModel(input: { sessionId: string; defaultModel: WorkspaceModelId }): Promise; updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput & { sessionId: string }): Promise; triggerAppRepoImport(input: { sessionId: string; organizationId: string }): Promise; beginAppGithubInstall(input: { sessionId: 
string; organizationId: string }): Promise<{ url: string }>; @@ -131,75 +127,67 @@ interface AppOrganizationHandle { createAppBillingPortalSession(input: { sessionId: string; organizationId: string }): Promise<{ url: string }>; cancelAppScheduledRenewal(input: { sessionId: string; organizationId: string }): Promise; resumeAppSubscription(input: { sessionId: string; organizationId: string }): Promise; - recordAppSeatUsage(input: { sessionId: string; organizationId: string }): Promise; + recordAppSeatUsage(input: { sessionId: string; workspaceId: string }): Promise; } interface TaskHandle { - getTaskSummary(): Promise; - getTaskDetail(input?: AuthSessionScopedInput): Promise; - getSessionDetail(input: { sessionId: string } & AuthSessionScopedInput): Promise; + getTaskSummary(): Promise; + getTaskDetail(): Promise; + getSessionDetail(input: { sessionId: string }): Promise; connect(): ActorConn; } -interface TaskSandboxHandle { +interface SandboxInstanceHandle { connect(): ActorConn; createSession(input: { - id?: string; - agent: string; - model?: string; - sessionInit?: { - cwd?: string; - }; - }): Promise<{ id: string }>; + prompt: string; + cwd?: string; + agent?: AgentType | "opencode"; + }): Promise<{ id: string | null; status: "running" | "idle" | "error"; error?: string }>; listSessions(input?: { cursor?: string; limit?: number }): Promise<{ items: SandboxSessionRecord[]; nextCursor?: string }>; - getEvents(input: { sessionId: string; cursor?: string; limit?: number }): Promise<{ items: SandboxSessionEventRecord[]; nextCursor?: string }>; + listSessionEvents(input: { sessionId: string; cursor?: string; limit?: number }): Promise<{ items: SandboxSessionEventRecord[]; nextCursor?: string }>; createProcess(input: ProcessCreateRequest): Promise; listProcesses(): Promise<{ processes: SandboxProcessRecord[] }>; - getProcessLogs(processId: string, query?: ProcessLogFollowQuery): Promise; - stopProcess(processId: string, query?: ProcessSignalQuery): Promise; - 
killProcess(processId: string, query?: ProcessSignalQuery): Promise; - deleteProcess(processId: string): Promise; - rawSendSessionMethod(sessionId: string, method: string, params: Record): Promise; - destroySession(sessionId: string): Promise; + getProcessLogs(input: { processId: string; query?: ProcessLogFollowQuery }): Promise; + stopProcess(input: { processId: string; query?: ProcessSignalQuery }): Promise; + killProcess(input: { processId: string; query?: ProcessSignalQuery }): Promise; + deleteProcess(input: { processId: string }): Promise; + sendPrompt(input: { sessionId: string; prompt: string; notification?: boolean }): Promise; + sessionStatus(input: { sessionId: string }): Promise<{ id: string; status: "running" | "idle" | "error" }>; sandboxAgentConnection(): Promise<{ endpoint: string; token?: string }>; - listWorkspaceModelGroups(): Promise; - providerState(): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }>; + providerState(): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }>; } interface RivetClient { - organization: { - getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): OrganizationHandle; + workspace: { + getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): WorkspaceHandle; }; task: { get(key?: string | string[]): TaskHandle; getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): TaskHandle; }; - taskSandbox: { - get(key?: string | string[]): TaskSandboxHandle; - getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): TaskSandboxHandle; - getForId(actorId: string): TaskSandboxHandle; + sandboxInstance: { + getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): SandboxInstanceHandle; }; } export interface BackendClientOptions { endpoint: string; - defaultOrganizationId?: string; + defaultWorkspaceId?: string; mode?: "remote" | "mock"; - encoding?: "json" | "cbor" 
| "bare"; } export interface BackendClient { getAppSnapshot(): Promise; - connectOrganization(organizationId: string): Promise; - connectTask(organizationId: string, repoId: string, taskId: string): Promise; - connectSandbox(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise; + connectWorkspace(workspaceId: string): Promise; + connectTask(workspaceId: string, repoId: string, taskId: string): Promise; + connectSandbox(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise; subscribeApp(listener: () => void): () => void; signInWithGithub(): Promise; signOutApp(): Promise; skipAppStarterRepo(): Promise; starAppStarterRepo(organizationId: string): Promise; selectAppOrganization(organizationId: string): Promise; - setAppDefaultModel(defaultModel: WorkspaceModelId): Promise; updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise; triggerAppRepoImport(organizationId: string): Promise; reconnectAppGithub(organizationId: string): Promise; @@ -207,112 +195,105 @@ export interface BackendClient { openAppBillingPortal(organizationId: string): Promise; cancelAppScheduledRenewal(organizationId: string): Promise; resumeAppSubscription(organizationId: string): Promise; - recordAppSeatUsage(organizationId: string): Promise; - listRepos(organizationId: string): Promise; + recordAppSeatUsage(workspaceId: string): Promise; + addRepo(workspaceId: string, remoteUrl: string): Promise; + listRepos(workspaceId: string): Promise; createTask(input: CreateTaskInput): Promise; - listTasks(organizationId: string, repoId?: string): Promise; - getRepoOverview(organizationId: string, repoId: string): Promise; - getTask(organizationId: string, repoId: string, taskId: string): Promise; + listTasks(workspaceId: string, repoId?: string): Promise; + getRepoOverview(workspaceId: string, repoId: string): Promise; + runRepoStackAction(input: RepoStackActionInput): Promise; + getTask(workspaceId: string, taskId: string): 
Promise; listHistory(input: HistoryQueryInput): Promise; - switchTask(organizationId: string, repoId: string, taskId: string): Promise; - attachTask(organizationId: string, repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }>; - runAction(organizationId: string, repoId: string, taskId: string, action: TaskAction): Promise; + switchTask(workspaceId: string, taskId: string): Promise; + attachTask(workspaceId: string, taskId: string): Promise<{ target: string; sessionId: string | null }>; + runAction(workspaceId: string, taskId: string, action: TaskAction): Promise; createSandboxSession(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; + workspaceId: string; + providerId: ProviderId; sandboxId: string; prompt: string; cwd?: string; agent?: AgentType | "opencode"; }): Promise<{ id: string; status: "running" | "idle" | "error" }>; listSandboxSessions( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, input?: { cursor?: string; limit?: number }, ): Promise<{ items: SandboxSessionRecord[]; nextCursor?: string }>; listSandboxSessionEvents( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, input: { sessionId: string; cursor?: string; limit?: number }, ): Promise<{ items: SandboxSessionEventRecord[]; nextCursor?: string }>; - createSandboxProcess(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; - sandboxId: string; - request: ProcessCreateRequest; - }): Promise; - listSandboxProcesses(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise<{ processes: SandboxProcessRecord[] }>; + createSandboxProcess(input: { workspaceId: string; providerId: ProviderId; sandboxId: string; request: ProcessCreateRequest }): Promise; + listSandboxProcesses(workspaceId: string, providerId: ProviderId, sandboxId: 
string): Promise<{ processes: SandboxProcessRecord[] }>; getSandboxProcessLogs( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessLogFollowQuery, ): Promise; stopSandboxProcess( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessSignalQuery, ): Promise; killSandboxProcess( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessSignalQuery, ): Promise; - deleteSandboxProcess(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string, processId: string): Promise; - subscribeSandboxProcesses(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string, listener: () => void): () => void; + deleteSandboxProcess(workspaceId: string, providerId: ProviderId, sandboxId: string, processId: string): Promise; + subscribeSandboxProcesses(workspaceId: string, providerId: ProviderId, sandboxId: string, listener: () => void): () => void; sendSandboxPrompt(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; + workspaceId: string; + providerId: ProviderId; sandboxId: string; sessionId: string; prompt: string; notification?: boolean; }): Promise; sandboxSessionStatus( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, sessionId: string, ): Promise<{ id: string; status: "running" | "idle" | "error" }>; sandboxProviderState( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, - ): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }>; - getSandboxAgentConnection(organizationId: 
string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise<{ endpoint: string; token?: string }>; - getSandboxWorkspaceModelGroups(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise; - getOrganizationSummary(organizationId: string): Promise; - getTaskDetail(organizationId: string, repoId: string, taskId: string): Promise; - getSessionDetail(organizationId: string, repoId: string, taskId: string, sessionId: string): Promise; - getWorkspace(organizationId: string): Promise; - subscribeWorkspace(organizationId: string, listener: () => void): () => void; - createWorkspaceTask(organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise; - markWorkspaceUnread(organizationId: string, input: TaskWorkspaceSelectInput): Promise; - renameWorkspaceTask(organizationId: string, input: TaskWorkspaceRenameInput): Promise; - createWorkspaceSession(organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }>; - renameWorkspaceSession(organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise; - selectWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise; - setWorkspaceSessionUnread(organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise; - updateWorkspaceDraft(organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise; - changeWorkspaceModel(organizationId: string, input: TaskWorkspaceChangeModelInput): Promise; - sendWorkspaceMessage(organizationId: string, input: TaskWorkspaceSendMessageInput): Promise; - stopWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise; - closeWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise; - publishWorkspacePr(organizationId: string, input: TaskWorkspaceSelectInput): Promise; - changeWorkspaceTaskOwner(organizationId: string, input: TaskWorkspaceChangeOwnerInput): Promise; - 
revertWorkspaceFile(organizationId: string, input: TaskWorkspaceDiffInput): Promise; - adminReloadGithubOrganization(organizationId: string): Promise; - adminReloadGithubRepository(organizationId: string, repoId: string): Promise; + ): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }>; + getSandboxAgentConnection(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise<{ endpoint: string; token?: string }>; + getWorkspaceSummary(workspaceId: string): Promise; + getTaskDetail(workspaceId: string, repoId: string, taskId: string): Promise; + getSessionDetail(workspaceId: string, repoId: string, taskId: string, sessionId: string): Promise; + getWorkbench(workspaceId: string): Promise; + subscribeWorkbench(workspaceId: string, listener: () => void): () => void; + createWorkbenchTask(workspaceId: string, input: TaskWorkbenchCreateTaskInput): Promise; + markWorkbenchUnread(workspaceId: string, input: TaskWorkbenchSelectInput): Promise; + renameWorkbenchTask(workspaceId: string, input: TaskWorkbenchRenameInput): Promise; + renameWorkbenchBranch(workspaceId: string, input: TaskWorkbenchRenameInput): Promise; + createWorkbenchSession(workspaceId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ tabId: string }>; + renameWorkbenchSession(workspaceId: string, input: TaskWorkbenchRenameSessionInput): Promise; + setWorkbenchSessionUnread(workspaceId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise; + updateWorkbenchDraft(workspaceId: string, input: TaskWorkbenchUpdateDraftInput): Promise; + changeWorkbenchModel(workspaceId: string, input: TaskWorkbenchChangeModelInput): Promise; + sendWorkbenchMessage(workspaceId: string, input: TaskWorkbenchSendMessageInput): Promise; + stopWorkbenchSession(workspaceId: string, input: TaskWorkbenchTabInput): Promise; + closeWorkbenchSession(workspaceId: string, input: TaskWorkbenchTabInput): Promise; + publishWorkbenchPr(workspaceId: string, input: 
TaskWorkbenchSelectInput): Promise; + revertWorkbenchFile(workspaceId: string, input: TaskWorkbenchDiffInput): Promise; health(): Promise<{ ok: true }>; - useOrganization(organizationId: string): Promise<{ organizationId: string }>; - starSandboxAgentRepo(organizationId: string): Promise; + useWorkspace(workspaceId: string): Promise<{ workspaceId: string }>; + starSandboxAgentRepo(workspaceId: string): Promise; } export function rivetEndpoint(config: AppConfig): string { @@ -322,49 +303,10 @@ export function rivetEndpoint(config: AppConfig): string { export function createBackendClientFromConfig(config: AppConfig): BackendClient { return createBackendClient({ endpoint: rivetEndpoint(config), - defaultOrganizationId: config.organization.default, + defaultWorkspaceId: config.workspace.default, }); } -export interface BackendHealthCheckOptions { - endpoint: string; - timeoutMs?: number; -} - -export interface BackendMetadata { - clientEndpoint: string; - appEndpoint: string; - rivetEndpoint: string; -} - -export async function checkBackendHealth(options: BackendHealthCheckOptions): Promise { - const controller = new AbortController(); - const timeout = setTimeout(() => controller.abort(), options.timeoutMs ?? 
1_500); - - try { - const response = await fetch(normalizeLegacyBackendEndpoint(options.endpoint), { - method: "GET", - signal: controller.signal, - }); - return response.status < 500; - } catch { - return false; - } finally { - clearTimeout(timeout); - } -} - -export async function readBackendMetadata(options: BackendHealthCheckOptions): Promise { - const endpoints = deriveBackendEndpoints(options.endpoint); - const clientEndpoint = endpoints.rivetEndpoint.replace(/\/v1\/rivet\/?$/, ""); - - return { - clientEndpoint, - appEndpoint: endpoints.appEndpoint, - rivetEndpoint: endpoints.rivetEndpoint, - }; -} - function stripTrailingSlash(value: string): string { return value.replace(/\/$/, ""); } @@ -411,14 +353,14 @@ function signedOutAppSnapshot(): FoundryAppSnapshot { export function createBackendClient(options: BackendClientOptions): BackendClient { if (options.mode === "mock") { - return createMockBackendClient(options.defaultOrganizationId); + return createMockBackendClient(options.defaultWorkspaceId); } const endpoints = deriveBackendEndpoints(options.endpoint); const rivetApiEndpoint = endpoints.rivetEndpoint; const appApiEndpoint = endpoints.appEndpoint; - const client = createClient({ endpoint: rivetApiEndpoint, encoding: options.encoding }) as unknown as RivetClient; - const workspaceSubscriptions = new Map< + const client = createClient({ endpoint: rivetApiEndpoint }) as unknown as RivetClient; + const workbenchSubscriptions = new Map< string, { listeners: Set<() => void>; @@ -469,38 +411,20 @@ export function createBackendClient(options: BackendClientOptions): BackendClien return typeof sessionId === "string" && sessionId.length > 0 ? sessionId : null; }; - const getAuthSessionInput = async (): Promise => { - const authSessionId = await getSessionId(); - return authSessionId ? 
{ authSessionId } : undefined; - }; - - const withAuthSessionInput = async (input: TInput): Promise => { - const authSessionInput = await getAuthSessionInput(); - return authSessionInput ? { ...input, ...authSessionInput } : input; - }; - - const organization = async (organizationId: string): Promise => - client.organization.getOrCreate(organizationKey(organizationId), { - createWithInput: organizationId, + const workspace = async (workspaceId: string): Promise => + client.workspace.getOrCreate(workspaceKey(workspaceId), { + createWithInput: workspaceId, }); - const appOrganization = async (): Promise => - client.organization.getOrCreate(organizationKey("app"), { + const appWorkspace = async (): Promise => + client.workspace.getOrCreate(workspaceKey("app"), { createWithInput: "app", - }) as unknown as AppOrganizationHandle; + }) as unknown as AppWorkspaceHandle; - // getOrCreate is intentional here — this is the ONLY lazy creation point for - // virtual tasks (PR-driven entries that exist in the org's local tables but - // have no task actor yet). The task actor self-initializes from org data in - // getCurrentRecord(). Backend code must NEVER use getOrCreateTask except in - // createTaskMutation. See backend/CLAUDE.md "Lazy Task Actor Creation". 
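The hunk above moves task lookup from `getOrCreate` to plain `get`, so a keyed lookup can now fail with an actor-not-found error, and the surrounding code recovers by re-resolving the handle through task details. A minimal sketch of that "try keyed lookup, rethrow anything else, retry once via a fallback resolver" shape — all types and lookups here are illustrative stand-ins, not the real Rivet client API:

```typescript
// Illustrative handle type; the real code uses SandboxInstanceHandle.
interface SandboxHandle {
  listSessions(): Promise<string[]>;
}

// Matches the error-classification style used in the diff: only errors whose
// message marks a missing actor trigger the fallback path.
function isActorNotFoundError(error: unknown): boolean {
  const message = error instanceof Error ? error.message : String(error);
  return message.includes("Actor not found");
}

// Run an operation against the keyed handle; if (and only if) the actor is
// missing, resolve a fallback handle and retry the same operation once.
async function withSandboxHandle<T>(
  byKey: () => Promise<SandboxHandle>,
  byActorId: () => Promise<SandboxHandle | null>,
  run: (handle: SandboxHandle) => Promise<T>,
): Promise<T> {
  const handle = await byKey();
  try {
    return await run(handle);
  } catch (error) {
    // Any error other than actor-not-found propagates unchanged.
    if (!isActorNotFoundError(error)) throw error;
    const fallback = await byActorId();
    if (!fallback) throw error;
    return await run(fallback);
  }
}
```

The point of the shape is that the fallback resolver is only consulted on the one recoverable failure mode; network or logic errors surface immediately.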
- const task = async (organizationId: string, repoId: string, taskId: string): Promise => - client.task.getOrCreate(taskKey(organizationId, repoId, taskId), { - createWithInput: { organizationId, repoId, taskId }, - }); + const task = async (workspaceId: string, repoId: string, taskId: string): Promise => client.task.get(taskKey(workspaceId, repoId, taskId)); - const sandboxByKey = async (organizationId: string, _providerId: SandboxProviderId, sandboxId: string): Promise => { - return (client as any).taskSandbox.get(taskSandboxKey(organizationId, sandboxId)); + const sandboxByKey = async (workspaceId: string, providerId: ProviderId, sandboxId: string): Promise => { + return (client as any).sandboxInstance.get(sandboxInstanceKey(workspaceId, providerId, sandboxId)); }; function isActorNotFoundError(error: unknown): boolean { @@ -508,28 +432,26 @@ export function createBackendClient(options: BackendClientOptions): BackendClien return message.includes("Actor not found"); } - const sandboxByActorIdFromTask = async ( - organizationId: string, - sandboxProviderId: SandboxProviderId, - sandboxId: string, - ): Promise => { - const ws = await organization(organizationId); - const rows = await ws.listTasks({ organizationId }); + const sandboxByActorIdFromTask = async (workspaceId: string, providerId: ProviderId, sandboxId: string): Promise => { + const ws = await workspace(workspaceId); + const rows = await ws.listTasks({ workspaceId }); const candidates = [...rows].sort((a, b) => b.updatedAt - a.updatedAt); for (const row of candidates) { try { - const detail = await ws.getTask({ organizationId, repoId: row.repoId, taskId: row.taskId }); - if (detail.sandboxProviderId !== sandboxProviderId) { + const detail = await ws.getTask({ workspaceId, taskId: row.taskId }); + if (detail.providerId !== providerId) { continue; } - const sandboxes = detail.sandboxes as Array<(typeof detail.sandboxes)[number] & { sandboxActorId?: string }>; - const sandbox = sandboxes.find( + const 
sandbox = detail.sandboxes.find( (sb) => - sb.sandboxId === sandboxId && sb.sandboxProviderId === sandboxProviderId && typeof sb.sandboxActorId === "string" && sb.sandboxActorId.length > 0, - ); + sb.sandboxId === sandboxId && + sb.providerId === providerId && + typeof (sb as any).sandboxActorId === "string" && + (sb as any).sandboxActorId.length > 0, + ) as { sandboxActorId?: string } | undefined; if (sandbox?.sandboxActorId) { - return (client as any).taskSandbox.getForId(sandbox.sandboxActorId); + return (client as any).sandboxInstance.getForId(sandbox.sandboxActorId); } } catch (error) { const message = error instanceof Error ? error.message : String(error); @@ -544,19 +466,19 @@ export function createBackendClient(options: BackendClientOptions): BackendClien }; const withSandboxHandle = async ( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, - run: (handle: TaskSandboxHandle) => Promise, + run: (handle: SandboxInstanceHandle) => Promise, ): Promise => { - const handle = await sandboxByKey(organizationId, sandboxProviderId, sandboxId); + const handle = await sandboxByKey(workspaceId, providerId, sandboxId); try { return await run(handle); } catch (error) { if (!isActorNotFoundError(error)) { throw error; } - const fallback = await sandboxByActorIdFromTask(organizationId, sandboxProviderId, sandboxId); + const fallback = await sandboxByActorIdFromTask(workspaceId, providerId, sandboxId); if (!fallback) { throw error; } @@ -564,22 +486,22 @@ export function createBackendClient(options: BackendClientOptions): BackendClien } }; - const connectOrganization = async (organizationId: string): Promise => { - return (await organization(organizationId)).connect() as ActorConn; + const connectWorkspace = async (workspaceId: string): Promise => { + return (await workspace(workspaceId)).connect() as ActorConn; }; - const connectTask = async (organizationId: string, repoId: string, 
taskIdValue: string): Promise => { - return (await task(organizationId, repoId, taskIdValue)).connect() as ActorConn; + const connectTask = async (workspaceId: string, repoId: string, taskIdValue: string): Promise => { + return (await task(workspaceId, repoId, taskIdValue)).connect() as ActorConn; }; - const connectSandbox = async (organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise => { + const connectSandbox = async (workspaceId: string, providerId: ProviderId, sandboxId: string): Promise => { try { - return (await sandboxByKey(organizationId, sandboxProviderId, sandboxId)).connect() as ActorConn; + return (await sandboxByKey(workspaceId, providerId, sandboxId)).connect() as ActorConn; } catch (error) { if (!isActorNotFoundError(error)) { throw error; } - const fallback = await sandboxByActorIdFromTask(organizationId, sandboxProviderId, sandboxId); + const fallback = await sandboxByActorIdFromTask(workspaceId, providerId, sandboxId); if (!fallback) { throw error; } @@ -587,46 +509,18 @@ export function createBackendClient(options: BackendClientOptions): BackendClien } }; - const getTaskDetailWithAuth = async (organizationId: string, repoId: string, taskIdValue: string): Promise => { - return (await task(organizationId, repoId, taskIdValue)).getTaskDetail(await getAuthSessionInput()); - }; - - const getSessionDetailWithAuth = async (organizationId: string, repoId: string, taskIdValue: string, sessionId: string): Promise => { - return (await task(organizationId, repoId, taskIdValue)).getSessionDetail(await withAuthSessionInput({ sessionId })); - }; - - const getWorkspaceCompat = async (organizationId: string): Promise => { - const authSessionInput = await getAuthSessionInput(); - const summary = await (await organization(organizationId)).getOrganizationSummary({ organizationId }); - const resolvedTasks = await Promise.all( + const getWorkbenchCompat = async (workspaceId: string): Promise => { + const summary = await (await 
workspace(workspaceId)).getWorkspaceSummary({ workspaceId }); + const tasks = await Promise.all( summary.taskSummaries.map(async (taskSummary) => { - let detail; - try { - const taskHandle = await task(organizationId, taskSummary.repoId, taskSummary.id); - detail = await taskHandle.getTaskDetail(authSessionInput); - } catch (error) { - if (isActorNotFoundError(error)) { - return null; - } - throw error; - } + const detail = await (await task(workspaceId, taskSummary.repoId, taskSummary.id)).getTaskDetail(); const sessionDetails = await Promise.all( detail.sessionsSummary.map(async (session) => { - try { - const full = await (await task(organizationId, detail.repoId, detail.id)).getSessionDetail({ - sessionId: session.id, - ...(authSessionInput ?? {}), - }); - return [session.id, full] as const; - } catch (error) { - if (isActorNotFoundError(error)) { - return null; - } - throw error; - } + const full = await (await task(workspaceId, detail.repoId, detail.id)).getSessionDetail({ sessionId: session.id }); + return [session.id, full] as const; }), ); - const sessionDetailsById = new Map(sessionDetails.filter((entry): entry is readonly [string, WorkspaceSessionDetail] => entry !== null)); + const sessionDetailsById = new Map(sessionDetails); return { id: detail.id, repoId: detail.repoId, @@ -636,8 +530,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien updatedAtMs: detail.updatedAtMs, branch: detail.branch, pullRequest: detail.pullRequest, - activeSessionId: detail.activeSessionId ?? null, - sessions: detail.sessionsSummary.map((session) => { + tabs: detail.sessionsSummary.map((session) => { const full = sessionDetailsById.get(session.id); return { id: session.id, @@ -657,13 +550,11 @@ export function createBackendClient(options: BackendClientOptions): BackendClien diffs: detail.diffs, fileTree: detail.fileTree, minutesUsed: detail.minutesUsed, - activeSandboxId: detail.activeSandboxId ?? 
null, }; }), ); - const tasks = resolvedTasks.filter((task): task is Exclude<(typeof resolvedTasks)[number], null> => task !== null); - const repositories = summary.repos + const projects = summary.repos .map((repo) => ({ id: repo.id, label: repo.label, @@ -673,31 +564,31 @@ export function createBackendClient(options: BackendClientOptions): BackendClien .filter((repo) => repo.tasks.length > 0); return { - organizationId, + workspaceId, repos: summary.repos.map((repo) => ({ id: repo.id, label: repo.label })), - repositories, + projects, tasks: tasks.sort((left, right) => right.updatedAtMs - left.updatedAtMs), }; }; - const subscribeWorkspace = (organizationId: string, listener: () => void): (() => void) => { - let entry = workspaceSubscriptions.get(organizationId); + const subscribeWorkbench = (workspaceId: string, listener: () => void): (() => void) => { + let entry = workbenchSubscriptions.get(workspaceId); if (!entry) { entry = { listeners: new Set(), disposeConnPromise: null, }; - workspaceSubscriptions.set(organizationId, entry); + workbenchSubscriptions.set(workspaceId, entry); } entry.listeners.add(listener); if (!entry.disposeConnPromise) { entry.disposeConnPromise = (async () => { - const handle = await organization(organizationId); + const handle = await workspace(workspaceId); const conn = (handle as any).connect(); - const unsubscribeEvent = conn.on("organizationUpdated", () => { - const current = workspaceSubscriptions.get(organizationId); + const unsubscribeEvent = conn.on("workbenchUpdated", () => { + const current = workbenchSubscriptions.get(workspaceId); if (!current) { return; } @@ -715,7 +606,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien } return () => { - const current = workspaceSubscriptions.get(organizationId); + const current = workbenchSubscriptions.get(workspaceId); if (!current) { return; } @@ -724,18 +615,17 @@ export function createBackendClient(options: BackendClientOptions): BackendClien 
return; } - workspaceSubscriptions.delete(organizationId); + workbenchSubscriptions.delete(workspaceId); void current.disposeConnPromise?.then(async (disposeConn) => { await disposeConn?.(); }); }; }; - const sandboxProcessSubscriptionKey = (organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): string => - `${organizationId}:${sandboxProviderId}:${sandboxId}`; + const sandboxProcessSubscriptionKey = (workspaceId: string, providerId: ProviderId, sandboxId: string): string => `${workspaceId}:${providerId}:${sandboxId}`; - const subscribeSandboxProcesses = (organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string, listener: () => void): (() => void) => { - const key = sandboxProcessSubscriptionKey(organizationId, sandboxProviderId, sandboxId); + const subscribeSandboxProcesses = (workspaceId: string, providerId: ProviderId, sandboxId: string, listener: () => void): (() => void) => { + const key = sandboxProcessSubscriptionKey(workspaceId, providerId, sandboxId); let entry = sandboxProcessSubscriptions.get(key); if (!entry) { entry = { @@ -749,7 +639,8 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!entry.disposeConnPromise) { entry.disposeConnPromise = (async () => { - const conn = await connectSandbox(organizationId, sandboxProviderId, sandboxId); + const handle = await sandboxByKey(workspaceId, providerId, sandboxId); + const conn = (handle as any).connect(); const unsubscribeEvent = conn.on("processesUpdated", () => { const current = sandboxProcessSubscriptions.get(key); if (!current) { @@ -790,7 +681,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!appSubscriptions.disposeConnPromise) { appSubscriptions.disposeConnPromise = (async () => { - const handle = await appOrganization(); + const handle = await appWorkspace(); const conn = (handle as any).connect(); const unsubscribeEvent = conn.on("appUpdated", () => { for (const 
currentListener of [...appSubscriptions.listeners]) { @@ -825,19 +716,19 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { return signedOutAppSnapshot(); } - return await (await appOrganization()).getAppSnapshot({ sessionId }); + return await (await appWorkspace()).getAppSnapshot({ sessionId }); }, - async connectOrganization(organizationId: string): Promise { - return await connectOrganization(organizationId); + async connectWorkspace(workspaceId: string): Promise { + return await connectWorkspace(workspaceId); }, - async connectTask(organizationId: string, repoId: string, taskIdValue: string): Promise { - return await connectTask(organizationId, repoId, taskIdValue); + async connectTask(workspaceId: string, repoId: string, taskIdValue: string): Promise { + return await connectTask(workspaceId, repoId, taskIdValue); }, - async connectSandbox(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise { - return await connectSandbox(organizationId, sandboxProviderId, sandboxId); + async connectSandbox(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise { + return await connectSandbox(workspaceId, providerId, sandboxId); }, subscribeApp(listener: () => void): () => void { @@ -868,7 +759,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).skipAppStarterRepo({ sessionId }); + return await (await appWorkspace()).skipAppStarterRepo({ sessionId }); }, async starAppStarterRepo(organizationId: string): Promise { @@ -876,7 +767,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).starAppStarterRepo({ sessionId, organizationId }); + return await (await appWorkspace()).starAppStarterRepo({ sessionId, 
organizationId }); }, async selectAppOrganization(organizationId: string): Promise { @@ -884,15 +775,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).selectAppOrganization({ sessionId, organizationId }); - }, - - async setAppDefaultModel(defaultModel: WorkspaceModelId): Promise { - const sessionId = await getSessionId(); - if (!sessionId) { - throw new Error("No active auth session"); - } - return await (await appOrganization()).setAppDefaultModel({ sessionId, defaultModel }); + return await (await appWorkspace()).selectAppOrganization({ sessionId, organizationId }); }, async updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise { @@ -900,7 +783,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).updateAppOrganizationProfile({ + return await (await appWorkspace()).updateAppOrganizationProfile({ sessionId, organizationId: input.organizationId, displayName: input.displayName, @@ -914,7 +797,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).triggerAppRepoImport({ sessionId, organizationId }); + return await (await appWorkspace()).triggerAppRepoImport({ sessionId, organizationId }); }, async reconnectAppGithub(organizationId: string): Promise { @@ -922,7 +805,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - const response = await (await appOrganization()).beginAppGithubInstall({ sessionId, organizationId }); + const response = await (await appWorkspace()).beginAppGithubInstall({ sessionId, organizationId }); if (typeof window 
!== "undefined") { window.location.assign(response.url); } @@ -933,7 +816,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - const response = await (await appOrganization()).createAppCheckoutSession({ sessionId, organizationId, planId }); + const response = await (await appWorkspace()).createAppCheckoutSession({ sessionId, organizationId, planId }); if (typeof window !== "undefined") { window.location.assign(response.url); } @@ -944,7 +827,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - const response = await (await appOrganization()).createAppBillingPortalSession({ sessionId, organizationId }); + const response = await (await appWorkspace()).createAppBillingPortalSession({ sessionId, organizationId }); if (typeof window !== "undefined") { window.location.assign(response.url); } @@ -955,7 +838,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).cancelAppScheduledRenewal({ sessionId, organizationId }); + return await (await appWorkspace()).cancelAppScheduledRenewal({ sessionId, organizationId }); }, async resumeAppSubscription(organizationId: string): Promise { @@ -963,358 +846,329 @@ export function createBackendClient(options: BackendClientOptions): BackendClien if (!sessionId) { throw new Error("No active auth session"); } - return await (await appOrganization()).resumeAppSubscription({ sessionId, organizationId }); + return await (await appWorkspace()).resumeAppSubscription({ sessionId, organizationId }); }, - async recordAppSeatUsage(organizationId: string): Promise { + async recordAppSeatUsage(workspaceId: string): Promise { const sessionId = await getSessionId(); if (!sessionId) { throw new Error("No active auth session"); } 
- return await (await appOrganization()).recordAppSeatUsage({ sessionId, organizationId }); + return await (await appWorkspace()).recordAppSeatUsage({ sessionId, workspaceId }); }, - async listRepos(organizationId: string): Promise { - return (await organization(organizationId)).listRepos({ organizationId }); + async addRepo(workspaceId: string, remoteUrl: string): Promise { + return (await workspace(workspaceId)).addRepo({ workspaceId, remoteUrl }); + }, + + async listRepos(workspaceId: string): Promise { + return (await workspace(workspaceId)).listRepos({ workspaceId }); }, async createTask(input: CreateTaskInput): Promise { - return (await organization(input.organizationId)).createTask(input); + return (await workspace(input.workspaceId)).createTask(input); }, - async starSandboxAgentRepo(organizationId: string): Promise { - return (await organization(organizationId)).starSandboxAgentRepo({ organizationId }); + async starSandboxAgentRepo(workspaceId: string): Promise { + return (await workspace(workspaceId)).starSandboxAgentRepo({ workspaceId }); }, - async listTasks(organizationId: string, repoId?: string): Promise { - return (await organization(organizationId)).listTasks({ organizationId, repoId }); + async listTasks(workspaceId: string, repoId?: string): Promise { + return (await workspace(workspaceId)).listTasks({ workspaceId, repoId }); }, - async getRepoOverview(organizationId: string, repoId: string): Promise { - return (await organization(organizationId)).getRepoOverview({ organizationId, repoId }); + async getRepoOverview(workspaceId: string, repoId: string): Promise { + return (await workspace(workspaceId)).getRepoOverview({ workspaceId, repoId }); }, - async getTask(organizationId: string, repoId: string, taskId: string): Promise { - return (await organization(organizationId)).getTask({ - organizationId, - repoId, + async runRepoStackAction(input: RepoStackActionInput): Promise { + return (await workspace(input.workspaceId)).runRepoStackAction(input); 
+ }, + + async getTask(workspaceId: string, taskId: string): Promise { + return (await workspace(workspaceId)).getTask({ + workspaceId, taskId, }); }, async listHistory(input: HistoryQueryInput): Promise { - return (await organization(input.organizationId)).auditLog(input); + return (await workspace(input.workspaceId)).history(input); }, - async switchTask(organizationId: string, repoId: string, taskId: string): Promise { - return (await organization(organizationId)).switchTask({ repoId, taskId }); + async switchTask(workspaceId: string, taskId: string): Promise { + return (await workspace(workspaceId)).switchTask(taskId); }, - async attachTask(organizationId: string, repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> { - return (await organization(organizationId)).attachTask({ - organizationId, - repoId, + async attachTask(workspaceId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> { + return (await workspace(workspaceId)).attachTask({ + workspaceId, taskId, reason: "cli.attach", }); }, - async runAction(organizationId: string, repoId: string, taskId: string, action: TaskAction): Promise { + async runAction(workspaceId: string, taskId: string, action: TaskAction): Promise { if (action === "push") { - await (await organization(organizationId)).pushTask({ - organizationId, - repoId, + await (await workspace(workspaceId)).pushTask({ + workspaceId, taskId, reason: "cli.push", }); return; } if (action === "sync") { - await (await organization(organizationId)).syncTask({ - organizationId, - repoId, + await (await workspace(workspaceId)).syncTask({ + workspaceId, taskId, reason: "cli.sync", }); return; } if (action === "merge") { - await (await organization(organizationId)).mergeTask({ - organizationId, - repoId, + await (await workspace(workspaceId)).mergeTask({ + workspaceId, taskId, reason: "cli.merge", }); return; } if (action === "archive") { - await (await 
organization(organizationId)).archiveTask({ - organizationId, - repoId, + await (await workspace(workspaceId)).archiveTask({ + workspaceId, taskId, reason: "cli.archive", }); return; } - await (await organization(organizationId)).killTask({ - organizationId, - repoId, + await (await workspace(workspaceId)).killTask({ + workspaceId, taskId, reason: "cli.kill", }); }, async createSandboxSession(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; + workspaceId: string; + providerId: ProviderId; sandboxId: string; prompt: string; cwd?: string; agent?: AgentType | "opencode"; }): Promise<{ id: string; status: "running" | "idle" | "error" }> { - const created = await withSandboxHandle(input.organizationId, input.sandboxProviderId, input.sandboxId, async (handle) => + const created = await withSandboxHandle(input.workspaceId, input.providerId, input.sandboxId, async (handle) => handle.createSession({ - agent: input.agent ?? "claude", - sessionInit: { - cwd: input.cwd, - }, + prompt: input.prompt, + cwd: input.cwd, + agent: input.agent, }), ); - if (input.prompt.trim().length > 0) { - await withSandboxHandle(input.organizationId, input.sandboxProviderId, input.sandboxId, async (handle) => - handle.rawSendSessionMethod(created.id, "session/prompt", { - prompt: [{ type: "text", text: input.prompt }], - }), - ); + if (!created.id) { + throw new Error(created.error ?? "sandbox session creation failed"); } return { id: created.id, - status: "idle", + status: created.status, }; }, async listSandboxSessions( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, input?: { cursor?: string; limit?: number }, ): Promise<{ items: SandboxSessionRecord[]; nextCursor?: string }> { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.listSessions(input ?? 
{})); + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.listSessions(input ?? {})); }, async listSandboxSessionEvents( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, input: { sessionId: string; cursor?: string; limit?: number }, ): Promise<{ items: SandboxSessionEventRecord[]; nextCursor?: string }> { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.getEvents(input)); + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.listSessionEvents(input)); }, async createSandboxProcess(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; + workspaceId: string; + providerId: ProviderId; sandboxId: string; request: ProcessCreateRequest; }): Promise { - return await withSandboxHandle(input.organizationId, input.sandboxProviderId, input.sandboxId, async (handle) => handle.createProcess(input.request)); + return await withSandboxHandle(input.workspaceId, input.providerId, input.sandboxId, async (handle) => handle.createProcess(input.request)); }, - async listSandboxProcesses( - organizationId: string, - sandboxProviderId: SandboxProviderId, - sandboxId: string, - ): Promise<{ processes: SandboxProcessRecord[] }> { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.listProcesses()); + async listSandboxProcesses(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise<{ processes: SandboxProcessRecord[] }> { + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.listProcesses()); }, async getSandboxProcessLogs( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessLogFollowQuery, ): Promise { - return await 
withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.getProcessLogs(processId, query)); + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.getProcessLogs({ processId, query })); }, async stopSandboxProcess( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessSignalQuery, ): Promise { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.stopProcess(processId, query)); + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.stopProcess({ processId, query })); }, async killSandboxProcess( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, processId: string, query?: ProcessSignalQuery, ): Promise { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.killProcess(processId, query)); + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.killProcess({ processId, query })); }, - async deleteSandboxProcess(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string, processId: string): Promise { - await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.deleteProcess(processId)); + async deleteSandboxProcess(workspaceId: string, providerId: ProviderId, sandboxId: string, processId: string): Promise { + await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.deleteProcess({ processId })); }, - subscribeSandboxProcesses(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string, listener: () => void): () => void { - return subscribeSandboxProcesses(organizationId, sandboxProviderId, sandboxId, listener); + 
subscribeSandboxProcesses(workspaceId: string, providerId: ProviderId, sandboxId: string, listener: () => void): () => void { + return subscribeSandboxProcesses(workspaceId, providerId, sandboxId, listener); }, async sendSandboxPrompt(input: { - organizationId: string; - sandboxProviderId: SandboxProviderId; + workspaceId: string; + providerId: ProviderId; sandboxId: string; sessionId: string; prompt: string; notification?: boolean; }): Promise { - await withSandboxHandle(input.organizationId, input.sandboxProviderId, input.sandboxId, async (handle) => - handle.rawSendSessionMethod(input.sessionId, "session/prompt", { - prompt: [{ type: "text", text: input.prompt }], + await withSandboxHandle(input.workspaceId, input.providerId, input.sandboxId, async (handle) => + handle.sendPrompt({ + sessionId: input.sessionId, + prompt: input.prompt, + notification: input.notification, }), ); }, async sandboxSessionStatus( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, sessionId: string, ): Promise<{ id: string; status: "running" | "idle" | "error" }> { - return { - id: sessionId, - status: "idle", - }; + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.sessionStatus({ sessionId })); }, async sandboxProviderState( - organizationId: string, - sandboxProviderId: SandboxProviderId, + workspaceId: string, + providerId: ProviderId, sandboxId: string, - ): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }> { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.providerState()); + ): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }> { + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.providerState()); }, - async getSandboxAgentConnection( - organizationId: string, - sandboxProviderId: 
SandboxProviderId, - sandboxId: string, - ): Promise<{ endpoint: string; token?: string }> { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.sandboxAgentConnection()); + async getSandboxAgentConnection(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise<{ endpoint: string; token?: string }> { + return await withSandboxHandle(workspaceId, providerId, sandboxId, async (handle) => handle.sandboxAgentConnection()); }, - async getSandboxWorkspaceModelGroups(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise { - return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.listWorkspaceModelGroups()); + async getWorkspaceSummary(workspaceId: string): Promise { + return (await workspace(workspaceId)).getWorkspaceSummary({ workspaceId }); }, - async getOrganizationSummary(organizationId: string): Promise { - return (await organization(organizationId)).getOrganizationSummary({ organizationId }); + async getTaskDetail(workspaceId: string, repoId: string, taskIdValue: string): Promise { + return (await task(workspaceId, repoId, taskIdValue)).getTaskDetail(); }, - async getTaskDetail(organizationId: string, repoId: string, taskIdValue: string): Promise { - return await getTaskDetailWithAuth(organizationId, repoId, taskIdValue); + async getSessionDetail(workspaceId: string, repoId: string, taskIdValue: string, sessionId: string): Promise { + return (await task(workspaceId, repoId, taskIdValue)).getSessionDetail({ sessionId }); }, - async getSessionDetail(organizationId: string, repoId: string, taskIdValue: string, sessionId: string): Promise { - return await getSessionDetailWithAuth(organizationId, repoId, taskIdValue, sessionId); + async getWorkbench(workspaceId: string): Promise { + return await getWorkbenchCompat(workspaceId); }, - async getWorkspace(organizationId: string): Promise { - return await 
getWorkspaceCompat(organizationId); + subscribeWorkbench(workspaceId: string, listener: () => void): () => void { + return subscribeWorkbench(workspaceId, listener); }, - subscribeWorkspace(organizationId: string, listener: () => void): () => void { - return subscribeWorkspace(organizationId, listener); + async createWorkbenchTask(workspaceId: string, input: TaskWorkbenchCreateTaskInput): Promise { + return (await workspace(workspaceId)).createWorkbenchTask(input); }, - async createWorkspaceTask(organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise { - return (await organization(organizationId)).createWorkspaceTask(await withAuthSessionInput(input)); + async markWorkbenchUnread(workspaceId: string, input: TaskWorkbenchSelectInput): Promise { + await (await workspace(workspaceId)).markWorkbenchUnread(input); }, - async markWorkspaceUnread(organizationId: string, input: TaskWorkspaceSelectInput): Promise { - await (await organization(organizationId)).markWorkspaceUnread(await withAuthSessionInput(input)); + async renameWorkbenchTask(workspaceId: string, input: TaskWorkbenchRenameInput): Promise { + await (await workspace(workspaceId)).renameWorkbenchTask(input); }, - async renameWorkspaceTask(organizationId: string, input: TaskWorkspaceRenameInput): Promise { - await (await organization(organizationId)).renameWorkspaceTask(await withAuthSessionInput(input)); + async renameWorkbenchBranch(workspaceId: string, input: TaskWorkbenchRenameInput): Promise { + await (await workspace(workspaceId)).renameWorkbenchBranch(input); }, - async createWorkspaceSession(organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> { - return await (await organization(organizationId)).createWorkspaceSession(await withAuthSessionInput(input)); + async createWorkbenchSession(workspaceId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ tabId: string }> { + return await (await 
workspace(workspaceId)).createWorkbenchSession(input); }, - async renameWorkspaceSession(organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise { - await (await organization(organizationId)).renameWorkspaceSession(await withAuthSessionInput(input)); + async renameWorkbenchSession(workspaceId: string, input: TaskWorkbenchRenameSessionInput): Promise { + await (await workspace(workspaceId)).renameWorkbenchSession(input); }, - async selectWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise { - await (await organization(organizationId)).selectWorkspaceSession(await withAuthSessionInput(input)); + async setWorkbenchSessionUnread(workspaceId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise { + await (await workspace(workspaceId)).setWorkbenchSessionUnread(input); }, - async setWorkspaceSessionUnread(organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise { - await (await organization(organizationId)).setWorkspaceSessionUnread(await withAuthSessionInput(input)); + async updateWorkbenchDraft(workspaceId: string, input: TaskWorkbenchUpdateDraftInput): Promise { + await (await workspace(workspaceId)).updateWorkbenchDraft(input); }, - async updateWorkspaceDraft(organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise { - await (await organization(organizationId)).updateWorkspaceDraft(await withAuthSessionInput(input)); + async changeWorkbenchModel(workspaceId: string, input: TaskWorkbenchChangeModelInput): Promise { + await (await workspace(workspaceId)).changeWorkbenchModel(input); }, - async changeWorkspaceModel(organizationId: string, input: TaskWorkspaceChangeModelInput): Promise { - await (await organization(organizationId)).changeWorkspaceModel(await withAuthSessionInput(input)); + async sendWorkbenchMessage(workspaceId: string, input: TaskWorkbenchSendMessageInput): Promise { + await (await workspace(workspaceId)).sendWorkbenchMessage(input); }, - async 
sendWorkspaceMessage(organizationId: string, input: TaskWorkspaceSendMessageInput): Promise { - await (await organization(organizationId)).sendWorkspaceMessage(await withAuthSessionInput(input)); + async stopWorkbenchSession(workspaceId: string, input: TaskWorkbenchTabInput): Promise { + await (await workspace(workspaceId)).stopWorkbenchSession(input); }, - async stopWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise { - await (await organization(organizationId)).stopWorkspaceSession(await withAuthSessionInput(input)); + async closeWorkbenchSession(workspaceId: string, input: TaskWorkbenchTabInput): Promise { + await (await workspace(workspaceId)).closeWorkbenchSession(input); }, - async closeWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise { - await (await organization(organizationId)).closeWorkspaceSession(await withAuthSessionInput(input)); + async publishWorkbenchPr(workspaceId: string, input: TaskWorkbenchSelectInput): Promise { + await (await workspace(workspaceId)).publishWorkbenchPr(input); }, - async publishWorkspacePr(organizationId: string, input: TaskWorkspaceSelectInput): Promise { - await (await organization(organizationId)).publishWorkspacePr(await withAuthSessionInput(input)); - }, - - async changeWorkspaceTaskOwner(organizationId: string, input: TaskWorkspaceChangeOwnerInput): Promise { - await (await organization(organizationId)).changeWorkspaceTaskOwner(await withAuthSessionInput(input)); - }, - - async revertWorkspaceFile(organizationId: string, input: TaskWorkspaceDiffInput): Promise { - await (await organization(organizationId)).revertWorkspaceFile(await withAuthSessionInput(input)); - }, - - async adminReloadGithubOrganization(organizationId: string): Promise { - await (await organization(organizationId)).adminReloadGithubOrganization(); - }, - - async adminReloadGithubRepository(organizationId: string, repoId: string): Promise { - await (await 
organization(organizationId)).adminReloadGithubRepository({ repoId }); + async revertWorkbenchFile(workspaceId: string, input: TaskWorkbenchDiffInput): Promise { + await (await workspace(workspaceId)).revertWorkbenchFile(input); }, async health(): Promise<{ ok: true }> { - const organizationId = options.defaultOrganizationId; - if (!organizationId) { - throw new Error("Backend client default organization is required for health checks"); + const workspaceId = options.defaultWorkspaceId; + if (!workspaceId) { + throw new Error("Backend client default workspace is required for health checks"); } - await (await organization(organizationId)).useOrganization({ - organizationId, + await (await workspace(workspaceId)).useWorkspace({ + workspaceId, }); return { ok: true }; }, - async useOrganization(organizationId: string): Promise<{ organizationId: string }> { - return (await organization(organizationId)).useOrganization({ organizationId }); + async useWorkspace(workspaceId: string): Promise<{ workspaceId: string }> { + return (await workspace(workspaceId)).useWorkspace({ workspaceId }); }, }; } diff --git a/foundry/packages/client/src/index.ts b/foundry/packages/client/src/index.ts index e28745f..7605986 100644 --- a/foundry/packages/client/src/index.ts +++ b/foundry/packages/client/src/index.ts @@ -1,11 +1,11 @@ export * from "./app-client.js"; export * from "./backend-client.js"; -export * from "./subscription/manager.js"; -export * from "./subscription/mock-manager.js"; -export * from "./subscription/remote-manager.js"; -export * from "./subscription/topics.js"; -export * from "./subscription/use-subscription.js"; +export * from "./interest/manager.js"; +export * from "./interest/mock-manager.js"; +export * from "./interest/remote-manager.js"; +export * from "./interest/topics.js"; +export * from "./interest/use-interest.js"; export * from "./keys.js"; export * from "./mock-app.js"; export * from "./view-model.js"; -export * from "./workspace-client.js"; +export * from 
"./workbench-client.js"; diff --git a/foundry/packages/client/src/subscription/manager.ts b/foundry/packages/client/src/interest/manager.ts similarity index 72% rename from foundry/packages/client/src/subscription/manager.ts rename to foundry/packages/client/src/interest/manager.ts index b9bee0b..b2aab57 100644 --- a/foundry/packages/client/src/subscription/manager.ts +++ b/foundry/packages/client/src/interest/manager.ts @@ -2,14 +2,6 @@ import type { TopicData, TopicKey, TopicParams } from "./topics.js"; export type TopicStatus = "loading" | "connected" | "error"; -export interface DebugSubscriptionTopic { - topicKey: TopicKey; - cacheKey: string; - listenerCount: number; - status: TopicStatus; - lastRefreshAt: number | null; -} - export interface TopicState { data: TopicData | undefined; status: TopicStatus; @@ -17,17 +9,16 @@ export interface TopicState { } /** - * The SubscriptionManager owns all realtime actor connections and cached state. + * The InterestManager owns all realtime actor connections and cached state. * * Multiple subscribers to the same topic share one connection and one cache * entry. After the last subscriber leaves, a short grace period keeps the * connection warm so navigation does not thrash actor connections. 
*/ -export interface SubscriptionManager { +export interface InterestManager { subscribe<K extends TopicKey>(topicKey: K, params: TopicParams<K>, listener: () => void): () => void; getSnapshot<K extends TopicKey>(topicKey: K, params: TopicParams<K>): TopicData<K> | undefined; getStatus<K extends TopicKey>(topicKey: K, params: TopicParams<K>): TopicStatus; getError<K extends TopicKey>(topicKey: K, params: TopicParams<K>): Error | null; - listDebugTopics(): DebugSubscriptionTopic[]; dispose(): void; } diff --git a/foundry/packages/client/src/interest/mock-manager.ts b/foundry/packages/client/src/interest/mock-manager.ts new file mode 100644 index 0000000..f1c065e --- /dev/null +++ b/foundry/packages/client/src/interest/mock-manager.ts @@ -0,0 +1,12 @@ +import { createMockBackendClient } from "../mock/backend-client.js"; +import { RemoteInterestManager } from "./remote-manager.js"; + +/** + * Mock implementation shares the same interest-manager harness as the remote + * path, but uses the in-memory mock backend that synthesizes actor events. + */ +export class MockInterestManager extends RemoteInterestManager { + constructor() { + super(createMockBackendClient()); + } +} diff --git a/foundry/packages/client/src/subscription/remote-manager.ts b/foundry/packages/client/src/interest/remote-manager.ts similarity index 58% rename from foundry/packages/client/src/subscription/remote-manager.ts rename to foundry/packages/client/src/interest/remote-manager.ts index ae774c6..3016ad0 100644 --- a/foundry/packages/client/src/subscription/remote-manager.ts +++ b/foundry/packages/client/src/interest/remote-manager.ts @@ -1,19 +1,14 @@ import type { BackendClient } from "../backend-client.js"; -import type { DebugSubscriptionTopic, SubscriptionManager, TopicStatus } from "./manager.js"; +import type { InterestManager, TopicStatus } from "./manager.js"; import { topicDefinitions, type TopicData, type TopicDefinition, type TopicKey, type TopicParams } from "./topics.js"; const GRACE_PERIOD_MS = 30_000; -/** Initial retry delay in ms. 
*/ -const RETRY_BASE_MS = 1_000; -/** Maximum retry delay in ms. */ -const RETRY_MAX_MS = 30_000; - /** - * Remote implementation of SubscriptionManager. + * Remote implementation of InterestManager. * Each cache entry owns one actor connection plus one materialized snapshot. */ -export class RemoteSubscriptionManager implements SubscriptionManager { +export class RemoteInterestManager implements InterestManager { private entries = new Map>(); constructor(private readonly backend: BackendClient) {} @@ -24,7 +19,7 @@ export class RemoteSubscriptionManager implements SubscriptionManager { let entry = this.entries.get(cacheKey); if (!entry) { - entry = new TopicEntry(topicKey, cacheKey, definition, this.backend, params as any); + entry = new TopicEntry(definition, this.backend, params as any); this.entries.set(cacheKey, entry); } @@ -58,13 +53,6 @@ export class RemoteSubscriptionManager implements SubscriptionManager { return this.entries.get((topicDefinitions[topicKey] as any).key(params))?.error ?? 
null; } - listDebugTopics(): DebugSubscriptionTopic[] { - return [...this.entries.values()] - .filter((entry) => entry.listenerCount > 0) - .map((entry) => entry.getDebugTopic()) - .sort((left, right) => left.cacheKey.localeCompare(right.cacheKey)); - } - dispose(): void { for (const entry of this.entries.values()) { entry.dispose(); @@ -78,38 +66,21 @@ class TopicEntry<TParams, TData, TEvent> { status: TopicStatus = "loading"; error: Error | null = null; listenerCount = 0; - lastRefreshAt: number | null = null; private readonly listeners = new Set<() => void>(); private conn: Awaited<ReturnType<TopicDefinition<TParams, TData, TEvent>["connect"]>> | null = null; private unsubscribeEvent: (() => void) | null = null; private unsubscribeError: (() => void) | null = null; private teardownTimer: ReturnType<typeof setTimeout> | null = null; - private retryTimer: ReturnType<typeof setTimeout> | null = null; - private retryAttempt = 0; private startPromise: Promise<void> | null = null; - private eventPromise: Promise<void> = Promise.resolve(); private started = false; - private disposed = false; constructor( - private readonly topicKey: TopicKey, - private readonly cacheKey: string, private readonly definition: TopicDefinition<TParams, TData, TEvent>, private readonly backend: BackendClient, private readonly params: TParams, ) {} - getDebugTopic(): DebugSubscriptionTopic { - return { - topicKey: this.topicKey, - cacheKey: this.cacheKey, - listenerCount: this.listenerCount, - status: this.status, - lastRefreshAt: this.lastRefreshAt, - }; - } - addListener(listener: () => void): void { this.listeners.add(listener); this.listenerCount = this.listeners.size; @@ -144,9 +115,7 @@ class TopicEntry<TParams, TData, TEvent> { } dispose(): void { - this.disposed = true; this.cancelTeardown(); - this.cancelRetry(); this.unsubscribeEvent?.(); this.unsubscribeError?.(); if (this.conn) { @@ -156,57 +125,7 @@ class TopicEntry<TParams, TData, TEvent> { this.data = undefined; this.status = "loading"; this.error = null; - this.lastRefreshAt = null; this.started = false; - this.retryAttempt = 0; - } - - private cancelRetry(): void { - if (this.retryTimer) { - 
clearTimeout(this.retryTimer); - this.retryTimer = null; - } - } - - /** - * Schedules a retry with exponential backoff. Cleans up any existing - * connection state before reconnecting. - */ - private scheduleRetry(): void { - if (this.disposed || this.listenerCount === 0) { - return; - } - - const delay = Math.min(RETRY_BASE_MS * 2 ** this.retryAttempt, RETRY_MAX_MS); - this.retryAttempt++; - - this.retryTimer = setTimeout(() => { - this.retryTimer = null; - if (this.disposed || this.listenerCount === 0) { - return; - } - - // Tear down the old connection before retrying - this.cleanupConnection(); - this.started = false; - this.startPromise = this.start().finally(() => { - this.startPromise = null; - }); - }, delay); - } - - /** - * Cleans up connection resources without resetting data/status/retry state. - */ - private cleanupConnection(): void { - this.unsubscribeEvent?.(); - this.unsubscribeError?.(); - this.unsubscribeEvent = null; - this.unsubscribeError = null; - if (this.conn) { - void this.conn.dispose(); - } - this.conn = null; } private async start(): Promise<void> { @@ -217,56 +136,29 @@ class TopicEntry<TParams, TData, TEvent> { try { this.conn = await this.definition.connect(this.backend, this.params); this.unsubscribeEvent = this.conn.on(this.definition.event, (event: TEvent) => { - void this.applyEvent(event); + if (this.data === undefined) { + return; + } + this.data = this.definition.applyEvent(this.data, event); + this.notify(); }); this.unsubscribeError = this.conn.onError((error: unknown) => { this.status = "error"; this.error = error instanceof Error ? error : new Error(String(error)); this.notify(); - this.scheduleRetry(); }); this.data = await this.definition.fetchInitial(this.backend, this.params); this.status = "connected"; - this.lastRefreshAt = Date.now(); this.started = true; - this.retryAttempt = 0; this.notify(); } catch (error) { this.status = "error"; this.error = error instanceof Error ? error : new Error(String(error)); this.started = false; this.notify(); - this.scheduleRetry(); } } - private applyEvent(event: TEvent): Promise<void> { - this.eventPromise = this.eventPromise - .then(async () => { - if (!this.started || this.data === undefined) { - return; - } - - const nextData = await this.definition.applyEvent(this.backend, this.params, this.data, event); - if (!this.started) { - return; - } - - this.data = nextData; - this.status = "connected"; - this.error = null; - this.lastRefreshAt = Date.now(); - this.notify(); - }) - .catch((error) => { - this.status = "error"; - this.error = error instanceof Error ? error : new Error(String(error)); - this.notify(); - }); - - return this.eventPromise; - } - private notify(): void { for (const listener of [...this.listeners]) { listener(); diff --git a/foundry/packages/client/src/interest/topics.ts b/foundry/packages/client/src/interest/topics.ts new file mode 100644 index 0000000..a111248 --- /dev/null +++ b/foundry/packages/client/src/interest/topics.ts @@ -0,0 +1,131 @@ +import type { + AppEvent, + FoundryAppSnapshot, + ProviderId, + SandboxProcessesEvent, + SessionEvent, + TaskEvent, + WorkbenchSessionDetail, + WorkbenchTaskDetail, + WorkspaceEvent, + WorkspaceSummarySnapshot, +} from "@sandbox-agent/foundry-shared"; +import type { ActorConn, BackendClient, SandboxProcessRecord } from "../backend-client.js"; + +/** + * Topic definitions for the interest manager. + * + * Each topic describes one actor connection plus one materialized read model. + * Events always carry full replacement payloads for the changed entity so the + * client can replace cached state directly instead of reconstructing patches. 
+ */ +export interface TopicDefinition<TParams, TData, TEvent> { + key: (params: TParams) => string; + event: string; + connect: (backend: BackendClient, params: TParams) => Promise<ActorConn>; + fetchInitial: (backend: BackendClient, params: TParams) => Promise<TData>; + applyEvent: (current: TData, event: TEvent) => TData; +} + +export interface AppTopicParams {} +export interface WorkspaceTopicParams { + workspaceId: string; +} +export interface TaskTopicParams { + workspaceId: string; + repoId: string; + taskId: string; +} +export interface SessionTopicParams { + workspaceId: string; + repoId: string; + taskId: string; + sessionId: string; +} +export interface SandboxProcessesTopicParams { + workspaceId: string; + providerId: ProviderId; + sandboxId: string; +} + +function upsertById<T extends { id: string }>(items: T[], nextItem: T, sort: (left: T, right: T) => number): T[] { + const filtered = items.filter((item) => item.id !== nextItem.id); + return [...filtered, nextItem].sort(sort); +} + +export const topicDefinitions = { + app: { + key: () => "app", + event: "appUpdated", + connect: (backend: BackendClient, _params: AppTopicParams) => backend.connectWorkspace("app"), + fetchInitial: (backend: BackendClient, _params: AppTopicParams) => backend.getAppSnapshot(), + applyEvent: (_current: FoundryAppSnapshot, event: AppEvent) => event.snapshot, + } satisfies TopicDefinition<AppTopicParams, FoundryAppSnapshot, AppEvent>, + + workspace: { + key: (params: WorkspaceTopicParams) => `workspace:${params.workspaceId}`, + event: "workspaceUpdated", + connect: (backend: BackendClient, params: WorkspaceTopicParams) => backend.connectWorkspace(params.workspaceId), + fetchInitial: (backend: BackendClient, params: WorkspaceTopicParams) => backend.getWorkspaceSummary(params.workspaceId), + applyEvent: (current: WorkspaceSummarySnapshot, event: WorkspaceEvent) => { + switch (event.type) { + case "taskSummaryUpdated": + return { + ...current, + taskSummaries: upsertById(current.taskSummaries, event.taskSummary, (left, right) => right.updatedAtMs - left.updatedAtMs), + }; + case 
"taskRemoved": + return { + ...current, + taskSummaries: current.taskSummaries.filter((task) => task.id !== event.taskId), + }; + case "repoAdded": + case "repoUpdated": + return { + ...current, + repos: upsertById(current.repos, event.repo, (left, right) => right.latestActivityMs - left.latestActivityMs), + }; + case "repoRemoved": + return { + ...current, + repos: current.repos.filter((repo) => repo.id !== event.repoId), + }; + } + }, + } satisfies TopicDefinition<WorkspaceTopicParams, WorkspaceSummarySnapshot, WorkspaceEvent>, + + task: { + key: (params: TaskTopicParams) => `task:${params.workspaceId}:${params.taskId}`, + event: "taskUpdated", + connect: (backend: BackendClient, params: TaskTopicParams) => backend.connectTask(params.workspaceId, params.repoId, params.taskId), + fetchInitial: (backend: BackendClient, params: TaskTopicParams) => backend.getTaskDetail(params.workspaceId, params.repoId, params.taskId), + applyEvent: (_current: WorkbenchTaskDetail, event: TaskEvent) => event.detail, + } satisfies TopicDefinition<TaskTopicParams, WorkbenchTaskDetail, TaskEvent>, + + session: { + key: (params: SessionTopicParams) => `session:${params.workspaceId}:${params.taskId}:${params.sessionId}`, + event: "sessionUpdated", + connect: (backend: BackendClient, params: SessionTopicParams) => backend.connectTask(params.workspaceId, params.repoId, params.taskId), + fetchInitial: (backend: BackendClient, params: SessionTopicParams) => + backend.getSessionDetail(params.workspaceId, params.repoId, params.taskId, params.sessionId), + applyEvent: (current: WorkbenchSessionDetail, event: SessionEvent) => { + if (event.session.sessionId !== current.sessionId) { + return current; + } + return event.session; + }, + } satisfies TopicDefinition<SessionTopicParams, WorkbenchSessionDetail, SessionEvent>, + + sandboxProcesses: { + key: (params: SandboxProcessesTopicParams) => `sandbox:${params.workspaceId}:${params.providerId}:${params.sandboxId}`, + event: "processesUpdated", + connect: (backend: BackendClient, params: SandboxProcessesTopicParams) => backend.connectSandbox(params.workspaceId, params.providerId, params.sandboxId), + fetchInitial: 
async (backend: BackendClient, params: SandboxProcessesTopicParams) => + (await backend.listSandboxProcesses(params.workspaceId, params.providerId, params.sandboxId)).processes, + applyEvent: (_current: SandboxProcessRecord[], event: SandboxProcessesEvent) => event.processes, + } satisfies TopicDefinition<SandboxProcessesTopicParams, SandboxProcessRecord[], SandboxProcessesEvent>, +} as const; + +export type TopicKey = keyof typeof topicDefinitions; +export type TopicParams<K extends TopicKey> = Parameters<(typeof topicDefinitions)[K]["fetchInitial"]>[1]; +export type TopicData<K extends TopicKey> = Awaited<ReturnType<(typeof topicDefinitions)[K]["fetchInitial"]>>; diff --git a/foundry/packages/client/src/subscription/use-subscription.ts b/foundry/packages/client/src/interest/use-interest.ts similarity index 85% rename from foundry/packages/client/src/subscription/use-subscription.ts rename to foundry/packages/client/src/interest/use-interest.ts index c83148a..4ffd733 100644 --- a/foundry/packages/client/src/subscription/use-subscription.ts +++ b/foundry/packages/client/src/interest/use-interest.ts @@ -1,14 +1,14 @@ import { useMemo, useRef, useSyncExternalStore } from "react"; -import type { SubscriptionManager, TopicState } from "./manager.js"; +import type { InterestManager, TopicState } from "./manager.js"; import { topicDefinitions, type TopicKey, type TopicParams } from "./topics.js"; /** - * React bridge for the subscription manager. + * React bridge for the interest manager. * * `null` params disable the subscription entirely, which is how screens express - * conditional subscription in task/session/sandbox topics. + * conditional interest in task/session/sandbox topics. */ -export function useSubscription<K extends TopicKey>(manager: SubscriptionManager, topicKey: K, params: TopicParams<K> | null): TopicState<K> { +export function useInterest<K extends TopicKey>(manager: InterestManager, topicKey: K, params: TopicParams<K> | null): TopicState<K> { const paramsKey = params ? 
(topicDefinitions[topicKey] as any).key(params) : null; const paramsRef = useRef | null>(params); paramsRef.current = params; diff --git a/foundry/packages/client/src/keys.ts b/foundry/packages/client/src/keys.ts index 7242aae..f6b210e 100644 --- a/foundry/packages/client/src/keys.ts +++ b/foundry/packages/client/src/keys.ts @@ -1,17 +1,34 @@ export type ActorKey = string[]; -export function organizationKey(organizationId: string): ActorKey { - return ["org", organizationId]; +export function workspaceKey(workspaceId: string): ActorKey { + return ["ws", workspaceId]; } -export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey { - return ["org", organizationId, "task", repoId, taskId]; +export function projectKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId]; } -export function taskSandboxKey(organizationId: string, sandboxId: string): ActorKey { - return ["org", organizationId, "sandbox", sandboxId]; +export function taskKey(workspaceId: string, repoId: string, taskId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "task", taskId]; } -export function auditLogKey(organizationId: string): ActorKey { - return ["org", organizationId, "audit-log"]; +export function sandboxInstanceKey(workspaceId: string, providerId: string, sandboxId: string): ActorKey { + return ["ws", workspaceId, "provider", providerId, "sandbox", sandboxId]; +} + +export function historyKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "history"]; +} + +export function projectPrSyncKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "pr-sync"]; +} + +export function projectBranchSyncKey(workspaceId: string, repoId: string): ActorKey { + return ["ws", workspaceId, "project", repoId, "branch-sync"]; +} + +export function taskStatusSyncKey(workspaceId: string, repoId: string, taskId: string, 
sandboxId: string, sessionId: string): ActorKey { + // Include sandbox + session so multiple sandboxes/sessions can be tracked per task. + return ["ws", workspaceId, "project", repoId, "task", taskId, "status-sync", sandboxId, sessionId]; } diff --git a/foundry/packages/client/src/mock-app.ts b/foundry/packages/client/src/mock-app.ts index 00fd9ca..0cf499d 100644 --- a/foundry/packages/client/src/mock-app.ts +++ b/foundry/packages/client/src/mock-app.ts @@ -1,8 +1,3 @@ -import { DEFAULT_WORKSPACE_MODEL_GROUPS, DEFAULT_WORKSPACE_MODEL_ID, type WorkspaceModelId } from "@sandbox-agent/foundry-shared"; - -const claudeModels = DEFAULT_WORKSPACE_MODEL_GROUPS.find((group) => group.agentKind === "Claude")?.models ?? []; -const CLAUDE_SECONDARY_MODEL_ID = claudeModels[1]?.id ?? claudeModels[0]?.id ?? DEFAULT_WORKSPACE_MODEL_ID; -const CLAUDE_TERTIARY_MODEL_ID = claudeModels[2]?.id ?? CLAUDE_SECONDARY_MODEL_ID; import { injectMockLatency } from "./mock/latency.js"; import rivetDevFixture from "../../../scripts/data/rivet-dev.json" with { type: "json" }; @@ -20,7 +15,6 @@ export interface MockFoundryUser { githubLogin: string; roleLabel: string; eligibleOrganizationIds: string[]; - defaultModel: WorkspaceModelId; } export interface MockFoundryOrganizationMember { @@ -57,8 +51,6 @@ export interface MockFoundryGithubState { importedRepoCount: number; lastSyncLabel: string; lastSyncAt: number | null; - lastWebhookAt: number | null; - lastWebhookEvent: string; } export interface MockFoundryOrganizationSettings { @@ -66,12 +58,13 @@ export interface MockFoundryOrganizationSettings { slug: string; primaryDomain: string; seatAccrualMode: "first_prompt"; + defaultModel: "claude-sonnet-4" | "claude-opus-4" | "gpt-4o" | "o3"; autoImportRepos: boolean; } export interface MockFoundryOrganization { id: string; - organizationId: string; + workspaceId: string; kind: MockOrganizationKind; settings: MockFoundryOrganizationSettings; github: MockFoundryGithubState; @@ -115,7 +108,6 @@ export 
interface MockFoundryAppClient { skipStarterRepo(): Promise<void>; starStarterRepo(organizationId: string): Promise<void>; selectOrganization(organizationId: string): Promise<void>; - setDefaultModel(model: WorkspaceModelId): Promise<void>; updateOrganizationProfile(input: UpdateMockOrganizationProfileInput): Promise<void>; triggerGithubSync(organizationId: string): Promise<void>; completeHostedCheckout(organizationId: string, planId: MockBillingPlanId): Promise<void>; @@ -123,7 +115,7 @@ export interface MockFoundryAppClient { cancelScheduledRenewal(organizationId: string): Promise<void>; resumeSubscription(organizationId: string): Promise<void>; reconnectGithub(organizationId: string): Promise<void>; - recordSeatUsage(organizationId: string): void; + recordSeatUsage(workspaceId: string): void; } const STORAGE_KEY = "sandbox-agent-foundry:mock-app:v1"; @@ -178,13 +170,14 @@ function buildRivetOrganization(): MockFoundryOrganization { return { id: "rivet", - organizationId: "rivet", + workspaceId: "rivet", kind: "organization", settings: { displayName: rivetDevFixture.name ?? 
rivetDevFixture.login, slug: "rivet", primaryDomain: "rivet.dev", seatAccrualMode: "first_prompt", + defaultModel: "o3", autoImportRepos: true, }, github: { @@ -194,8 +187,6 @@ function buildRivetOrganization(): MockFoundryOrganization { importedRepoCount: repos.length, lastSyncLabel: "Synced just now", lastSyncAt: Date.now() - 60_000, - lastWebhookAt: Date.now() - 30_000, - lastWebhookEvent: "push", }, billing: { planId: "team", @@ -237,7 +228,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { githubLogin: "nathan", roleLabel: "Founder", eligibleOrganizationIds: ["personal-nathan", "acme", "rivet"], - defaultModel: DEFAULT_WORKSPACE_MODEL_ID, }, { id: "user-maya", @@ -246,7 +236,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { githubLogin: "maya", roleLabel: "Staff Engineer", eligibleOrganizationIds: ["acme"], - defaultModel: CLAUDE_SECONDARY_MODEL_ID, }, { id: "user-jamie", @@ -255,19 +244,19 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { githubLogin: "jamie", roleLabel: "Platform Lead", eligibleOrganizationIds: ["personal-jamie", "rivet"], - defaultModel: CLAUDE_TERTIARY_MODEL_ID, }, ], organizations: [ { id: "personal-nathan", - organizationId: "personal-nathan", + workspaceId: "personal-nathan", kind: "personal", settings: { displayName: "Nathan", slug: "nathan", primaryDomain: "personal", seatAccrualMode: "first_prompt", + defaultModel: "claude-sonnet-4", autoImportRepos: true, }, github: { @@ -277,8 +266,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { importedRepoCount: 1, lastSyncLabel: "Synced just now", lastSyncAt: Date.now() - 60_000, - lastWebhookAt: Date.now() - 120_000, - lastWebhookEvent: "pull_request.opened", }, billing: { planId: "free", @@ -296,13 +283,14 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { }, { id: "acme", - organizationId: "acme", + workspaceId: "acme", kind: "organization", settings: { displayName: "Acme", slug: "acme", primaryDomain: "acme.dev", seatAccrualMode: 
"first_prompt", + defaultModel: "claude-sonnet-4", autoImportRepos: true, }, github: { @@ -312,8 +300,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { importedRepoCount: 3, lastSyncLabel: "Waiting for first import", lastSyncAt: null, - lastWebhookAt: null, - lastWebhookEvent: "", }, billing: { planId: "team", @@ -340,13 +326,14 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { buildRivetOrganization(), { id: "personal-jamie", - organizationId: "personal-jamie", + workspaceId: "personal-jamie", kind: "personal", settings: { displayName: "Jamie", slug: "jamie", primaryDomain: "personal", seatAccrualMode: "first_prompt", + defaultModel: "claude-opus-4", autoImportRepos: true, }, github: { @@ -356,8 +343,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot { importedRepoCount: 1, lastSyncLabel: "Synced yesterday", lastSyncAt: Date.now() - 24 * 60 * 60_000, - lastWebhookAt: Date.now() - 3_600_000, - lastWebhookEvent: "check_run.completed", }, billing: { planId: "free", @@ -411,8 +396,6 @@ function parseStoredSnapshot(): MockFoundryAppSnapshot | null { ...organization.github, syncStatus: syncStatusFromLegacy(organization.github?.syncStatus ?? organization.repoImportStatus), lastSyncAt: organization.github?.lastSyncAt ?? null, - lastWebhookAt: organization.github?.lastWebhookAt ?? null, - lastWebhookEvent: organization.github?.lastWebhookEvent ?? "", }, })), }; @@ -542,18 +525,6 @@ class MockFoundryAppStore implements MockFoundryAppClient { } } - async setDefaultModel(model: WorkspaceModelId): Promise { - await this.injectAsyncLatency(); - const currentUserId = this.snapshot.auth.currentUserId; - if (!currentUserId) { - throw new Error("No signed-in mock user"); - } - this.updateSnapshot((current) => ({ - ...current, - users: current.users.map((user) => (user.id === currentUserId ? 
{ ...user, defaultModel: model } : user)), - })); - } - async updateOrganizationProfile(input: UpdateMockOrganizationProfileInput): Promise { await this.injectAsyncLatency(); this.requireOrganization(input.organizationId); @@ -595,8 +566,6 @@ class MockFoundryAppStore implements MockFoundryAppClient { syncStatus: "synced", lastSyncLabel: "Synced just now", lastSyncAt: Date.now(), - lastWebhookAt: Date.now(), - lastWebhookEvent: "installation_repositories.added", }, })); this.importTimers.delete(organizationId); @@ -675,8 +644,8 @@ class MockFoundryAppStore implements MockFoundryAppClient { })); } - recordSeatUsage(organizationId: string): void { - const org = this.snapshot.organizations.find((candidate) => candidate.organizationId === organizationId); + recordSeatUsage(workspaceId: string): void { + const org = this.snapshot.organizations.find((candidate) => candidate.workspaceId === workspaceId); const currentUser = currentMockUser(this.snapshot); if (!org || !currentUser) { return; diff --git a/foundry/packages/client/src/mock/backend-client.ts b/foundry/packages/client/src/mock/backend-client.ts index 191f68c..2048a60 100644 --- a/foundry/packages/client/src/mock/backend-client.ts +++ b/foundry/packages/client/src/mock/backend-client.ts @@ -1,4 +1,5 @@ import type { + AddRepoInput, AppEvent, CreateTaskInput, FoundryAppSnapshot, @@ -6,37 +7,37 @@ import type { SessionEvent, TaskRecord, TaskSummary, - TaskWorkspaceChangeModelInput, - TaskWorkspaceCreateTaskInput, - TaskWorkspaceCreateTaskResponse, - TaskWorkspaceDiffInput, - TaskWorkspaceRenameInput, - TaskWorkspaceRenameSessionInput, - TaskWorkspaceSelectInput, - TaskWorkspaceSetSessionUnreadInput, - TaskWorkspaceSendMessageInput, - TaskWorkspaceSnapshot, - TaskWorkspaceSessionInput, - TaskWorkspaceUpdateDraftInput, + TaskWorkbenchChangeModelInput, + TaskWorkbenchCreateTaskInput, + TaskWorkbenchCreateTaskResponse, + TaskWorkbenchDiffInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + 
TaskWorkbenchSelectInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchSnapshot, + TaskWorkbenchTabInput, + TaskWorkbenchUpdateDraftInput, TaskEvent, - WorkspaceSessionDetail, - WorkspaceModelGroup, - WorkspaceTaskDetail, - WorkspaceTaskSummary, - OrganizationEvent, - OrganizationSummarySnapshot, - AuditLogEvent as HistoryEvent, + WorkbenchSessionDetail, + WorkbenchTaskDetail, + WorkbenchTaskSummary, + WorkspaceEvent, + WorkspaceSummarySnapshot, + HistoryEvent, HistoryQueryInput, - SandboxProviderId, + ProviderId, RepoOverview, RepoRecord, + RepoStackActionInput, + RepoStackActionResult, StarSandboxAgentRepoResult, SwitchResult, } from "@sandbox-agent/foundry-shared"; -import { DEFAULT_WORKSPACE_MODEL_GROUPS } from "@sandbox-agent/foundry-shared"; import type { ProcessCreateRequest, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent"; import type { ActorConn, BackendClient, SandboxProcessRecord, SandboxSessionEventRecord, SandboxSessionRecord } from "../backend-client.js"; -import { getSharedMockWorkspaceClient } from "./workspace-client.js"; +import { getSharedMockWorkbenchClient } from "./workbench-client.js"; interface MockProcessRecord extends SandboxProcessRecord { logText: string; @@ -90,8 +91,8 @@ function toTaskStatus(status: TaskRecord["status"], archived: boolean): TaskReco return status; } -export function createMockBackendClient(defaultOrganizationId = "default"): BackendClient { - const workspace = getSharedMockWorkspaceClient(); +export function createMockBackendClient(defaultWorkspaceId = "default"): BackendClient { + const workbench = getSharedMockWorkbenchClient(); const listenersBySandboxId = new Map void>>(); const processesBySandboxId = new Map(); const connectionListeners = new Map void>>(); @@ -99,7 +100,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back let nextProcessId = 1; const requireTask = (taskId: string) => { - const task = 
workspace.getSnapshot().tasks.find((candidate) => candidate.id === taskId); + const task = workbench.getSnapshot().tasks.find((candidate) => candidate.id === taskId); if (!task) { throw new Error(`Unknown mock task ${taskId}`); } @@ -166,7 +167,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back async dispose(): Promise {}, }); - const buildTaskSummary = (task: TaskWorkspaceSnapshot["tasks"][number]): WorkspaceTaskSummary => ({ + const buildTaskSummary = (task: TaskWorkbenchSnapshot["tasks"][number]): WorkbenchTaskSummary => ({ id: task.id, repoId: task.repoId, title: task.title, @@ -175,11 +176,9 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back updatedAtMs: task.updatedAtMs, branch: task.branch, pullRequest: task.pullRequest, - activeSessionId: task.activeSessionId ?? task.sessions[0]?.id ?? null, - sessionsSummary: task.sessions.map((tab) => ({ + sessionsSummary: task.tabs.map((tab) => ({ id: tab.id, sessionId: tab.sessionId, - sandboxSessionId: tab.sandboxSessionId ?? tab.sessionId, sessionName: tab.sessionName, agent: tab.agent, model: tab.model, @@ -188,36 +187,41 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back unread: tab.unread, created: tab.created, })), - primaryUserLogin: null, - primaryUserAvatarUrl: null, }); - const buildTaskDetail = (task: TaskWorkspaceSnapshot["tasks"][number]): WorkspaceTaskDetail => ({ + const buildTaskDetail = (task: TaskWorkbenchSnapshot["tasks"][number]): WorkbenchTaskDetail => ({ ...buildTaskSummary(task), task: task.title, + agentType: task.tabs[0]?.agent === "Codex" ? "codex" : "claude", + runtimeStatus: toTaskStatus(task.status === "archived" ? "archived" : "running", task.status === "archived"), + statusMessage: task.status === "archived" ? "archived" : "mock sandbox ready", + activeSessionId: task.tabs[0]?.sessionId ?? null, + diffStat: task.fileChanges.length > 0 ? 
`+${task.fileChanges.length}/-${task.fileChanges.length}` : "+0/-0", + prUrl: task.pullRequest ? `https://example.test/pr/${task.pullRequest.number}` : null, + reviewStatus: null, fileChanges: task.fileChanges, diffs: task.diffs, fileTree: task.fileTree, minutesUsed: task.minutesUsed, sandboxes: [ { - sandboxProviderId: "local", + providerId: "local", sandboxId: task.id, cwd: mockCwd(task.repoName, task.id), - url: null, }, ], activeSandboxId: task.id, }); - const buildSessionDetail = (task: TaskWorkspaceSnapshot["tasks"][number], sessionId: string): WorkspaceSessionDetail => { - const tab = task.sessions.find((candidate) => candidate.id === sessionId); + const buildSessionDetail = (task: TaskWorkbenchSnapshot["tasks"][number], tabId: string): WorkbenchSessionDetail => { + const tab = task.tabs.find((candidate) => candidate.id === tabId); if (!tab) { - throw new Error(`Unknown mock session ${sessionId} for task ${task.id}`); + throw new Error(`Unknown mock tab ${tabId} for task ${task.id}`); } return { sessionId: tab.id, - sandboxSessionId: tab.sandboxSessionId ?? 
tab.sessionId, + tabId: tab.id, + sandboxSessionId: tab.sessionId, sessionName: tab.sessionName, agent: tab.agent, model: tab.model, @@ -230,25 +234,11 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }; }; - const buildOrganizationSummary = (): OrganizationSummarySnapshot => { - const snapshot = workspace.getSnapshot(); + const buildWorkspaceSummary = (): WorkspaceSummarySnapshot => { + const snapshot = workbench.getSnapshot(); const taskSummaries = snapshot.tasks.map(buildTaskSummary); return { - organizationId: defaultOrganizationId, - github: { - connectedAccount: "mock", - installationStatus: "connected", - syncStatus: "synced", - importedRepoCount: snapshot.repos.length, - lastSyncLabel: "Synced just now", - lastSyncAt: nowMs(), - lastWebhookAt: null, - lastWebhookEvent: "", - syncGeneration: 1, - syncPhase: null, - processedRepositoryCount: snapshot.repos.length, - totalRepositoryCount: snapshot.repos.length, - }, + workspaceId: defaultWorkspaceId, repos: snapshot.repos.map((repo) => { const repoTasks = taskSummaries.filter((task) => task.repoId === repo.id); return { @@ -262,36 +252,39 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }; }; - const organizationScope = (organizationId: string): string => `organization:${organizationId}`; - const taskScope = (organizationId: string, repoId: string, taskId: string): string => `task:${organizationId}:${repoId}:${taskId}`; - const sandboxScope = (organizationId: string, sandboxProviderId: string, sandboxId: string): string => - `sandbox:${organizationId}:${sandboxProviderId}:${sandboxId}`; + const workspaceScope = (workspaceId: string): string => `workspace:${workspaceId}`; + const taskScope = (workspaceId: string, repoId: string, taskId: string): string => `task:${workspaceId}:${repoId}:${taskId}`; + const sandboxScope = (workspaceId: string, providerId: string, sandboxId: string): string => `sandbox:${workspaceId}:${providerId}:${sandboxId}`; 
-	const emitOrganizationSnapshot = (): void => {
-		emitConnectionEvent(organizationScope(defaultOrganizationId), "organizationUpdated", {
-			type: "organizationUpdated",
-			snapshot: buildOrganizationSummary(),
-		} satisfies OrganizationEvent);
+	const emitWorkspaceSnapshot = (): void => {
+		const summary = buildWorkspaceSummary();
+		const latestTask = [...summary.taskSummaries].sort((left, right) => right.updatedAtMs - left.updatedAtMs)[0] ?? null;
+		if (latestTask) {
+			emitConnectionEvent(workspaceScope(defaultWorkspaceId), "workspaceUpdated", {
+				type: "taskSummaryUpdated",
+				taskSummary: latestTask,
+			} satisfies WorkspaceEvent);
+		}
 	};
 
 	const emitTaskUpdate = (taskId: string): void => {
 		const task = requireTask(taskId);
-		emitConnectionEvent(taskScope(defaultOrganizationId, task.repoId, task.id), "taskUpdated", {
-			type: "taskUpdated",
+		emitConnectionEvent(taskScope(defaultWorkspaceId, task.repoId, task.id), "taskUpdated", {
+			type: "taskDetailUpdated",
 			detail: buildTaskDetail(task),
 		} satisfies TaskEvent);
 	};
 
-	const emitSessionUpdate = (taskId: string, sessionId: string): void => {
+	const emitSessionUpdate = (taskId: string, tabId: string): void => {
 		const task = requireTask(taskId);
-		emitConnectionEvent(taskScope(defaultOrganizationId, task.repoId, task.id), "sessionUpdated", {
+		emitConnectionEvent(taskScope(defaultWorkspaceId, task.repoId, task.id), "sessionUpdated", {
 			type: "sessionUpdated",
-			session: buildSessionDetail(task, sessionId),
+			session: buildSessionDetail(task, tabId),
 		} satisfies SessionEvent);
 	};
 
 	const emitSandboxProcessesUpdate = (sandboxId: string): void => {
-		emitConnectionEvent(sandboxScope(defaultOrganizationId, "local", sandboxId), "processesUpdated", {
+		emitConnectionEvent(sandboxScope(defaultWorkspaceId, "local", sandboxId), "processesUpdated", {
 			type: "processesUpdated",
 			processes: ensureProcessList(sandboxId).map((process) => cloneProcess(process)),
 		} satisfies SandboxProcessesEvent);
@@ -302,21 +295,22 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 		const cwd = mockCwd(task.repoName, task.id);
 		const archived = task.status === "archived";
 		return {
-			organizationId: defaultOrganizationId,
+			workspaceId: defaultWorkspaceId,
 			repoId: task.repoId,
 			repoRemote: mockRepoRemote(task.repoName),
 			taskId: task.id,
 			branchName: task.branch,
 			title: task.title,
 			task: task.title,
-			sandboxProviderId: "local",
+			providerId: "local",
 			status: toTaskStatus(archived ? "archived" : "running", archived),
-			pullRequest: null,
+			statusMessage: archived ? "archived" : "mock sandbox ready",
 			activeSandboxId: task.id,
+			activeSessionId: task.tabs[0]?.sessionId ?? null,
 			sandboxes: [
 				{
 					sandboxId: task.id,
-					sandboxProviderId: "local",
+					providerId: "local",
 					sandboxActorId: "mock-sandbox",
 					switchTarget: `mock://${task.id}`,
 					cwd,
@@ -324,6 +318,17 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 					updatedAt: task.updatedAtMs,
 				},
 			],
+			agentType: task.tabs[0]?.agent === "Codex" ? "codex" : "claude",
+			prSubmitted: Boolean(task.pullRequest),
+			diffStat: task.fileChanges.length > 0 ? `+${task.fileChanges.length}/-${task.fileChanges.length}` : "+0/-0",
+			prUrl: task.pullRequest ? `https://example.test/pr/${task.pullRequest.number}` : null,
+			prAuthor: task.pullRequest ? "mock" : null,
+			ciStatus: null,
+			reviewStatus: null,
+			reviewer: null,
+			conflictsWithMain: "0",
+			hasUnpushed: task.fileChanges.length > 0 ? "1" : "0",
+			parentBranch: null,
 			createdAt: task.updatedAtMs,
 			updatedAt: task.updatedAtMs,
 		};
@@ -360,16 +365,16 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return unsupportedAppSnapshot();
 		},
 
-		async connectOrganization(organizationId: string): Promise {
-			return createConn(organizationScope(organizationId));
+		async connectWorkspace(workspaceId: string): Promise {
+			return createConn(workspaceScope(workspaceId));
 		},
 
-		async connectTask(organizationId: string, repoId: string, taskId: string): Promise {
-			return createConn(taskScope(organizationId, repoId, taskId));
+		async connectTask(workspaceId: string, repoId: string, taskId: string): Promise {
+			return createConn(taskScope(workspaceId, repoId, taskId));
 		},
 
-		async connectSandbox(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise {
-			return createConn(sandboxScope(organizationId, sandboxProviderId, sandboxId));
+		async connectSandbox(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise {
+			return createConn(sandboxScope(workspaceId, providerId, sandboxId));
 		},
 
 		subscribeApp(): () => void {
@@ -396,10 +401,6 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return unsupportedAppSnapshot();
 		},
 
-		async setAppDefaultModel(): Promise {
-			return unsupportedAppSnapshot();
-		},
-
 		async updateAppOrganizationProfile(): Promise {
 			return unsupportedAppSnapshot();
 		},
@@ -432,9 +433,13 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return unsupportedAppSnapshot();
 		},
 
-		async listRepos(_organizationId: string): Promise {
-			return workspace.getSnapshot().repos.map((repo) => ({
-				organizationId: defaultOrganizationId,
+		async addRepo(_workspaceId: string, _remoteUrl: string): Promise {
+			notSupported("addRepo");
+		},
+
+		async listRepos(_workspaceId: string): Promise {
+			return workbench.getSnapshot().repos.map((repo) => ({
+				workspaceId: defaultWorkspaceId,
 				repoId: repo.id,
 				remoteUrl: mockRepoRemote(repo.label),
 				createdAt: nowMs(),
@@ -446,26 +451,30 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			notSupported("createTask");
 		},
 
-		async listTasks(_organizationId: string, repoId?: string): Promise {
-			return workspace
+		async listTasks(_workspaceId: string, repoId?: string): Promise {
+			return workbench
 				.getSnapshot()
 				.tasks.filter((task) => !repoId || task.repoId === repoId)
 				.map((task) => ({
-					organizationId: defaultOrganizationId,
+					workspaceId: defaultWorkspaceId,
 					repoId: task.repoId,
 					taskId: task.id,
 					branchName: task.branch,
 					title: task.title,
 					status: task.status === "archived" ? "archived" : "running",
-					pullRequest: null,
 					updatedAt: task.updatedAtMs,
 				}));
 		},
 
-		async getRepoOverview(_organizationId: string, _repoId: string): Promise {
+		async getRepoOverview(_workspaceId: string, _repoId: string): Promise {
 			notSupported("getRepoOverview");
 		},
-		async getTask(_organizationId: string, _repoId: string, taskId: string): Promise {
+
+		async runRepoStackAction(_input: RepoStackActionInput): Promise {
+			notSupported("runRepoStackAction");
+		},
+
+		async getTask(_workspaceId: string, taskId: string): Promise {
 			return buildTaskRecord(taskId);
 		},
@@ -473,23 +482,23 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return [];
 		},
 
-		async switchTask(_organizationId: string, _repoId: string, taskId: string): Promise {
+		async switchTask(_workspaceId: string, taskId: string): Promise {
 			return {
-				organizationId: defaultOrganizationId,
+				workspaceId: defaultWorkspaceId,
 				taskId,
-				sandboxProviderId: "local",
+				providerId: "local",
 				switchTarget: `mock://${taskId}`,
 			};
 		},
 
-		async attachTask(_organizationId: string, _repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> {
+		async attachTask(_workspaceId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> {
 			return {
 				target: `mock://${taskId}`,
-				sessionId: requireTask(taskId).sessions[0]?.sessionId ?? null,
+				sessionId: requireTask(taskId).tabs[0]?.sessionId ?? null,
 			};
 		},
 
-		async runAction(_organizationId: string, _repoId: string, _taskId: string): Promise {
+		async runAction(_workspaceId: string, _taskId: string): Promise {
 			notSupported("runAction");
 		},
@@ -506,8 +515,8 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 		},
 
 		async createSandboxProcess(input: {
-			organizationId: string;
-			sandboxProviderId: SandboxProviderId;
+			workspaceId: string;
+			providerId: ProviderId;
 			sandboxId: string;
 			request: ProcessCreateRequest;
 		}): Promise {
@@ -519,15 +528,15 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return cloneProcess(created);
 		},
 
-		async listSandboxProcesses(_organizationId: string, _providerId: SandboxProviderId, sandboxId: string): Promise<{ processes: SandboxProcessRecord[] }> {
+		async listSandboxProcesses(_workspaceId: string, _providerId: ProviderId, sandboxId: string): Promise<{ processes: SandboxProcessRecord[] }> {
 			return {
 				processes: ensureProcessList(sandboxId).map((process) => cloneProcess(process)),
 			};
 		},
 
 		async getSandboxProcessLogs(
-			_organizationId: string,
-			_providerId: SandboxProviderId,
+			_workspaceId: string,
+			_providerId: ProviderId,
 			sandboxId: string,
 			processId: string,
 			query?: ProcessLogFollowQuery,
@@ -554,8 +563,8 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 		},
 
 		async stopSandboxProcess(
-			_organizationId: string,
-			_providerId: SandboxProviderId,
+			_workspaceId: string,
+			_providerId: ProviderId,
 			sandboxId: string,
 			processId: string,
 			_query?: ProcessSignalQuery,
@@ -573,8 +582,8 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 		},
 
 		async killSandboxProcess(
-			_organizationId: string,
-			_providerId: SandboxProviderId,
+			_workspaceId: string,
+			_providerId: ProviderId,
 			sandboxId: string,
 			processId: string,
 			_query?: ProcessSignalQuery,
@@ -591,7 +600,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			return cloneProcess(process);
 		},
 
-		async deleteSandboxProcess(_organizationId: string, _providerId: SandboxProviderId, sandboxId: string, processId: string): Promise {
+		async deleteSandboxProcess(_workspaceId: string, _providerId: ProviderId, sandboxId: string, processId: string): Promise {
 			processesBySandboxId.set(
 				sandboxId,
 				ensureProcessList(sandboxId).filter((candidate) => candidate.id !== processId),
@@ -599,7 +608,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 			notifySandbox(sandboxId);
 		},
 
-		subscribeSandboxProcesses(_organizationId: string, _providerId: SandboxProviderId, sandboxId: string, listener: () => void): () => void {
+		subscribeSandboxProcesses(_workspaceId: string, _providerId: ProviderId, sandboxId: string, listener: () => void): () => void {
 			let listeners = listenersBySandboxId.get(sandboxId);
 			if (!listeners) {
 				listeners = new Set();
@@ -627,156 +636,139 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back
 		},
 
 		async sandboxProviderState(
-			_organizationId: string,
-			_providerId: SandboxProviderId,
+			_workspaceId: string,
+			_providerId: ProviderId,
 			sandboxId: string,
-		): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }> {
-			return { sandboxProviderId: "local", sandboxId, state: "running", at: nowMs() };
+		): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }> {
+			return { providerId: "local", sandboxId, state: "running", at: nowMs() };
 		},
 
 		async getSandboxAgentConnection(): Promise<{ endpoint: string; token?: string }> {
 			return { endpoint: "mock://terminal-unavailable" };
 		},
 
-		async getSandboxWorkspaceModelGroups(_organizationId: string, _sandboxProviderId: SandboxProviderId, _sandboxId: string): Promise {
-			return DEFAULT_WORKSPACE_MODEL_GROUPS;
+		async getWorkspaceSummary(): Promise {
+			return buildWorkspaceSummary();
 		},
 
-		async getOrganizationSummary(): Promise {
-			return buildOrganizationSummary();
-		},
-
-		async getTaskDetail(_organizationId: string, _repoId: string, taskId: string): Promise {
+		async getTaskDetail(_workspaceId: string, _repoId: string, taskId: string): Promise {
 			return buildTaskDetail(requireTask(taskId));
 		},
 
-		async getSessionDetail(_organizationId: string, _repoId: string, taskId: string, sessionId: string): Promise {
+		async getSessionDetail(_workspaceId: string, _repoId: string, taskId: string, sessionId: string): Promise {
 			return buildSessionDetail(requireTask(taskId), sessionId);
 		},
 
-		async getWorkspace(): Promise {
-			return workspace.getSnapshot();
+		async getWorkbench(): Promise {
+			return workbench.getSnapshot();
 		},
 
-		subscribeWorkspace(_organizationId: string, listener: () => void): () => void {
-			return workspace.subscribe(listener);
+		subscribeWorkbench(_workspaceId: string, listener: () => void): () => void {
+			return workbench.subscribe(listener);
 		},
 
-		async createWorkspaceTask(_organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise {
-			const created = await workspace.createTask(input);
-			emitOrganizationSnapshot();
+		async createWorkbenchTask(_workspaceId: string, input: TaskWorkbenchCreateTaskInput): Promise {
+			const created = await workbench.createTask(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(created.taskId);
-			if (created.sessionId) {
-				emitSessionUpdate(created.taskId, created.sessionId);
+			if (created.tabId) {
+				emitSessionUpdate(created.taskId, created.tabId);
 			}
 			return created;
 		},
 
-		async markWorkspaceUnread(_organizationId: string, input: TaskWorkspaceSelectInput): Promise {
-			await workspace.markTaskUnread(input);
-			emitOrganizationSnapshot();
+		async markWorkbenchUnread(_workspaceId: string, input: TaskWorkbenchSelectInput): Promise {
+			await workbench.markTaskUnread(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
 		},
 
-		async renameWorkspaceTask(_organizationId: string, input: TaskWorkspaceRenameInput): Promise {
-			await workspace.renameTask(input);
-			emitOrganizationSnapshot();
+		async renameWorkbenchTask(_workspaceId: string, input: TaskWorkbenchRenameInput): Promise {
+			await workbench.renameTask(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
 		},
 
-		async createWorkspaceSession(_organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> {
-			const created = await workspace.addSession(input);
-			emitOrganizationSnapshot();
+		async renameWorkbenchBranch(_workspaceId: string, input: TaskWorkbenchRenameInput): Promise {
+			await workbench.renameBranch(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, created.sessionId);
+		},
+
+		async createWorkbenchSession(_workspaceId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ tabId: string }> {
+			const created = await workbench.addTab(input);
+			emitWorkspaceSnapshot();
+			emitTaskUpdate(input.taskId);
+			emitSessionUpdate(input.taskId, created.tabId);
 			return created;
 		},
 
-		async renameWorkspaceSession(_organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise {
-			await workspace.renameSession(input);
-			emitOrganizationSnapshot();
+		async renameWorkbenchSession(_workspaceId: string, input: TaskWorkbenchRenameSessionInput): Promise {
+			await workbench.renameSession(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async selectWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise {
-			await workspace.selectSession(input);
-			emitOrganizationSnapshot();
+		async setWorkbenchSessionUnread(_workspaceId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise {
+			await workbench.setSessionUnread(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async setWorkspaceSessionUnread(_organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise {
-			await workspace.setSessionUnread(input);
-			emitOrganizationSnapshot();
+		async updateWorkbenchDraft(_workspaceId: string, input: TaskWorkbenchUpdateDraftInput): Promise {
+			await workbench.updateDraft(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async updateWorkspaceDraft(_organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise {
-			await workspace.updateDraft(input);
-			emitOrganizationSnapshot();
+		async changeWorkbenchModel(_workspaceId: string, input: TaskWorkbenchChangeModelInput): Promise {
+			await workbench.changeModel(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async changeWorkspaceModel(_organizationId: string, input: TaskWorkspaceChangeModelInput): Promise {
-			await workspace.changeModel(input);
-			emitOrganizationSnapshot();
+		async sendWorkbenchMessage(_workspaceId: string, input: TaskWorkbenchSendMessageInput): Promise {
+			await workbench.sendMessage(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async sendWorkspaceMessage(_organizationId: string, input: TaskWorkspaceSendMessageInput): Promise {
-			await workspace.sendMessage(input);
-			emitOrganizationSnapshot();
+		async stopWorkbenchSession(_workspaceId: string, input: TaskWorkbenchTabInput): Promise {
+			await workbench.stopAgent(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
+			emitSessionUpdate(input.taskId, input.tabId);
 		},
 
-		async stopWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise {
-			await workspace.stopAgent(input);
-			emitOrganizationSnapshot();
-			emitTaskUpdate(input.taskId);
-			emitSessionUpdate(input.taskId, input.sessionId);
-		},
-
-		async closeWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise {
-			await workspace.closeSession(input);
-			emitOrganizationSnapshot();
+		async closeWorkbenchSession(_workspaceId: string, input: TaskWorkbenchTabInput): Promise {
+			await workbench.closeTab(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
 		},
 
-		async publishWorkspacePr(_organizationId: string, input: TaskWorkspaceSelectInput): Promise {
-			await workspace.publishPr(input);
-			emitOrganizationSnapshot();
+		async publishWorkbenchPr(_workspaceId: string, input: TaskWorkbenchSelectInput): Promise {
+			await workbench.publishPr(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
 		},
 
-		async changeWorkspaceTaskOwner(
-			_organizationId: string,
-			input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string },
-		): Promise {
-			await workspace.changeOwner(input);
-			emitOrganizationSnapshot();
+		async revertWorkbenchFile(_workspaceId: string, input: TaskWorkbenchDiffInput): Promise {
+			await workbench.revertFile(input);
+			emitWorkspaceSnapshot();
 			emitTaskUpdate(input.taskId);
 		},
 
-		async revertWorkspaceFile(_organizationId: string, input: TaskWorkspaceDiffInput): Promise {
-			await workspace.revertFile(input);
-			emitOrganizationSnapshot();
-			emitTaskUpdate(input.taskId);
-		},
-
-		async adminReloadGithubOrganization(): Promise {},
-		async adminReloadGithubRepository(): Promise {},
-
 		async health(): Promise<{ ok: true }> {
 			return { ok: true };
 		},
 
-		async useOrganization(organizationId: string): Promise<{ organizationId: string }> {
-			return { organizationId };
+		async useWorkspace(workspaceId: string): Promise<{ workspaceId: string }> {
+			return { workspaceId };
 		},
 
 		async starSandboxAgentRepo(): Promise {
diff --git a/foundry/packages/client/src/mock/workspace-client.ts b/foundry/packages/client/src/mock/workbench-client.ts
similarity index 53%
rename from foundry/packages/client/src/mock/workspace-client.ts
rename to foundry/packages/client/src/mock/workbench-client.ts
index 7983e0f..f27c436 100644
--- a/foundry/packages/client/src/mock/workspace-client.ts
+++ b/foundry/packages/client/src/mock/workbench-client.ts
@@ -1,34 +1,33 @@
 import {
 	MODEL_GROUPS,
 	buildInitialMockLayoutViewModel,
-	groupWorkspaceRepositories,
+	groupWorkbenchProjects,
 	nowMs,
 	providerAgent,
 	randomReply,
 	removeFileTreePath,
 	slugify,
 	uid,
-} from "../workspace-model.js";
-import { DEFAULT_WORKSPACE_MODEL_ID, workspaceAgentForModel } from "@sandbox-agent/foundry-shared";
+} from "../workbench-model.js";
 import type {
-	TaskWorkspaceAddSessionResponse,
-	TaskWorkspaceChangeModelInput,
-	TaskWorkspaceCreateTaskInput,
-	TaskWorkspaceCreateTaskResponse,
-	TaskWorkspaceDiffInput,
-	TaskWorkspaceRenameInput,
-	TaskWorkspaceRenameSessionInput,
-	TaskWorkspaceSelectInput,
-	TaskWorkspaceSetSessionUnreadInput,
-	TaskWorkspaceSendMessageInput,
-	TaskWorkspaceSnapshot,
-	TaskWorkspaceSessionInput,
-	TaskWorkspaceUpdateDraftInput,
-	WorkspaceSession as AgentSession,
-	WorkspaceTask as Task,
-	WorkspaceTranscriptEvent as TranscriptEvent,
+	TaskWorkbenchAddTabResponse,
+	TaskWorkbenchChangeModelInput,
+	TaskWorkbenchCreateTaskInput,
+	TaskWorkbenchCreateTaskResponse,
+	TaskWorkbenchDiffInput,
+	TaskWorkbenchRenameInput,
+	TaskWorkbenchRenameSessionInput,
+	TaskWorkbenchSelectInput,
+	TaskWorkbenchSetSessionUnreadInput,
+	TaskWorkbenchSendMessageInput,
+	TaskWorkbenchSnapshot,
+	TaskWorkbenchTabInput,
+	TaskWorkbenchUpdateDraftInput,
+	WorkbenchAgentTab as AgentTab,
+	WorkbenchTask as Task,
+	WorkbenchTranscriptEvent as TranscriptEvent,
 } from "@sandbox-agent/foundry-shared";
-import type { TaskWorkspaceClient } from "../workspace-client.js";
+import type { TaskWorkbenchClient } from "../workbench-client.js";
 
 function buildTranscriptEvent(params: {
 	sessionId: string;
@@ -48,12 +47,12 @@ function buildTranscriptEvent(params: {
 	};
 }
 
-class MockWorkspaceStore implements TaskWorkspaceClient {
+class MockWorkbenchStore implements TaskWorkbenchClient {
 	private snapshot = buildInitialMockLayoutViewModel();
 	private listeners = new Set<() => void>();
 	private pendingTimers = new Map>();
 
-	getSnapshot(): TaskWorkspaceSnapshot {
+	getSnapshot(): TaskWorkbenchSnapshot {
 		return this.snapshot;
 	}
@@ -64,9 +63,9 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		};
 	}
 
-	async createTask(input: TaskWorkspaceCreateTaskInput): Promise {
+	async createTask(input: TaskWorkbenchCreateTaskInput): Promise {
 		const id = uid();
-		const sessionId = `session-${id}`;
+		const tabId = `session-${id}`;
 		const repo = this.snapshot.repos.find((candidate) => candidate.id === input.repoId);
 		if (!repo) {
 			throw new Error(`Cannot create mock task for unknown repo ${input.repoId}`);
 		}
@@ -75,19 +74,20 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 			id,
 			repoId: repo.id,
 			title: input.title?.trim() || "New Task",
-			status: "init_enqueue_provision",
+			status: "new",
 			repoName: repo.label,
 			updatedAtMs: nowMs(),
 			branch: input.branch?.trim() || null,
 			pullRequest: null,
-			activeSessionId: sessionId,
-			sessions: [
+			tabs: [
 				{
-					id: sessionId,
-					sessionId: sessionId,
+					id: tabId,
+					sessionId: tabId,
 					sessionName: "Session 1",
-					agent: workspaceAgentForModel(input.model ?? DEFAULT_WORKSPACE_MODEL_ID, MODEL_GROUPS),
-					model: input.model ?? DEFAULT_WORKSPACE_MODEL_ID,
+					agent: providerAgent(
+						MODEL_GROUPS.find((group) => group.models.some((model) => model.id === (input.model ?? "claude-sonnet-4")))?.provider ?? "Claude",
+					),
+					model: input.model ?? "claude-sonnet-4",
 					status: "idle",
 					thinkingSinceMs: null,
 					unread: false,
@@ -106,24 +106,24 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 			...current,
 			tasks: [nextTask, ...current.tasks],
 		}));
-		return { taskId: id, sessionId };
+		return { taskId: id, tabId };
 	}
 
-	async markTaskUnread(input: TaskWorkspaceSelectInput): Promise {
+	async markTaskUnread(input: TaskWorkbenchSelectInput): Promise {
 		this.updateTask(input.taskId, (task) => {
-			const targetSession = task.sessions[task.sessions.length - 1] ?? null;
-			if (!targetSession) {
+			const targetTab = task.tabs[task.tabs.length - 1] ?? null;
+			if (!targetTab) {
 				return task;
 			}
 			return {
 				...task,
-				sessions: task.sessions.map((session) => (session.id === targetSession.id ? { ...session, unread: true } : session)),
+				tabs: task.tabs.map((tab) => (tab.id === targetTab.id ? { ...tab, unread: true } : tab)),
 			};
 		});
 	}
 
-	async renameTask(input: TaskWorkspaceRenameInput): Promise {
+	async renameTask(input: TaskWorkbenchRenameInput): Promise {
 		const value = input.value.trim();
 		if (!value) {
 			throw new Error(`Cannot rename task ${input.taskId} to an empty title`);
 		}
@@ -131,32 +131,28 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		this.updateTask(input.taskId, (task) => ({ ...task, title: value, updatedAtMs: nowMs() }));
 	}
 
-	async archiveTask(input: TaskWorkspaceSelectInput): Promise {
+	async renameBranch(input: TaskWorkbenchRenameInput): Promise {
+		const value = input.value.trim();
+		if (!value) {
+			throw new Error(`Cannot rename branch for task ${input.taskId} to an empty value`);
+		}
+		this.updateTask(input.taskId, (task) => ({ ...task, branch: value, updatedAtMs: nowMs() }));
+	}
+
+	async archiveTask(input: TaskWorkbenchSelectInput): Promise {
 		this.updateTask(input.taskId, (task) => ({ ...task, status: "archived", updatedAtMs: nowMs() }));
 	}
 
-	async publishPr(input: TaskWorkspaceSelectInput): Promise {
+	async publishPr(input: TaskWorkbenchSelectInput): Promise {
 		const nextPrNumber = Math.max(0, ...this.snapshot.tasks.map((task) => task.pullRequest?.number ?? 0)) + 1;
 		this.updateTask(input.taskId, (task) => ({
 			...task,
 			updatedAtMs: nowMs(),
-			pullRequest: {
-				number: nextPrNumber,
-				status: "ready",
-				title: task.title,
-				state: "open",
-				url: `https://example.test/pr/${nextPrNumber}`,
-				headRefName: task.branch ?? `task/${task.id}`,
-				baseRefName: "main",
-				repoFullName: task.repoName,
-				authorLogin: "mock",
-				isDraft: false,
-				updatedAtMs: nowMs(),
-			},
+			pullRequest: { number: nextPrNumber, status: "ready" },
 		}));
 	}
 
-	async revertFile(input: TaskWorkspaceDiffInput): Promise {
+	async revertFile(input: TaskWorkbenchDiffInput): Promise {
 		this.updateTask(input.taskId, (task) => {
 			const file = task.fileChanges.find((entry) => entry.path === input.path);
 			const nextDiffs = { ...task.diffs };
@@ -171,13 +167,13 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		});
 	}
 
-	async updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise {
-		this.assertSession(input.taskId, input.sessionId);
+	async updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise {
+		this.assertTab(input.taskId, input.tabId);
 		this.updateTask(input.taskId, (task) => ({
 			...task,
 			updatedAtMs: nowMs(),
-			sessions: task.sessions.map((tab) =>
-				tab.id === input.sessionId
+			tabs: task.tabs.map((tab) =>
+				tab.id === input.tabId
 					? {
 							...tab,
 							draft: {
@@ -191,25 +187,25 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		}));
 	}
 
-	async sendMessage(input: TaskWorkspaceSendMessageInput): Promise {
+	async sendMessage(input: TaskWorkbenchSendMessageInput): Promise {
 		const text = input.text.trim();
 		if (!text) {
 			throw new Error(`Cannot send an empty mock prompt for task ${input.taskId}`);
 		}
-		this.assertSession(input.taskId, input.sessionId);
+		this.assertTab(input.taskId, input.tabId);
 		const startedAtMs = nowMs();
 		this.updateTask(input.taskId, (currentTask) => {
-			const isFirstOnTask = String(currentTask.status).startsWith("init_");
+			const isFirstOnTask = currentTask.status === "new";
 			const newTitle = isFirstOnTask ? (text.length > 50 ? `${text.slice(0, 47)}...` : text) : currentTask.title;
 			const newBranch = isFirstOnTask ? `feat/${slugify(newTitle)}` : currentTask.branch;
 			const userMessageLines = [text, ...input.attachments.map((attachment) => `@ ${attachment.filePath}:${attachment.lineNumber}`)];
 			const userEvent = buildTranscriptEvent({
-				sessionId: input.sessionId,
+				sessionId: input.tabId,
 				sender: "client",
 				createdAt: startedAtMs,
-				eventIndex: candidateEventIndex(currentTask, input.sessionId),
+				eventIndex: candidateEventIndex(currentTask, input.tabId),
 				payload: {
 					method: "session/prompt",
 					params: {
@@ -224,8 +220,8 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 				branch: newBranch,
 				status: "running",
 				updatedAtMs: startedAtMs,
-				sessions: currentTask.sessions.map((candidate) =>
-					candidate.id === input.sessionId
+				tabs: currentTask.tabs.map((candidate) =>
+					candidate.id === input.tabId
 						? {
 								...candidate,
 								created: true,
@@ -240,20 +236,20 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 			};
 		});
 
-		const existingTimer = this.pendingTimers.get(input.sessionId);
+		const existingTimer = this.pendingTimers.get(input.tabId);
 		if (existingTimer) {
 			clearTimeout(existingTimer);
 		}
 		const timer = setTimeout(() => {
 			const task = this.requireTask(input.taskId);
-			this.requireSession(task, input.sessionId);
+			const replyTab = this.requireTab(task, input.tabId);
 			const completedAtMs = nowMs();
 			const replyEvent = buildTranscriptEvent({
-				sessionId: input.sessionId,
+				sessionId: input.tabId,
 				sender: "agent",
 				createdAt: completedAtMs,
-				eventIndex: candidateEventIndex(task, input.sessionId),
+				eventIndex: candidateEventIndex(task, input.tabId),
 				payload: {
 					result: {
 						text: randomReply(),
@@ -263,8 +259,8 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 			});
 
 			this.updateTask(input.taskId, (currentTask) => {
-				const updatedTabs = currentTask.sessions.map((candidate) => {
-					if (candidate.id !== input.sessionId) {
+				const updatedTabs = currentTask.tabs.map((candidate) => {
+					if (candidate.id !== input.tabId) {
 						return candidate;
 					}
@@ -281,93 +277,79 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 				return {
 					...currentTask,
 					updatedAtMs: completedAtMs,
-					sessions: updatedTabs,
+					tabs: updatedTabs,
 					status: currentTask.status === "archived" ? "archived" : anyRunning ? "running" : "idle",
 				};
 			});
-			this.pendingTimers.delete(input.sessionId);
+			this.pendingTimers.delete(input.tabId);
 		}, 2_500);
-		this.pendingTimers.set(input.sessionId, timer);
+		this.pendingTimers.set(input.tabId, timer);
 	}
 
-	async stopAgent(input: TaskWorkspaceSessionInput): Promise {
-		this.assertSession(input.taskId, input.sessionId);
-		const existing = this.pendingTimers.get(input.sessionId);
+	async stopAgent(input: TaskWorkbenchTabInput): Promise {
+		this.assertTab(input.taskId, input.tabId);
+		const existing = this.pendingTimers.get(input.tabId);
 		if (existing) {
 			clearTimeout(existing);
-			this.pendingTimers.delete(input.sessionId);
+			this.pendingTimers.delete(input.tabId);
 		}
 		this.updateTask(input.taskId, (currentTask) => {
-			const updatedTabs = currentTask.sessions.map((candidate) =>
-				candidate.id === input.sessionId ? { ...candidate, status: "idle" as const, thinkingSinceMs: null } : candidate,
+			const updatedTabs = currentTask.tabs.map((candidate) =>
+				candidate.id === input.tabId ? { ...candidate, status: "idle" as const, thinkingSinceMs: null } : candidate,
 			);
 			const anyRunning = updatedTabs.some((candidate) => candidate.status === "running");
 			return {
 				...currentTask,
 				updatedAtMs: nowMs(),
-				sessions: updatedTabs,
+				tabs: updatedTabs,
 				status: currentTask.status === "archived" ? "archived" : anyRunning ? "running" : "idle",
 			};
 		});
 	}
 
-	async selectSession(input: TaskWorkspaceSessionInput): Promise {
-		this.assertSession(input.taskId, input.sessionId);
+	async setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise {
 		this.updateTask(input.taskId, (currentTask) => ({
 			...currentTask,
-			activeSessionId: input.sessionId,
+			tabs: currentTask.tabs.map((candidate) => (candidate.id === input.tabId ? { ...candidate, unread: input.unread } : candidate)),
 		}));
 	}
 
-	async setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise {
-		this.updateTask(input.taskId, (currentTask) => ({
-			...currentTask,
-			sessions: currentTask.sessions.map((candidate) => (candidate.id === input.sessionId ? { ...candidate, unread: input.unread } : candidate)),
-		}));
-	}
-
-	async renameSession(input: TaskWorkspaceRenameSessionInput): Promise {
+	async renameSession(input: TaskWorkbenchRenameSessionInput): Promise {
 		const title = input.title.trim();
 		if (!title) {
-			throw new Error(`Cannot rename session ${input.sessionId} to an empty title`);
+			throw new Error(`Cannot rename session ${input.tabId} to an empty title`);
 		}
 		this.updateTask(input.taskId, (currentTask) => ({
 			...currentTask,
-			sessions: currentTask.sessions.map((candidate) => (candidate.id === input.sessionId ? { ...candidate, sessionName: title } : candidate)),
+			tabs: currentTask.tabs.map((candidate) => (candidate.id === input.tabId ? { ...candidate, sessionName: title } : candidate)),
 		}));
 	}
 
-	async closeSession(input: TaskWorkspaceSessionInput): Promise {
+	async closeTab(input: TaskWorkbenchTabInput): Promise {
 		this.updateTask(input.taskId, (currentTask) => {
-			if (currentTask.sessions.length <= 1) {
+			if (currentTask.tabs.length <= 1) {
 				return currentTask;
 			}
 			return {
 				...currentTask,
-				activeSessionId:
-					currentTask.activeSessionId === input.sessionId
-						? (currentTask.sessions.find((candidate) => candidate.id !== input.sessionId)?.id ?? null)
-						: currentTask.activeSessionId,
-				sessions: currentTask.sessions.filter((candidate) => candidate.id !== input.sessionId),
+				tabs: currentTask.tabs.filter((candidate) => candidate.id !== input.tabId),
 			};
 		});
 	}
 
-	async addSession(input: TaskWorkspaceSelectInput): Promise {
+	async addTab(input: TaskWorkbenchSelectInput): Promise {
 		this.assertTask(input.taskId);
-		const nextSessionId = uid();
-		const nextSession: AgentSession = {
-			id: nextSessionId,
-			sessionId: nextSessionId,
-			sandboxSessionId: null,
-			sessionName: `Session ${this.requireTask(input.taskId).sessions.length + 1}`,
-			agent: workspaceAgentForModel(DEFAULT_WORKSPACE_MODEL_ID, MODEL_GROUPS),
-			model: DEFAULT_WORKSPACE_MODEL_ID,
+		const nextTab: AgentTab = {
+			id: uid(),
+			sessionId: null,
+			sessionName: `Session ${this.requireTask(input.taskId).tabs.length + 1}`,
+			agent: "Claude",
+			model: "claude-sonnet-4",
 			status: "idle",
 			thinkingSinceMs: null,
 			unread: false,
@@ -379,13 +361,12 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		this.updateTask(input.taskId, (currentTask) => ({
 			...currentTask,
 			updatedAtMs: nowMs(),
-			activeSessionId: nextSession.id,
-			sessions: [...currentTask.sessions, nextSession],
+			tabs: [...currentTask.tabs, nextTab],
 		}));
-		return { sessionId: nextSession.id };
+		return { tabId: nextTab.id };
 	}
 
-	async changeModel(input: TaskWorkspaceChangeModelInput): Promise {
+	async changeModel(input: TaskWorkbenchChangeModelInput): Promise {
 		const group = MODEL_GROUPS.find((candidate) => candidate.models.some((entry) => entry.id === input.model));
 		if (!group) {
 			throw new Error(`Unable to resolve model provider for ${input.model}`);
 		}
@@ -393,25 +374,17 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		this.updateTask(input.taskId, (currentTask) => ({
 			...currentTask,
-			sessions: currentTask.sessions.map((candidate) =>
-				candidate.id === input.sessionId ? { ...candidate, model: input.model, agent: workspaceAgentForModel(input.model, MODEL_GROUPS) } : candidate,
+			tabs: currentTask.tabs.map((candidate) =>
+				candidate.id === input.tabId ? { ...candidate, model: input.model, agent: providerAgent(group.provider) } : candidate,
 			),
 		}));
 	}
 
-	async changeOwner(input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string }): Promise {
-		this.updateTask(input.taskId, (currentTask) => ({
-			...currentTask,
-			primaryUserLogin: input.targetUserName,
-			primaryUserAvatarUrl: null,
-		}));
-	}
-
-	private updateState(updater: (current: TaskWorkspaceSnapshot) => TaskWorkspaceSnapshot): void {
+	private updateState(updater: (current: TaskWorkbenchSnapshot) => TaskWorkbenchSnapshot): void {
 		const nextSnapshot = updater(this.snapshot);
 		this.snapshot = {
 			...nextSnapshot,
-			repositories: groupWorkspaceRepositories(nextSnapshot.repos, nextSnapshot.tasks),
+			projects: groupWorkbenchProjects(nextSnapshot.repos, nextSnapshot.tasks),
 		};
 		this.notify();
 	}
@@ -434,9 +407,9 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		this.requireTask(taskId);
 	}
 
-	private assertSession(taskId: string, sessionId: string): void {
+	private assertTab(taskId: string, tabId: string): void {
 		const task = this.requireTask(taskId);
-		this.requireSession(task, sessionId);
+		this.requireTab(task, tabId);
 	}
 
 	private requireTask(taskId: string): Task {
@@ -447,25 +420,25 @@ class MockWorkspaceStore implements TaskWorkspaceClient {
 		return task;
 	}
 
-	private requireSession(task: Task, sessionId: string): AgentSession {
-		const session = task.sessions.find((candidate) => candidate.id === sessionId);
-		if (!session) {
-			throw new Error(`Unable to find mock session ${sessionId} in task ${task.id}`);
+	private requireTab(task: Task, tabId: string): AgentTab {
+		const tab = task.tabs.find((candidate) => candidate.id === tabId);
+		if (!tab) {
+			throw new Error(`Unable to find mock tab ${tabId} in task ${task.id}`);
 		}
-
return session; + return tab; } } -function candidateEventIndex(task: Task, sessionId: string): number { - const session = task.sessions.find((candidate) => candidate.id === sessionId); - return (session?.transcript.length ?? 0) + 1; +function candidateEventIndex(task: Task, tabId: string): number { + const tab = task.tabs.find((candidate) => candidate.id === tabId); + return (tab?.transcript.length ?? 0) + 1; } -let sharedMockWorkspaceClient: TaskWorkspaceClient | null = null; +let sharedMockWorkbenchClient: TaskWorkbenchClient | null = null; -export function getSharedMockWorkspaceClient(): TaskWorkspaceClient { - if (!sharedMockWorkspaceClient) { - sharedMockWorkspaceClient = new MockWorkspaceStore(); +export function getSharedMockWorkbenchClient(): TaskWorkbenchClient { + if (!sharedMockWorkbenchClient) { + sharedMockWorkbenchClient = new MockWorkbenchStore(); } - return sharedMockWorkspaceClient; + return sharedMockWorkbenchClient; } diff --git a/foundry/packages/client/src/remote/app-client.ts b/foundry/packages/client/src/remote/app-client.ts index f1cb908..9b80f3c 100644 --- a/foundry/packages/client/src/remote/app-client.ts +++ b/foundry/packages/client/src/remote/app-client.ts @@ -1,4 +1,4 @@ -import type { FoundryAppSnapshot, FoundryBillingPlanId, UpdateFoundryOrganizationProfileInput, WorkspaceModelId } from "@sandbox-agent/foundry-shared"; +import type { FoundryAppSnapshot, FoundryBillingPlanId, UpdateFoundryOrganizationProfileInput } from "@sandbox-agent/foundry-shared"; import type { BackendClient } from "../backend-client.js"; import type { FoundryAppClient } from "../app-client.js"; @@ -72,11 +72,6 @@ class RemoteFoundryAppStore implements FoundryAppClient { this.notify(); } - async setDefaultModel(model: WorkspaceModelId): Promise { - this.snapshot = await this.backend.setAppDefaultModel(model); - this.notify(); - } - async updateOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise { this.snapshot = await 
this.backend.updateAppOrganizationProfile(input); this.notify(); @@ -109,8 +104,8 @@ class RemoteFoundryAppStore implements FoundryAppClient { await this.backend.reconnectAppGithub(organizationId); } - async recordSeatUsage(organizationId: string): Promise { - this.snapshot = await this.backend.recordAppSeatUsage(organizationId); + async recordSeatUsage(workspaceId: string): Promise { + this.snapshot = await this.backend.recordAppSeatUsage(workspaceId); this.notify(); } diff --git a/foundry/packages/client/src/remote/workbench-client.ts b/foundry/packages/client/src/remote/workbench-client.ts new file mode 100644 index 0000000..4b25193 --- /dev/null +++ b/foundry/packages/client/src/remote/workbench-client.ts @@ -0,0 +1,197 @@ +import type { + TaskWorkbenchAddTabResponse, + TaskWorkbenchChangeModelInput, + TaskWorkbenchCreateTaskInput, + TaskWorkbenchCreateTaskResponse, + TaskWorkbenchDiffInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + TaskWorkbenchSelectInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchSnapshot, + TaskWorkbenchTabInput, + TaskWorkbenchUpdateDraftInput, +} from "@sandbox-agent/foundry-shared"; +import type { BackendClient } from "../backend-client.js"; +import { groupWorkbenchProjects } from "../workbench-model.js"; +import type { TaskWorkbenchClient } from "../workbench-client.js"; + +export interface RemoteWorkbenchClientOptions { + backend: BackendClient; + workspaceId: string; +} + +class RemoteWorkbenchStore implements TaskWorkbenchClient { + private readonly backend: BackendClient; + private readonly workspaceId: string; + private snapshot: TaskWorkbenchSnapshot; + private readonly listeners = new Set<() => void>(); + private unsubscribeWorkbench: (() => void) | null = null; + private refreshPromise: Promise | null = null; + private refreshRetryTimeout: ReturnType | null = null; + + constructor(options: RemoteWorkbenchClientOptions) { + this.backend = options.backend; + 
this.workspaceId = options.workspaceId; + this.snapshot = { + workspaceId: options.workspaceId, + repos: [], + projects: [], + tasks: [], + }; + } + + getSnapshot(): TaskWorkbenchSnapshot { + return this.snapshot; + } + + subscribe(listener: () => void): () => void { + this.listeners.add(listener); + this.ensureStarted(); + return () => { + this.listeners.delete(listener); + if (this.listeners.size === 0 && this.refreshRetryTimeout) { + clearTimeout(this.refreshRetryTimeout); + this.refreshRetryTimeout = null; + } + if (this.listeners.size === 0 && this.unsubscribeWorkbench) { + this.unsubscribeWorkbench(); + this.unsubscribeWorkbench = null; + } + }; + } + + async createTask(input: TaskWorkbenchCreateTaskInput): Promise { + const created = await this.backend.createWorkbenchTask(this.workspaceId, input); + await this.refresh(); + return created; + } + + async markTaskUnread(input: TaskWorkbenchSelectInput): Promise { + await this.backend.markWorkbenchUnread(this.workspaceId, input); + await this.refresh(); + } + + async renameTask(input: TaskWorkbenchRenameInput): Promise { + await this.backend.renameWorkbenchTask(this.workspaceId, input); + await this.refresh(); + } + + async renameBranch(input: TaskWorkbenchRenameInput): Promise { + await this.backend.renameWorkbenchBranch(this.workspaceId, input); + await this.refresh(); + } + + async archiveTask(input: TaskWorkbenchSelectInput): Promise { + await this.backend.runAction(this.workspaceId, input.taskId, "archive"); + await this.refresh(); + } + + async publishPr(input: TaskWorkbenchSelectInput): Promise { + await this.backend.publishWorkbenchPr(this.workspaceId, input); + await this.refresh(); + } + + async revertFile(input: TaskWorkbenchDiffInput): Promise { + await this.backend.revertWorkbenchFile(this.workspaceId, input); + await this.refresh(); + } + + async updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise { + await this.backend.updateWorkbenchDraft(this.workspaceId, input); + await this.refresh(); + 
} + + async sendMessage(input: TaskWorkbenchSendMessageInput): Promise { + await this.backend.sendWorkbenchMessage(this.workspaceId, input); + await this.refresh(); + } + + async stopAgent(input: TaskWorkbenchTabInput): Promise { + await this.backend.stopWorkbenchSession(this.workspaceId, input); + await this.refresh(); + } + + async setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise { + await this.backend.setWorkbenchSessionUnread(this.workspaceId, input); + await this.refresh(); + } + + async renameSession(input: TaskWorkbenchRenameSessionInput): Promise { + await this.backend.renameWorkbenchSession(this.workspaceId, input); + await this.refresh(); + } + + async closeTab(input: TaskWorkbenchTabInput): Promise { + await this.backend.closeWorkbenchSession(this.workspaceId, input); + await this.refresh(); + } + + async addTab(input: TaskWorkbenchSelectInput): Promise { + const created = await this.backend.createWorkbenchSession(this.workspaceId, input); + await this.refresh(); + return created; + } + + async changeModel(input: TaskWorkbenchChangeModelInput): Promise { + await this.backend.changeWorkbenchModel(this.workspaceId, input); + await this.refresh(); + } + + private ensureStarted(): void { + if (!this.unsubscribeWorkbench) { + this.unsubscribeWorkbench = this.backend.subscribeWorkbench(this.workspaceId, () => { + void this.refresh().catch(() => { + this.scheduleRefreshRetry(); + }); + }); + } + void this.refresh().catch(() => { + this.scheduleRefreshRetry(); + }); + } + + private scheduleRefreshRetry(): void { + if (this.refreshRetryTimeout || this.listeners.size === 0) { + return; + } + + this.refreshRetryTimeout = setTimeout(() => { + this.refreshRetryTimeout = null; + void this.refresh().catch(() => { + this.scheduleRefreshRetry(); + }); + }, 1_000); + } + + private async refresh(): Promise { + if (this.refreshPromise) { + await this.refreshPromise; + return; + } + + this.refreshPromise = (async () => { + const nextSnapshot = await 
this.backend.getWorkbench(this.workspaceId); + if (this.refreshRetryTimeout) { + clearTimeout(this.refreshRetryTimeout); + this.refreshRetryTimeout = null; + } + this.snapshot = { + ...nextSnapshot, + projects: nextSnapshot.projects ?? groupWorkbenchProjects(nextSnapshot.repos, nextSnapshot.tasks), + }; + for (const listener of [...this.listeners]) { + listener(); + } + })().finally(() => { + this.refreshPromise = null; + }); + + await this.refreshPromise; + } +} + +export function createRemoteWorkbenchClient(options: RemoteWorkbenchClientOptions): TaskWorkbenchClient { + return new RemoteWorkbenchStore(options); +} diff --git a/foundry/packages/client/src/remote/workspace-client.ts b/foundry/packages/client/src/remote/workspace-client.ts deleted file mode 100644 index 2a11f51..0000000 --- a/foundry/packages/client/src/remote/workspace-client.ts +++ /dev/null @@ -1,204 +0,0 @@ -import type { - TaskWorkspaceAddSessionResponse, - TaskWorkspaceChangeModelInput, - TaskWorkspaceChangeOwnerInput, - TaskWorkspaceCreateTaskInput, - TaskWorkspaceCreateTaskResponse, - TaskWorkspaceDiffInput, - TaskWorkspaceRenameInput, - TaskWorkspaceRenameSessionInput, - TaskWorkspaceSelectInput, - TaskWorkspaceSetSessionUnreadInput, - TaskWorkspaceSendMessageInput, - TaskWorkspaceSnapshot, - TaskWorkspaceSessionInput, - TaskWorkspaceUpdateDraftInput, -} from "@sandbox-agent/foundry-shared"; -import type { BackendClient } from "../backend-client.js"; -import { groupWorkspaceRepositories } from "../workspace-model.js"; -import type { TaskWorkspaceClient } from "../workspace-client.js"; - -export interface RemoteWorkspaceClientOptions { - backend: BackendClient; - organizationId: string; -} - -class RemoteWorkspaceStore implements TaskWorkspaceClient { - private readonly backend: BackendClient; - private readonly organizationId: string; - private snapshot: TaskWorkspaceSnapshot; - private readonly listeners = new Set<() => void>(); - private unsubscribeWorkspace: (() => void) | null = null; - 
private refreshPromise: Promise | null = null; - private refreshRetryTimeout: ReturnType | null = null; - - constructor(options: RemoteWorkspaceClientOptions) { - this.backend = options.backend; - this.organizationId = options.organizationId; - this.snapshot = { - organizationId: options.organizationId, - repos: [], - repositories: [], - tasks: [], - }; - } - - getSnapshot(): TaskWorkspaceSnapshot { - return this.snapshot; - } - - subscribe(listener: () => void): () => void { - this.listeners.add(listener); - this.ensureStarted(); - return () => { - this.listeners.delete(listener); - if (this.listeners.size === 0 && this.refreshRetryTimeout) { - clearTimeout(this.refreshRetryTimeout); - this.refreshRetryTimeout = null; - } - if (this.listeners.size === 0 && this.unsubscribeWorkspace) { - this.unsubscribeWorkspace(); - this.unsubscribeWorkspace = null; - } - }; - } - - async createTask(input: TaskWorkspaceCreateTaskInput): Promise { - const created = await this.backend.createWorkspaceTask(this.organizationId, input); - await this.refresh(); - return created; - } - - async markTaskUnread(input: TaskWorkspaceSelectInput): Promise { - await this.backend.markWorkspaceUnread(this.organizationId, input); - await this.refresh(); - } - - async renameTask(input: TaskWorkspaceRenameInput): Promise { - await this.backend.renameWorkspaceTask(this.organizationId, input); - await this.refresh(); - } - - async archiveTask(input: TaskWorkspaceSelectInput): Promise { - await this.backend.runAction(this.organizationId, input.repoId, input.taskId, "archive"); - await this.refresh(); - } - - async publishPr(input: TaskWorkspaceSelectInput): Promise { - await this.backend.publishWorkspacePr(this.organizationId, input); - await this.refresh(); - } - - async revertFile(input: TaskWorkspaceDiffInput): Promise { - await this.backend.revertWorkspaceFile(this.organizationId, input); - await this.refresh(); - } - - async updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise { - await 
this.backend.updateWorkspaceDraft(this.organizationId, input); - // Skip refresh — the server broadcast will trigger it, and the frontend - // holds local draft state to avoid the round-trip overwriting user input. - } - - async sendMessage(input: TaskWorkspaceSendMessageInput): Promise { - await this.backend.sendWorkspaceMessage(this.organizationId, input); - await this.refresh(); - } - - async stopAgent(input: TaskWorkspaceSessionInput): Promise { - await this.backend.stopWorkspaceSession(this.organizationId, input); - await this.refresh(); - } - - async selectSession(input: TaskWorkspaceSessionInput): Promise { - await this.backend.selectWorkspaceSession(this.organizationId, input); - await this.refresh(); - } - - async setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise { - await this.backend.setWorkspaceSessionUnread(this.organizationId, input); - await this.refresh(); - } - - async renameSession(input: TaskWorkspaceRenameSessionInput): Promise { - await this.backend.renameWorkspaceSession(this.organizationId, input); - await this.refresh(); - } - - async closeSession(input: TaskWorkspaceSessionInput): Promise { - await this.backend.closeWorkspaceSession(this.organizationId, input); - await this.refresh(); - } - - async addSession(input: TaskWorkspaceSelectInput): Promise { - const created = await this.backend.createWorkspaceSession(this.organizationId, input); - await this.refresh(); - return created; - } - - async changeModel(input: TaskWorkspaceChangeModelInput): Promise { - await this.backend.changeWorkspaceModel(this.organizationId, input); - await this.refresh(); - } - - async changeOwner(input: TaskWorkspaceChangeOwnerInput): Promise { - await this.backend.changeWorkspaceTaskOwner(this.organizationId, input); - await this.refresh(); - } - - private ensureStarted(): void { - if (!this.unsubscribeWorkspace) { - this.unsubscribeWorkspace = this.backend.subscribeWorkspace(this.organizationId, () => { - void this.refresh().catch(() => { - 
this.scheduleRefreshRetry(); - }); - }); - } - void this.refresh().catch(() => { - this.scheduleRefreshRetry(); - }); - } - - private scheduleRefreshRetry(): void { - if (this.refreshRetryTimeout || this.listeners.size === 0) { - return; - } - - this.refreshRetryTimeout = setTimeout(() => { - this.refreshRetryTimeout = null; - void this.refresh().catch(() => { - this.scheduleRefreshRetry(); - }); - }, 1_000); - } - - private async refresh(): Promise { - if (this.refreshPromise) { - await this.refreshPromise; - return; - } - - this.refreshPromise = (async () => { - const nextSnapshot = await this.backend.getWorkspace(this.organizationId); - if (this.refreshRetryTimeout) { - clearTimeout(this.refreshRetryTimeout); - this.refreshRetryTimeout = null; - } - this.snapshot = { - ...nextSnapshot, - repositories: nextSnapshot.repositories ?? groupWorkspaceRepositories(nextSnapshot.repos, nextSnapshot.tasks), - }; - for (const listener of [...this.listeners]) { - listener(); - } - })().finally(() => { - this.refreshPromise = null; - }); - - await this.refreshPromise; - } -} - -export function createRemoteWorkspaceClient(options: RemoteWorkspaceClientOptions): TaskWorkspaceClient { - return new RemoteWorkspaceStore(options); -} diff --git a/foundry/packages/client/src/subscription/mock-manager.ts b/foundry/packages/client/src/subscription/mock-manager.ts deleted file mode 100644 index bcdb389..0000000 --- a/foundry/packages/client/src/subscription/mock-manager.ts +++ /dev/null @@ -1,12 +0,0 @@ -import { createMockBackendClient } from "../mock/backend-client.js"; -import { RemoteSubscriptionManager } from "./remote-manager.js"; - -/** - * Mock implementation shares the same subscription-manager harness as the remote - * path, but uses the in-memory mock backend that synthesizes actor events. 
- */ -export class MockSubscriptionManager extends RemoteSubscriptionManager { - constructor() { - super(createMockBackendClient()); - } -} diff --git a/foundry/packages/client/src/subscription/topics.ts b/foundry/packages/client/src/subscription/topics.ts deleted file mode 100644 index bbda118..0000000 --- a/foundry/packages/client/src/subscription/topics.ts +++ /dev/null @@ -1,106 +0,0 @@ -import type { - AppEvent, - FoundryAppSnapshot, - SandboxProviderId, - SandboxProcessesEvent, - SessionEvent, - TaskEvent, - WorkspaceSessionDetail, - WorkspaceTaskDetail, - OrganizationEvent, - OrganizationSummarySnapshot, -} from "@sandbox-agent/foundry-shared"; -import type { ActorConn, BackendClient, SandboxProcessRecord } from "../backend-client.js"; - -/** - * Topic definitions for the subscription manager. - * - * Each topic describes one actor connection plus one materialized read model. - * Some topics can apply broadcast payloads directly, while others refetch - * through BackendClient so auth-scoped state stays user-specific. 
- */ -export interface TopicDefinition { - key: (params: TParams) => string; - event: string; - connect: (backend: BackendClient, params: TParams) => Promise; - fetchInitial: (backend: BackendClient, params: TParams) => Promise; - applyEvent: (backend: BackendClient, params: TParams, current: TData, event: TEvent) => Promise | TData; -} - -export interface AppTopicParams {} -export interface OrganizationTopicParams { - organizationId: string; -} -export interface TaskTopicParams { - organizationId: string; - repoId: string; - taskId: string; -} -export interface SessionTopicParams { - organizationId: string; - repoId: string; - taskId: string; - sessionId: string; -} -export interface SandboxProcessesTopicParams { - organizationId: string; - sandboxProviderId: SandboxProviderId; - sandboxId: string; -} - -export const topicDefinitions = { - app: { - key: () => "app", - event: "appUpdated", - connect: (backend: BackendClient, _params: AppTopicParams) => backend.connectOrganization("app"), - fetchInitial: (backend: BackendClient, _params: AppTopicParams) => backend.getAppSnapshot(), - applyEvent: (_backend: BackendClient, _params: AppTopicParams, _current: FoundryAppSnapshot, event: AppEvent) => event.snapshot, - } satisfies TopicDefinition, - - organization: { - key: (params: OrganizationTopicParams) => `organization:${params.organizationId}`, - event: "organizationUpdated", - connect: (backend: BackendClient, params: OrganizationTopicParams) => backend.connectOrganization(params.organizationId), - fetchInitial: (backend: BackendClient, params: OrganizationTopicParams) => backend.getOrganizationSummary(params.organizationId), - applyEvent: (_backend: BackendClient, _params: OrganizationTopicParams, _current: OrganizationSummarySnapshot, event: OrganizationEvent) => - event.snapshot, - } satisfies TopicDefinition, - - task: { - key: (params: TaskTopicParams) => `task:${params.organizationId}:${params.taskId}`, - event: "taskUpdated", - connect: (backend: 
BackendClient, params: TaskTopicParams) => backend.connectTask(params.organizationId, params.repoId, params.taskId), - fetchInitial: (backend: BackendClient, params: TaskTopicParams) => backend.getTaskDetail(params.organizationId, params.repoId, params.taskId), - applyEvent: (backend: BackendClient, params: TaskTopicParams, _current: WorkspaceTaskDetail, _event: TaskEvent) => - backend.getTaskDetail(params.organizationId, params.repoId, params.taskId), - } satisfies TopicDefinition, - - session: { - key: (params: SessionTopicParams) => `session:${params.organizationId}:${params.taskId}:${params.sessionId}`, - event: "sessionUpdated", - connect: (backend: BackendClient, params: SessionTopicParams) => backend.connectTask(params.organizationId, params.repoId, params.taskId), - fetchInitial: (backend: BackendClient, params: SessionTopicParams) => - backend.getSessionDetail(params.organizationId, params.repoId, params.taskId, params.sessionId), - applyEvent: async (backend: BackendClient, params: SessionTopicParams, current: WorkspaceSessionDetail, event: SessionEvent) => { - if (event.session.sessionId !== params.sessionId) { - return current; - } - return await backend.getSessionDetail(params.organizationId, params.repoId, params.taskId, params.sessionId); - }, - } satisfies TopicDefinition, - - sandboxProcesses: { - key: (params: SandboxProcessesTopicParams) => `sandbox:${params.organizationId}:${params.sandboxProviderId}:${params.sandboxId}`, - event: "processesUpdated", - connect: (backend: BackendClient, params: SandboxProcessesTopicParams) => - backend.connectSandbox(params.organizationId, params.sandboxProviderId, params.sandboxId), - fetchInitial: async (backend: BackendClient, params: SandboxProcessesTopicParams) => - (await backend.listSandboxProcesses(params.organizationId, params.sandboxProviderId, params.sandboxId)).processes, - applyEvent: (_backend: BackendClient, _params: SandboxProcessesTopicParams, _current: SandboxProcessRecord[], event: 
SandboxProcessesEvent) => - event.processes, - } satisfies TopicDefinition, -} as const; - -export type TopicKey = keyof typeof topicDefinitions; -export type TopicParams = Parameters<(typeof topicDefinitions)[K]["fetchInitial"]>[1]; -export type TopicData = Awaited>; diff --git a/foundry/packages/client/src/view-model.ts b/foundry/packages/client/src/view-model.ts index bd7a98c..4764bac 100644 --- a/foundry/packages/client/src/view-model.ts +++ b/foundry/packages/client/src/view-model.ts @@ -9,6 +9,12 @@ const QUEUED_STATUSES = new Set([ "init_enqueue_provision", "init_ensure_name", "init_assert_name", + "init_create_sandbox", + "init_ensure_agent", + "init_start_sandbox_instance", + "init_create_session", + "init_write_db", + "init_start_status_sync", "init_complete", "archive_stop_status_sync", "archive_release_sandbox", @@ -65,7 +71,7 @@ export function filterTasks(rows: TaskRecord[], query: string): TaskRecord[] { } return rows.filter((row) => { - const fields = [row.branchName ?? "", row.title ?? "", row.taskId, row.task]; + const fields = [row.branchName ?? "", row.title ?? "", row.taskId, row.task, row.prAuthor ?? "", row.reviewer ?? ""]; return fields.some((field) => fuzzyMatch(field, q)); }); } @@ -87,7 +93,7 @@ export function summarizeTasks(rows: TaskRecord[]): TaskSummary { for (const row of rows) { byStatus[groupTaskStatus(row.status)] += 1; - byProvider[row.sandboxProviderId] = (byProvider[row.sandboxProviderId] ?? 0) + 1; + byProvider[row.providerId] = (byProvider[row.providerId] ?? 
0) + 1; } return { diff --git a/foundry/packages/client/src/workbench-client.ts b/foundry/packages/client/src/workbench-client.ts new file mode 100644 index 0000000..b6990fc --- /dev/null +++ b/foundry/packages/client/src/workbench-client.ts @@ -0,0 +1,64 @@ +import type { + TaskWorkbenchAddTabResponse, + TaskWorkbenchChangeModelInput, + TaskWorkbenchCreateTaskInput, + TaskWorkbenchCreateTaskResponse, + TaskWorkbenchDiffInput, + TaskWorkbenchRenameInput, + TaskWorkbenchRenameSessionInput, + TaskWorkbenchSelectInput, + TaskWorkbenchSetSessionUnreadInput, + TaskWorkbenchSendMessageInput, + TaskWorkbenchSnapshot, + TaskWorkbenchTabInput, + TaskWorkbenchUpdateDraftInput, +} from "@sandbox-agent/foundry-shared"; +import type { BackendClient } from "./backend-client.js"; +import { getSharedMockWorkbenchClient } from "./mock/workbench-client.js"; +import { createRemoteWorkbenchClient } from "./remote/workbench-client.js"; + +export type TaskWorkbenchClientMode = "mock" | "remote"; + +export interface CreateTaskWorkbenchClientOptions { + mode: TaskWorkbenchClientMode; + backend?: BackendClient; + workspaceId?: string; +} + +export interface TaskWorkbenchClient { + getSnapshot(): TaskWorkbenchSnapshot; + subscribe(listener: () => void): () => void; + createTask(input: TaskWorkbenchCreateTaskInput): Promise; + markTaskUnread(input: TaskWorkbenchSelectInput): Promise; + renameTask(input: TaskWorkbenchRenameInput): Promise; + renameBranch(input: TaskWorkbenchRenameInput): Promise; + archiveTask(input: TaskWorkbenchSelectInput): Promise; + publishPr(input: TaskWorkbenchSelectInput): Promise; + revertFile(input: TaskWorkbenchDiffInput): Promise; + updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise; + sendMessage(input: TaskWorkbenchSendMessageInput): Promise; + stopAgent(input: TaskWorkbenchTabInput): Promise; + setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise; + renameSession(input: TaskWorkbenchRenameSessionInput): Promise; + closeTab(input: 
TaskWorkbenchTabInput): Promise; + addTab(input: TaskWorkbenchSelectInput): Promise; + changeModel(input: TaskWorkbenchChangeModelInput): Promise; +} + +export function createTaskWorkbenchClient(options: CreateTaskWorkbenchClientOptions): TaskWorkbenchClient { + if (options.mode === "mock") { + return getSharedMockWorkbenchClient(); + } + + if (!options.backend) { + throw new Error("Remote task workbench client requires a backend client"); + } + if (!options.workspaceId) { + throw new Error("Remote task workbench client requires a workspace id"); + } + + return createRemoteWorkbenchClient({ + backend: options.backend, + workspaceId: options.workspaceId, + }); +} diff --git a/foundry/packages/client/src/workspace-model.ts b/foundry/packages/client/src/workbench-model.ts similarity index 78% rename from foundry/packages/client/src/workspace-model.ts rename to foundry/packages/client/src/workbench-model.ts index 290794b..b99f588 100644 --- a/foundry/packages/client/src/workspace-model.ts +++ b/foundry/packages/client/src/workbench-model.ts @@ -1,28 +1,36 @@ -import { - DEFAULT_WORKSPACE_MODEL_ID, - DEFAULT_WORKSPACE_MODEL_GROUPS as SharedModelGroups, - workspaceModelLabel as sharedWorkspaceModelLabel, - workspaceProviderAgent as sharedWorkspaceProviderAgent, -} from "@sandbox-agent/foundry-shared"; import type { - WorkspaceAgentKind as AgentKind, - WorkspaceSession as AgentSession, - WorkspaceDiffLineKind as DiffLineKind, - WorkspaceFileTreeNode as FileTreeNode, - WorkspaceTask as Task, - TaskWorkspaceSnapshot, - WorkspaceHistoryEvent as HistoryEvent, - WorkspaceModelGroup as ModelGroup, - WorkspaceModelId as ModelId, - WorkspaceParsedDiffLine as ParsedDiffLine, - WorkspaceRepositorySection, - WorkspaceRepo, - WorkspaceTranscriptEvent as TranscriptEvent, + WorkbenchAgentKind as AgentKind, + WorkbenchAgentTab as AgentTab, + WorkbenchDiffLineKind as DiffLineKind, + WorkbenchFileTreeNode as FileTreeNode, + WorkbenchTask as Task, + TaskWorkbenchSnapshot, + 
WorkbenchHistoryEvent as HistoryEvent, + WorkbenchModelGroup as ModelGroup, + WorkbenchModelId as ModelId, + WorkbenchParsedDiffLine as ParsedDiffLine, + WorkbenchProjectSection, + WorkbenchRepo, + WorkbenchTranscriptEvent as TranscriptEvent, } from "@sandbox-agent/foundry-shared"; import rivetDevFixture from "../../../scripts/data/rivet-dev.json" with { type: "json" }; -export const MODEL_GROUPS: ModelGroup[] = SharedModelGroups; -export const DEFAULT_MODEL_ID: ModelId = DEFAULT_WORKSPACE_MODEL_ID; +export const MODEL_GROUPS: ModelGroup[] = [ + { + provider: "Claude", + models: [ + { id: "claude-sonnet-4", label: "Sonnet 4" }, + { id: "claude-opus-4", label: "Opus 4" }, + ], + }, + { + provider: "OpenAI", + models: [ + { id: "gpt-4o", label: "GPT-4o" }, + { id: "o3", label: "o3" }, + ], + }, +]; const MOCK_REPLIES = [ "Got it. I'll work on that now. Let me start by examining the relevant files...", @@ -61,11 +69,15 @@ export function formatMessageDuration(durationMs: number): string { } export function modelLabel(id: ModelId): string { - return sharedWorkspaceModelLabel(id, MODEL_GROUPS); + const group = MODEL_GROUPS.find((candidate) => candidate.models.some((model) => model.id === id)); + const model = group?.models.find((candidate) => candidate.id === id); + return model && group ? 
`${group.provider} ${model.label}` : id; } export function providerAgent(provider: string): AgentKind { - return sharedWorkspaceProviderAgent(provider); + if (provider === "Claude") return "Claude"; + if (provider === "OpenAI") return "Codex"; + return "Cursor"; } export function slugify(text: string): string { @@ -170,17 +182,17 @@ function historyDetail(event: TranscriptEvent): string { return content || "Untitled event"; } -export function buildHistoryEvents(sessions: AgentSession[]): HistoryEvent[] { - return sessions - .flatMap((session) => - session.transcript +export function buildHistoryEvents(tabs: AgentTab[]): HistoryEvent[] { + return tabs + .flatMap((tab) => + tab.transcript .filter((event) => event.sender === "client") .map((event) => ({ - id: `history-${session.id}-${event.id}`, + id: `history-${tab.id}-${event.id}`, messageId: event.id, preview: historyPreview(event), - sessionName: session.sessionName, - sessionId: session.id, + sessionName: tab.sessionName, + tabId: tab.id, createdAtMs: event.createdAt, detail: historyDetail(event), })), @@ -188,29 +200,6 @@ export function buildHistoryEvents(sessions: AgentSession[]): HistoryEvent[] { .sort((left, right) => messageOrder(left.messageId) - messageOrder(right.messageId)); } -function buildPullRequestSummary(params: { - number: number; - title: string; - branch: string; - repoName: string; - updatedAtMs: number; - status: "ready" | "draft"; -}) { - return { - number: params.number, - status: params.status, - title: params.title, - state: "open", - url: `https://github.com/${params.repoName}/pull/${params.number}`, - headRefName: params.branch, - baseRefName: "main", - repoFullName: params.repoName, - authorLogin: "mock", - isDraft: params.status === "draft", - updatedAtMs: params.updatedAtMs, - }; -} - function transcriptFromLegacyMessages(sessionId: string, messages: LegacyMessage[]): TranscriptEvent[] { return messages.map((message, index) => ({ id: message.id, @@ -242,41 +231,6 @@ function 
minutesAgo(minutes: number): number { return NOW_MS - minutes * 60_000; } -function buildTranscriptStressMessages(pairCount: number): LegacyMessage[] { - const startedAtMs = NOW_MS - pairCount * 8_000; - const messages: LegacyMessage[] = []; - - for (let index = 0; index < pairCount; index++) { - const sequence = index + 1; - const createdAtMs = startedAtMs + index * 8_000; - - messages.push({ - id: `stress-user-${sequence}`, - role: "user", - agent: null, - createdAtMs, - lines: [ - `Stress prompt ${sequence}: summarize the current state of the transcript virtualizer.`, - `Keep the answer focused on scroll position, render cost, and preserved expansion state.`, - ], - }); - - messages.push({ - id: `stress-agent-${sequence}`, - role: "agent", - agent: "codex", - createdAtMs: createdAtMs + 3_000, - lines: [ - `Stress reply ${sequence}: the list should only render visible rows plus overscan while preserving scroll anchoring near the bottom.`, - `Grouping, minimap navigation, and per-row UI should remain stable even as older rows unmount.`, - ], - durationMs: 2_500, - }); - } - - return messages; -} - export function parseDiffLines(diff: string): ParsedDiffLine[] { return diff.split("\n").map((text, index) => { if (text.startsWith("@@")) { @@ -322,21 +276,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/sandbox-agent", updatedAtMs: minutesAgo(8), branch: "NathanFlurry/pi-bootstrap-fix", - pullRequest: buildPullRequestSummary({ - number: 227, - title: "Normalize Pi ACP bootstrap payloads", - branch: "NathanFlurry/pi-bootstrap-fix", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(8), - status: "ready", - }), - sessions: [ + pullRequest: { number: 227, status: "ready" }, + tabs: [ { id: "t1", sessionId: "t1", sessionName: "Pi payload fix", agent: "Claude", - model: "sonnet", + model: "claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -387,7 +334,7 @@ export function buildInitialTasks(): Task[] { sessionId: 
"t2", sessionName: "Test coverage", agent: "Codex", - model: "gpt-5.3-codex", + model: "gpt-4o", status: "idle", thinkingSinceMs: null, unread: true, @@ -498,21 +445,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/sandbox-agent", updatedAtMs: minutesAgo(3), branch: "feat/builtin-agent-skills", - pullRequest: buildPullRequestSummary({ - number: 223, - title: "Auto-inject builtin agent skills at startup", - branch: "feat/builtin-agent-skills", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(3), - status: "draft", - }), - sessions: [ + pullRequest: { number: 223, status: "draft" }, + tabs: [ { id: "t3", sessionId: "t3", sessionName: "Skills injection", agent: "Claude", - model: "opus", + model: "claude-opus-4", status: "running", thinkingSinceMs: NOW_MS - 45_000, unread: false, @@ -605,21 +545,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/sandbox-agent", updatedAtMs: minutesAgo(45), branch: "hooks-example", - pullRequest: buildPullRequestSummary({ - number: 225, - title: "Add hooks example for Claude, Codex, and OpenCode", - branch: "hooks-example", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(45), - status: "ready", - }), - sessions: [ + pullRequest: { number: 225, status: "ready" }, + tabs: [ { id: "t4", sessionId: "t4", sessionName: "Example docs", agent: "Claude", - model: "sonnet", + model: "claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -687,21 +620,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/rivet", updatedAtMs: minutesAgo(15), branch: "actor-reschedule-endpoint", - pullRequest: buildPullRequestSummary({ - number: 4400, - title: "Add actor reschedule endpoint", - branch: "actor-reschedule-endpoint", - repoName: "rivet-dev/rivet", - updatedAtMs: minutesAgo(15), - status: "ready", - }), - sessions: [ + pullRequest: { number: 4400, status: "ready" }, + tabs: [ { id: "t5", sessionId: "t5", sessionName: "Reschedule API", 
agent: "Claude", - model: "sonnet", + model: "claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -828,21 +754,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/rivet", updatedAtMs: minutesAgo(35), branch: "feat/dynamic-actors", - pullRequest: buildPullRequestSummary({ - number: 4395, - title: "Dynamic actors", - branch: "feat/dynamic-actors", - repoName: "rivet-dev/rivet", - updatedAtMs: minutesAgo(35), - status: "draft", - }), - sessions: [ + pullRequest: { number: 4395, status: "draft" }, + tabs: [ { id: "t6", sessionId: "t6", sessionName: "Dynamic actors impl", agent: "Claude", - model: "opus", + model: "claude-opus-4", status: "idle", thinkingSinceMs: null, unread: true, @@ -892,21 +811,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/vbare", updatedAtMs: minutesAgo(25), branch: "fix-use-full-cloud-run-pool-name", - pullRequest: buildPullRequestSummary({ - number: 235, - title: "Use full cloud run pool name for routing", - branch: "fix-use-full-cloud-run-pool-name", - repoName: "rivet-dev/vbare", - updatedAtMs: minutesAgo(25), - status: "ready", - }), - sessions: [ + pullRequest: { number: 235, status: "ready" }, + tabs: [ { id: "t7", sessionId: "t7", sessionName: "Pool routing fix", agent: "Claude", - model: "sonnet", + model: "claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -1008,21 +920,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/skills", updatedAtMs: minutesAgo(50), branch: "fix-guard-support-https-targets", - pullRequest: buildPullRequestSummary({ - number: 125, - title: "Route compute gateway path correctly", - branch: "fix-guard-support-https-targets", - repoName: "rivet-dev/skills", - updatedAtMs: minutesAgo(50), - status: "ready", - }), - sessions: [ + pullRequest: { number: 125, status: "ready" }, + tabs: [ { id: "t8", sessionId: "t8", sessionName: "Guard routing", agent: "Claude", - model: "sonnet", + model: 
"claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -1129,21 +1034,14 @@ export function buildInitialTasks(): Task[] { repoName: "rivet-dev/skills", updatedAtMs: minutesAgo(2 * 24 * 60), branch: "chore-move-compute-gateway-to", - pullRequest: buildPullRequestSummary({ - number: 123, - title: "Move compute gateway to guard", - branch: "chore-move-compute-gateway-to", - repoName: "rivet-dev/skills", - updatedAtMs: minutesAgo(2 * 24 * 60), - status: "ready", - }), - sessions: [ + pullRequest: { number: 123, status: "ready" }, + tabs: [ { id: "t9", sessionId: "t9", sessionName: "Gateway migration", agent: "Claude", - model: "sonnet", + model: "claude-sonnet-4", status: "idle", thinkingSinceMs: null, unread: false, @@ -1179,13 +1077,13 @@ export function buildInitialTasks(): Task[] { updatedAtMs: minutesAgo(90), branch: "fix/namespace-isolation", pullRequest: null, - sessions: [ + tabs: [ { id: "t10", sessionId: "t10", sessionName: "Namespace fix", agent: "Codex", - model: "gpt-5.3-codex", + model: "gpt-4o", status: "idle", thinkingSinceMs: null, unread: true, @@ -1222,143 +1120,15 @@ export function buildInitialTasks(): Task[] { fileTree: [], minutesUsed: 3, }, - - // ── Status demo tasks ────────────────────────────────────────────── - { - id: "status-error", - repoId: "sandbox-agent", - title: "Fix broken auth middleware (error demo)", - status: "error", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(2), - branch: "fix/auth-middleware", - pullRequest: null, - sessions: [ - { - id: "status-error-session", - sessionId: "status-error-session", - sessionName: "Auth fix", - agent: "Claude", - model: "sonnet", - status: "error", - thinkingSinceMs: null, - unread: false, - created: true, - errorMessage: "Sandbox process exited unexpectedly (exit code 137). 
The sandbox may have run out of memory.", - draft: { text: "", attachments: [], updatedAtMs: null }, - transcript: [], - }, - ], - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 1, - }, - { - id: "status-provisioning", - repoId: "sandbox-agent", - title: "Add rate limiting to API gateway (provisioning demo)", - status: "init_enqueue_provision", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(0), - branch: null, - pullRequest: null, - sessions: [ - { - id: "status-prov-session", - sessionId: "status-prov-session", - sandboxSessionId: null, - sessionName: "Session 1", - agent: "Claude", - model: "sonnet", - status: "pending_provision", - thinkingSinceMs: null, - unread: false, - created: false, - draft: { text: "", attachments: [], updatedAtMs: null }, - transcript: [], - }, - ], - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 0, - }, - { - id: "stress-transcript", - repoId: "sandbox-agent", - title: "Transcript virtualization stress test", - status: "idle", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(40), - branch: "perf/transcript-virtualizer", - pullRequest: null, - sessions: [ - { - id: "stress-transcript-tab", - sessionId: "stress-transcript-session", - sessionName: "Virtualizer stress session", - agent: "Codex", - model: "gpt-5.3-codex", - status: "idle", - thinkingSinceMs: null, - unread: false, - created: true, - draft: { text: "", attachments: [], updatedAtMs: null }, - transcript: transcriptFromLegacyMessages("stress-transcript-tab", buildTranscriptStressMessages(1600)), - }, - ], - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 18, - }, - { - id: "status-running", - repoId: "sandbox-agent", - title: "Refactor WebSocket handler (running demo)", - status: "running", - repoName: "rivet-dev/sandbox-agent", - updatedAtMs: minutesAgo(1), - branch: "refactor/ws-handler", - pullRequest: null, - sessions: [ - { - id: "status-run-session", - sessionId: "status-run-session", - 
sessionName: "WS refactor", - agent: "Codex", - model: "gpt-5.3-codex", - status: "running", - thinkingSinceMs: Date.now() - 12_000, - unread: false, - created: true, - draft: { text: "", attachments: [], updatedAtMs: null }, - transcript: transcriptFromLegacyMessages("status-run-session", [ - { - id: "sr1", - role: "user", - agent: null, - createdAtMs: minutesAgo(3), - lines: ["Refactor the WebSocket handler to use a connection pool pattern."], - }, - ]), - }, - ], - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 2, - }, ]; } /** * Build repos list from the rivet-dev fixture data (scripts/data/rivet-dev.json). * Uses real public repos so the mock sidebar matches what an actual rivet-dev - * organization would show after a GitHub sync. + * workspace would show after a GitHub sync. */ -function buildMockRepos(): WorkspaceRepo[] { +function buildMockRepos(): WorkbenchRepo[] { return rivetDevFixture.repos.map((r) => ({ id: repoIdFromFullName(r.fullName), label: r.fullName, @@ -1371,19 +1141,55 @@ function repoIdFromFullName(fullName: string): string { return parts[parts.length - 1] ?? fullName; } -export function buildInitialMockLayoutViewModel(): TaskWorkspaceSnapshot { +/** + * Build task entries from open PR fixture data. + * Maps to the backend's PR sync behavior (ProjectPrSyncActor) where PRs + * appear as first-class sidebar items even without an associated task. + * Each open PR gets a lightweight task entry so it shows in the sidebar. 
+ */ +function buildPrTasks(): Task[] { + // Collect branch names already claimed by hand-written tasks so we don't duplicate + const existingBranches = new Set( + buildInitialTasks() + .map((t) => t.branch) + .filter(Boolean), + ); + + return rivetDevFixture.openPullRequests + .filter((pr) => !existingBranches.has(pr.headRefName)) + .map((pr) => { + const repoId = repoIdFromFullName(pr.repoFullName); + return { + id: `pr-${repoId}-${pr.number}`, + repoId, + title: pr.title, + status: "idle" as const, + repoName: pr.repoFullName, + updatedAtMs: new Date(pr.updatedAt).getTime(), + branch: pr.headRefName, + pullRequest: { number: pr.number, status: pr.draft ? ("draft" as const) : ("ready" as const) }, + tabs: [], + fileChanges: [], + diffs: {}, + fileTree: [], + minutesUsed: 0, + }; + }); +} + +export function buildInitialMockLayoutViewModel(): TaskWorkbenchSnapshot { const repos = buildMockRepos(); - const tasks = buildInitialTasks(); + const tasks = [...buildInitialTasks(), ...buildPrTasks()]; return { - organizationId: "default", + workspaceId: "default", repos, - repositories: groupWorkspaceRepositories(repos, tasks), + projects: groupWorkbenchProjects(repos, tasks), tasks, }; } -export function groupWorkspaceRepositories(repos: WorkspaceRepo[], tasks: Task[]): WorkspaceRepositorySection[] { - const grouped = new Map(); +export function groupWorkbenchProjects(repos: WorkbenchRepo[], tasks: Task[]): WorkbenchProjectSection[] { + const grouped = new Map(); for (const repo of repos) { grouped.set(repo.id, { @@ -1408,11 +1214,11 @@ export function groupWorkspaceRepositories(repos: WorkspaceRepo[], tasks: Task[] } return [...grouped.values()] - .map((repository) => ({ - ...repository, - tasks: [...repository.tasks].sort((a, b) => b.updatedAtMs - a.updatedAtMs), - updatedAtMs: repository.tasks.length > 0 ? 
Math.max(...repository.tasks.map((task) => task.updatedAtMs)) : repository.updatedAtMs, + .map((project) => ({ + ...project, + tasks: [...project.tasks].sort((a, b) => b.updatedAtMs - a.updatedAtMs), + updatedAtMs: project.tasks.length > 0 ? Math.max(...project.tasks.map((task) => task.updatedAtMs)) : project.updatedAtMs, })) - .filter((repository) => repository.tasks.length > 0) + .filter((project) => project.tasks.length > 0) .sort((a, b) => b.updatedAtMs - a.updatedAtMs); } diff --git a/foundry/packages/client/src/workspace-client.ts b/foundry/packages/client/src/workspace-client.ts deleted file mode 100644 index 6662352..0000000 --- a/foundry/packages/client/src/workspace-client.ts +++ /dev/null @@ -1,66 +0,0 @@ -import type { - TaskWorkspaceAddSessionResponse, - TaskWorkspaceChangeModelInput, - TaskWorkspaceChangeOwnerInput, - TaskWorkspaceCreateTaskInput, - TaskWorkspaceCreateTaskResponse, - TaskWorkspaceDiffInput, - TaskWorkspaceRenameInput, - TaskWorkspaceRenameSessionInput, - TaskWorkspaceSelectInput, - TaskWorkspaceSetSessionUnreadInput, - TaskWorkspaceSendMessageInput, - TaskWorkspaceSnapshot, - TaskWorkspaceSessionInput, - TaskWorkspaceUpdateDraftInput, -} from "@sandbox-agent/foundry-shared"; -import type { BackendClient } from "./backend-client.js"; -import { getSharedMockWorkspaceClient } from "./mock/workspace-client.js"; -import { createRemoteWorkspaceClient } from "./remote/workspace-client.js"; - -export type TaskWorkspaceClientMode = "mock" | "remote"; - -export interface CreateTaskWorkspaceClientOptions { - mode: TaskWorkspaceClientMode; - backend?: BackendClient; - organizationId?: string; -} - -export interface TaskWorkspaceClient { - getSnapshot(): TaskWorkspaceSnapshot; - subscribe(listener: () => void): () => void; - createTask(input: TaskWorkspaceCreateTaskInput): Promise; - markTaskUnread(input: TaskWorkspaceSelectInput): Promise; - renameTask(input: TaskWorkspaceRenameInput): Promise; - archiveTask(input: TaskWorkspaceSelectInput): 
Promise; - publishPr(input: TaskWorkspaceSelectInput): Promise; - revertFile(input: TaskWorkspaceDiffInput): Promise; - updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise; - sendMessage(input: TaskWorkspaceSendMessageInput): Promise; - stopAgent(input: TaskWorkspaceSessionInput): Promise; - selectSession(input: TaskWorkspaceSessionInput): Promise; - setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise; - renameSession(input: TaskWorkspaceRenameSessionInput): Promise; - closeSession(input: TaskWorkspaceSessionInput): Promise; - addSession(input: TaskWorkspaceSelectInput): Promise; - changeModel(input: TaskWorkspaceChangeModelInput): Promise; - changeOwner(input: TaskWorkspaceChangeOwnerInput): Promise; -} - -export function createTaskWorkspaceClient(options: CreateTaskWorkspaceClientOptions): TaskWorkspaceClient { - if (options.mode === "mock") { - return getSharedMockWorkspaceClient(); - } - - if (!options.backend) { - throw new Error("Remote task workspace client requires a backend client"); - } - if (!options.organizationId) { - throw new Error("Remote task workspace client requires an organization id"); - } - - return createRemoteWorkspaceClient({ - backend: options.backend, - organizationId: options.organizationId, - }); -} diff --git a/foundry/packages/client/test/e2e/full-integration-e2e.test.ts b/foundry/packages/client/test/e2e/full-integration-e2e.test.ts index 21eaf6b..bdb7c1e 100644 --- a/foundry/packages/client/test/e2e/full-integration-e2e.test.ts +++ b/foundry/packages/client/test/e2e/full-integration-e2e.test.ts @@ -1,8 +1,7 @@ import { randomUUID } from "node:crypto"; import { describe, expect, it } from "vitest"; -import type { AuditLogEvent as HistoryEvent, RepoOverview } from "@sandbox-agent/foundry-shared"; +import type { HistoryEvent, RepoOverview } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; -import { requireImportedRepo } from "./helpers.js"; const
RUN_FULL_E2E = process.env.HF_ENABLE_DAEMON_FULL_E2E === "1"; @@ -107,9 +106,9 @@ async function ensureRemoteBranchExists(token: string, fullName: string, branchN } describe("e2e(client): full integration stack workflow", () => { - it.skipIf(!RUN_FULL_E2E)("uses an imported repo, loads branch graph, and executes a stack restack action", { timeout: 8 * 60_000 }, async () => { + it.skipIf(!RUN_FULL_E2E)("adds repo, loads branch graph, and executes a stack restack action", { timeout: 8 * 60_000 }, async () => { const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; - const organizationId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; + const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); const githubToken = requiredEnv("GITHUB_TOKEN"); const { fullName } = parseGithubRepo(repoRemote); @@ -118,27 +117,56 @@ describe("e2e(client): full integration stack workflow", () => { const client = createBackendClient({ endpoint, - defaultOrganizationId: organizationId, + defaultWorkspaceId: workspaceId, }); try { await ensureRemoteBranchExists(githubToken, fullName, seededBranch); - const repo = await requireImportedRepo(client, organizationId, repoRemote); + const repo = await client.addRepo(workspaceId, repoRemote); expect(repo.remoteUrl).toBe(normalizedRepoRemote); const overview = await poll( "repo overview includes seeded branch", 90_000, 1_000, - async () => client.getRepoOverview(organizationId, repo.repoId), - (value) => value.branches.some((row: RepoOverview["branches"][number]) => row.branchName === seededBranch), + async () => client.getRepoOverview(workspaceId, repo.repoId), + (value) => value.branches.some((row) => row.branchName === seededBranch), ); - const postActionOverview = await client.getRepoOverview(organizationId, repo.repoId); - const seededRow = postActionOverview.branches.find((row: RepoOverview["branches"][number]) => row.branchName 
=== seededBranch); + if (!overview.stackAvailable) { + throw new Error( + "git-spice is unavailable for this repo during full integration e2e; set HF_GIT_SPICE_BIN or install git-spice in the backend container", + ); + } + + const stackResult = await client.runRepoStackAction({ + workspaceId, + repoId: repo.repoId, + action: "restack_repo", + }); + expect(stackResult.executed).toBe(true); + expect(stackResult.action).toBe("restack_repo"); + + await poll( + "repo stack action history event", + 60_000, + 1_000, + async () => client.listHistory({ workspaceId, limit: 200 }), + (events) => + events.some((event) => { + if (event.kind !== "repo.stack_action") { + return false; + } + const payload = parseHistoryPayload(event); + return payload.action === "restack_repo"; + }), + ); + + const postActionOverview = await client.getRepoOverview(workspaceId, repo.repoId); + const seededRow = postActionOverview.branches.find((row) => row.branchName === seededBranch); expect(Boolean(seededRow)).toBe(true); - expect(postActionOverview.fetchedAt).toBeGreaterThanOrEqual(overview.fetchedAt); + expect(postActionOverview.fetchedAt).toBeGreaterThan(overview.fetchedAt); } finally { await githubApi(githubToken, `repos/${fullName}/git/refs/heads/${encodeURIComponent(seededBranch)}`, { method: "DELETE" }).catch(() => {}); } diff --git a/foundry/packages/client/test/e2e/github-pr-e2e.test.ts b/foundry/packages/client/test/e2e/github-pr-e2e.test.ts index 89dd638..c468717 100644 --- a/foundry/packages/client/test/e2e/github-pr-e2e.test.ts +++ b/foundry/packages/client/test/e2e/github-pr-e2e.test.ts @@ -1,7 +1,6 @@ import { describe, expect, it } from "vitest"; -import type { AuditLogEvent as HistoryEvent, TaskRecord } from "@sandbox-agent/foundry-shared"; +import type { TaskRecord, HistoryEvent } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; -import { requireImportedRepo } from "./helpers.js"; const RUN_E2E = 
process.env.HF_ENABLE_DAEMON_E2E === "1"; @@ -80,22 +79,20 @@ function parseHistoryPayload(event: HistoryEvent): Record<string, unknown> { } } -async function debugDump(client: ReturnType<typeof createBackendClient>, organizationId: string, repoId: string, taskId: string): Promise<string> { +async function debugDump(client: ReturnType<typeof createBackendClient>, workspaceId: string, taskId: string): Promise<string> { try { - const task = await client.getTask(organizationId, repoId, taskId); - const detail = await client.getTaskDetail(organizationId, repoId, taskId).catch(() => null); - const history = await client.listHistory({ organizationId, taskId, limit: 80 }).catch(() => []); + const task = await client.getTask(workspaceId, taskId); + const history = await client.listHistory({ workspaceId, taskId, limit: 80 }).catch(() => []); const historySummary = history .slice(0, 20) .map((e) => `${new Date(e.createdAt).toISOString()} ${e.kind}`) .join("\n"); let sessionEventsSummary = ""; - const activeSessionId = detail?.activeSessionId ?? null; - if (task.activeSandboxId && activeSessionId) { + if (task.activeSandboxId && task.activeSessionId) { const events = await client - .listSandboxSessionEvents(organizationId, task.sandboxProviderId, task.activeSandboxId, { - sessionId: activeSessionId, + .listSandboxSessionEvents(workspaceId, task.providerId, task.activeSandboxId, { + sessionId: task.activeSessionId, limit: 50, }) .then((r) => r.items) @@ -111,11 +108,13 @@ async function debugDump(client: ReturnType<typeof createBackendClient>, organiz JSON.stringify( { status: task.status, + statusMessage: task.statusMessage, title: task.title, branchName: task.branchName, activeSandboxId: task.activeSandboxId, - activeSessionId, - pullRequestUrl: detail?.pullRequest?.url ??
null, + activeSessionId: task.activeSessionId, + prUrl: task.prUrl, + prSubmitted: task.prSubmitted, }, null, 2, @@ -146,7 +145,7 @@ async function githubApi(token: string, path: string, init?: RequestInit): Promi describe("e2e: backend -> sandbox-agent -> git -> PR", () => { it.skipIf(!RUN_E2E)("creates a task, waits for agent to implement, and opens a PR", { timeout: 15 * 60_000 }, async () => { const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; - const organizationId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; + const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); const githubToken = requiredEnv("GITHUB_TOKEN"); @@ -156,13 +155,13 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { const client = createBackendClient({ endpoint, - defaultOrganizationId: organizationId, + defaultWorkspaceId: workspaceId, }); - const repo = await requireImportedRepo(client, organizationId, repoRemote); + const repo = await client.addRepo(workspaceId, repoRemote); const created = await client.createTask({ - organizationId, + workspaceId, repoId: repo.repoId, task: [ "E2E test task:", @@ -172,7 +171,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { "4. git push the branch to origin", "5. Stop when done (agent should go idle).", ].join("\n"), - sandboxProviderId: "local", + providerId: "daytona", explicitTitle: `test(e2e): ${runId}`, explicitBranchName: `e2e/${runId}`, }); @@ -186,10 +185,10 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { try { const namedAndProvisioned = await poll( "task naming + sandbox provisioning", - // Cold local sandbox startup can exceed a few minutes on first run. + // Cold Daytona snapshot/image preparation can exceed 5 minutes on first run. 
8 * 60_000, 1_000, - async () => client.getTask(organizationId, repo.repoId, created.taskId), + async () => client.getTask(workspaceId, created.taskId), (h) => Boolean(h.title && h.branchName && h.activeSandboxId), (h) => { if (h.status !== lastStatus) { @@ -200,18 +199,18 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); + const dump = await debugDump(client, workspaceId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); branchName = namedAndProvisioned.branchName!; sandboxId = namedAndProvisioned.activeSandboxId!; - const withSession = await poll<Awaited<ReturnType<typeof client.getTaskDetail>>>( + const withSession = await poll( "task to create active session", 3 * 60_000, 1_500, - async () => client.getTaskDetail(organizationId, repo.repoId, created.taskId), + async () => client.getTask(workspaceId, created.taskId), (h) => Boolean(h.activeSessionId), (h) => { if (h.status === "error") { @@ -219,7 +218,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); + const dump = await debugDump(client, workspaceId, created.taskId); throw new Error(`${err instanceof Error ?
err.message : String(err)}\n${dump}`); }); @@ -246,7 +245,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { "task to reach idle state", 8 * 60_000, 2_000, - async () => client.getTask(organizationId, repo.repoId, created.taskId), + async () => client.getTask(workspaceId, created.taskId), (h) => h.status === "idle", (h) => { if (h.status === "error") { @@ -254,7 +253,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); + const dump = await debugDump(client, workspaceId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); @@ -262,11 +261,11 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { "PR creation history event", 3 * 60_000, 2_000, - async () => client.listHistory({ organizationId, taskId: created.taskId, limit: 200 }), + async () => client.listHistory({ workspaceId, taskId: created.taskId, limit: 200 }), (events) => events.some((e) => e.kind === "task.pr_created"), ) .catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); + const dump = await debugDump(client, workspaceId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }) .then((events) => events.find((e) => e.kind === "task.pr_created")!); @@ -287,32 +286,32 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { expect(prFiles.some((f) => f.filename === expectedFile)).toBe(true); // Close the task and assert the sandbox is released (stopped). 
- await client.runAction(organizationId, repo.repoId, created.taskId, "archive"); + await client.runAction(workspaceId, created.taskId, "archive"); - await poll<Awaited<ReturnType<typeof client.getTaskDetail>>>( + await poll( "task to become archived (session released)", 60_000, 1_000, - async () => client.getTaskDetail(organizationId, repo.repoId, created.taskId), + async () => client.getTask(workspaceId, created.taskId), (h) => h.status === "archived" && h.activeSessionId === null, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); + const dump = await debugDump(client, workspaceId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); if (sandboxId) { - await poll<{ sandboxProviderId: string; sandboxId: string; state: string; at: number }>( - "sandbox to stop", + await poll<{ providerId: string; sandboxId: string; state: string; at: number }>( + "daytona sandbox to stop", 2 * 60_000, 2_000, - async () => client.sandboxProviderState(organizationId, "local", sandboxId!), + async () => client.sandboxProviderState(workspaceId, "daytona", sandboxId!), (s) => { const st = String(s.state).toLowerCase(); - return st.includes("destroyed") || st.includes("stopped") || st.includes("suspended") || st.includes("paused"); + return st.includes("stopped") || st.includes("suspended") || st.includes("paused"); }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); - const state = await client.sandboxProviderState(organizationId, "local", sandboxId!).catch(() => null); + const dump = await debugDump(client, workspaceId, created.taskId); + const state = await client.sandboxProviderState(workspaceId, "daytona", sandboxId!).catch(() => null); throw new Error(`${err instanceof Error ? err.message : String(err)}\n` + `sandbox state: ${state ?
state.state : "unknown"}\n` + `${dump}`); }); } diff --git a/foundry/packages/client/test/e2e/helpers.ts b/foundry/packages/client/test/e2e/helpers.ts deleted file mode 100644 index 0e15c51..0000000 --- a/foundry/packages/client/test/e2e/helpers.ts +++ /dev/null @@ -1,84 +0,0 @@ -import type { RepoRecord } from "@sandbox-agent/foundry-shared"; -import type { BackendClient } from "../../src/backend-client.js"; - -function normalizeRepoSelector(value: string): string { - let normalized = value.trim(); - if (!normalized) { - return ""; - } - - normalized = normalized.replace(/\/+$/, ""); - if (/^[A-Za-z0-9_.-]+\/[A-Za-z0-9_.-]+$/.test(normalized)) { - return `https://github.com/${normalized}.git`; - } - - if (/^(?:www\.)?github\.com\/.+/i.test(normalized)) { - normalized = `https://${normalized.replace(/^www\./i, "")}`; - } - - try { - if (/^https?:\/\//i.test(normalized)) { - const url = new URL(normalized); - const hostname = url.hostname.replace(/^www\./i, ""); - if (hostname.toLowerCase() === "github.com") { - const parts = url.pathname.split("/").filter(Boolean); - if (parts.length >= 2) { - return `${url.protocol}//${hostname}/${parts[0]}/${(parts[1] ?? "").replace(/\.git$/i, "")}.git`; - } - } - url.search = ""; - url.hash = ""; - return url.toString().replace(/\/+$/, ""); - } - } catch { - // Keep the selector as-is for matching below. - } - - return normalized; -} - -function githubRepoFullNameFromSelector(value: string): string | null { - const normalized = normalizeRepoSelector(value); - try { - const url = new URL(normalized); - if (url.hostname.replace(/^www\./i, "").toLowerCase() !== "github.com") { - return null; - } - const parts = url.pathname.replace(/\/+$/, "").split("/").filter(Boolean); - if (parts.length < 2) { - return null; - } - return `${parts[0]}/${(parts[1] ?? 
"").replace(/\.git$/i, "")}`; - } catch { - return null; - } -} - -export async function requireImportedRepo(client: BackendClient, organizationId: string, repoSelector: string): Promise { - const selector = repoSelector.trim(); - if (!selector) { - throw new Error("Missing repo selector"); - } - - const normalizedSelector = normalizeRepoSelector(selector); - const selectorFullName = githubRepoFullNameFromSelector(selector); - const repos = await client.listRepos(organizationId); - const match = repos.find((repo) => { - if (repo.repoId === selector) { - return true; - } - if (normalizeRepoSelector(repo.remoteUrl) === normalizedSelector) { - return true; - } - const repoFullName = githubRepoFullNameFromSelector(repo.remoteUrl); - return Boolean(selectorFullName && repoFullName && repoFullName === selectorFullName); - }); - - if (!match) { - throw new Error( - `Repo not available in organization ${organizationId}: ${repoSelector}. Create it in GitHub first, then sync repos in Foundry before running this test.`, - ); - } - - return match; -} diff --git a/foundry/packages/client/test/e2e/workbench-e2e.test.ts b/foundry/packages/client/test/e2e/workbench-e2e.test.ts new file mode 100644 index 0000000..5d85125 --- /dev/null +++ b/foundry/packages/client/test/e2e/workbench-e2e.test.ts @@ -0,0 +1,307 @@ +import { execFile } from "node:child_process"; +import { promisify } from "node:util"; +import { describe, expect, it } from "vitest"; +import type { TaskWorkbenchSnapshot, WorkbenchAgentTab, WorkbenchTask, WorkbenchModelId, WorkbenchTranscriptEvent } from "@sandbox-agent/foundry-shared"; +import { createBackendClient } from "../../src/backend-client.js"; + +const RUN_WORKBENCH_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_E2E === "1"; +const execFileAsync = promisify(execFile); + +function requiredEnv(name: string): string { + const value = process.env[name]?.trim(); + if (!value) { + throw new Error(`Missing required env var: ${name}`); + } + return value; +} + +function 
workbenchModelEnv(name: string, fallback: WorkbenchModelId): WorkbenchModelId { + const value = process.env[name]?.trim(); + switch (value) { + case "claude-sonnet-4": + case "claude-opus-4": + case "gpt-4o": + case "o3": + return value; + default: + return fallback; + } +} + +async function sleep(ms: number): Promise { + await new Promise((resolve) => setTimeout(resolve, ms)); +} + +async function seedSandboxFile(workspaceId: string, taskId: string, filePath: string, content: string): Promise { + const repoPath = `/root/.local/share/foundry/local-sandboxes/${workspaceId}/${taskId}/repo`; + const script = [ + `cd ${JSON.stringify(repoPath)}`, + `mkdir -p ${JSON.stringify(filePath.includes("/") ? filePath.slice(0, filePath.lastIndexOf("/")) : ".")}`, + `printf '%s\\n' ${JSON.stringify(content)} > ${JSON.stringify(filePath)}`, + ].join(" && "); + await execFileAsync("docker", ["exec", "foundry-backend-1", "bash", "-lc", script]); +} + +async function poll(label: string, timeoutMs: number, intervalMs: number, fn: () => Promise, isDone: (value: T) => boolean): Promise { + const startedAt = Date.now(); + let lastValue: T; + + for (;;) { + lastValue = await fn(); + if (isDone(lastValue)) { + return lastValue; + } + if (Date.now() - startedAt > timeoutMs) { + throw new Error(`timed out waiting for ${label}`); + } + await sleep(intervalMs); + } +} + +function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTask { + const task = snapshot.tasks.find((candidate) => candidate.id === taskId); + if (!task) { + throw new Error(`task ${taskId} missing from snapshot`); + } + return task; +} + +function findTab(task: WorkbenchTask, tabId: string): WorkbenchAgentTab { + const tab = task.tabs.find((candidate) => candidate.id === tabId); + if (!tab) { + throw new Error(`tab ${tabId} missing from task ${task.id}`); + } + return tab; +} + +function extractEventText(event: WorkbenchTranscriptEvent): string { + const payload = event.payload; + if (!payload || typeof 
payload !== "object") { + return String(payload ?? ""); + } + + const envelope = payload as { + method?: unknown; + params?: unknown; + result?: unknown; + error?: unknown; + }; + + const params = envelope.params; + if (params && typeof params === "object") { + const update = (params as { update?: unknown }).update; + if (update && typeof update === "object") { + const content = (update as { content?: unknown }).content; + if (content && typeof content === "object") { + const chunkText = (content as { text?: unknown }).text; + if (typeof chunkText === "string") { + return chunkText; + } + } + } + + const text = (params as { text?: unknown }).text; + if (typeof text === "string" && text.trim()) { + return text.trim(); + } + const prompt = (params as { prompt?: Array<{ text?: unknown }> }).prompt; + if (Array.isArray(prompt)) { + const value = prompt + .map((item) => (typeof item?.text === "string" ? item.text.trim() : "")) + .filter(Boolean) + .join("\n"); + if (value) { + return value; + } + } + } + + const result = envelope.result; + if (result && typeof result === "object") { + const text = (result as { text?: unknown }).text; + if (typeof text === "string" && text.trim()) { + return text.trim(); + } + } + + if (envelope.error) { + return JSON.stringify(envelope.error); + } + + if (typeof envelope.method === "string") { + return envelope.method; + } + + return JSON.stringify(payload); +} + +function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], expectedText: string): boolean { + return transcript + .filter((event) => event.sender === "agent") + .map((event) => extractEventText(event)) + .join("") + .includes(expectedText); +} + +describe("e2e(client): workbench flows", () => { + it.skipIf(!RUN_WORKBENCH_E2E)("creates a task, adds sessions, exchanges messages, and manages workbench state", { timeout: 20 * 60_000 }, async () => { + const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; + const 
workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; + const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); + const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-4o"); + const runId = `wb-${Date.now().toString(36)}`; + const expectedFile = `${runId}.txt`; + const expectedInitialReply = `WORKBENCH_READY_${runId}`; + const expectedReply = `WORKBENCH_ACK_${runId}`; + + const client = createBackendClient({ + endpoint, + defaultWorkspaceId: workspaceId, + }); + + const repo = await client.addRepo(workspaceId, repoRemote); + const created = await client.createWorkbenchTask(workspaceId, { + repoId: repo.repoId, + title: `Workbench E2E ${runId}`, + branch: `e2e/${runId}`, + model, + task: `Reply with exactly: ${expectedInitialReply}`, + }); + + const provisioned = await poll( + "task provisioning", + 12 * 60_000, + 2_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => task.branch === `e2e/${runId}` && task.tabs.length > 0, + ); + + const primaryTab = provisioned.tabs[0]!; + + const initialCompleted = await poll( + "initial agent response", + 12 * 60_000, + 2_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => { + const tab = findTab(task, primaryTab.id); + return task.status === "idle" && tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedInitialReply); + }, + ); + + expect(findTab(initialCompleted, primaryTab.id).sessionId).toBeTruthy(); + expect(transcriptIncludesAgentText(findTab(initialCompleted, primaryTab.id).transcript, expectedInitialReply)).toBe(true); + + await seedSandboxFile(workspaceId, created.taskId, expectedFile, runId); + + const fileSeeded = await poll( + "seeded sandbox file reflected in workbench", + 30_000, + 1_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => task.fileChanges.some((file) => file.path === expectedFile), + ); + expect(fileSeeded.fileChanges.some((file) => 
file.path === expectedFile)).toBe(true); + + await client.renameWorkbenchTask(workspaceId, { + taskId: created.taskId, + value: `Workbench E2E ${runId} Renamed`, + }); + await client.renameWorkbenchSession(workspaceId, { + taskId: created.taskId, + tabId: primaryTab.id, + title: "Primary Session", + }); + + const secondTab = await client.createWorkbenchSession(workspaceId, { + taskId: created.taskId, + model, + }); + + await client.renameWorkbenchSession(workspaceId, { + taskId: created.taskId, + tabId: secondTab.tabId, + title: "Follow-up Session", + }); + + await client.updateWorkbenchDraft(workspaceId, { + taskId: created.taskId, + tabId: secondTab.tabId, + text: `Reply with exactly: ${expectedReply}`, + attachments: [ + { + id: `${expectedFile}:1`, + filePath: expectedFile, + lineNumber: 1, + lineContent: runId, + }, + ], + }); + + const drafted = findTask(await client.getWorkbench(workspaceId), created.taskId); + expect(findTab(drafted, secondTab.tabId).draft.text).toContain(expectedReply); + expect(findTab(drafted, secondTab.tabId).draft.attachments).toHaveLength(1); + + await client.sendWorkbenchMessage(workspaceId, { + taskId: created.taskId, + tabId: secondTab.tabId, + text: `Reply with exactly: ${expectedReply}`, + attachments: [], + }); + + const withSecondReply = await poll( + "follow-up session response", + 10 * 60_000, + 2_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => { + const tab = findTab(task, secondTab.tabId); + return tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedReply); + }, + ); + + const secondTranscript = findTab(withSecondReply, secondTab.tabId).transcript; + expect(transcriptIncludesAgentText(secondTranscript, expectedReply)).toBe(true); + + await client.setWorkbenchSessionUnread(workspaceId, { + taskId: created.taskId, + tabId: secondTab.tabId, + unread: false, + }); + await client.markWorkbenchUnread(workspaceId, { taskId: created.taskId }); + + const 
unreadSnapshot = findTask(await client.getWorkbench(workspaceId), created.taskId); + expect(unreadSnapshot.tabs.some((tab) => tab.unread)).toBe(true); + + await client.closeWorkbenchSession(workspaceId, { + taskId: created.taskId, + tabId: secondTab.tabId, + }); + + const closedSnapshot = await poll( + "secondary session closed", + 30_000, + 1_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => !task.tabs.some((tab) => tab.id === secondTab.tabId), + ); + expect(closedSnapshot.tabs).toHaveLength(1); + + await client.revertWorkbenchFile(workspaceId, { + taskId: created.taskId, + path: expectedFile, + }); + + const revertedSnapshot = await poll( + "file revert reflected in workbench", + 30_000, + 1_000, + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), + (task) => !task.fileChanges.some((file) => file.path === expectedFile), + ); + + expect(revertedSnapshot.fileChanges.some((file) => file.path === expectedFile)).toBe(false); + expect(revertedSnapshot.title).toBe(`Workbench E2E ${runId} Renamed`); + expect(findTab(revertedSnapshot, primaryTab.id).sessionName).toBe("Primary Session"); + }); +}); diff --git a/foundry/packages/client/test/e2e/workspace-load-e2e.test.ts b/foundry/packages/client/test/e2e/workbench-load-e2e.test.ts similarity index 76% rename from foundry/packages/client/test/e2e/workspace-load-e2e.test.ts rename to foundry/packages/client/test/e2e/workbench-load-e2e.test.ts index f9fc244..3eba239 100644 --- a/foundry/packages/client/test/e2e/workspace-load-e2e.test.ts +++ b/foundry/packages/client/test/e2e/workbench-load-e2e.test.ts @@ -1,20 +1,19 @@ import { describe, expect, it } from "vitest"; import { createFoundryLogger, - type TaskWorkspaceSnapshot, - type WorkspaceSession, - type WorkspaceTask, - type WorkspaceModelId, - type WorkspaceTranscriptEvent, + type TaskWorkbenchSnapshot, + type WorkbenchAgentTab, + type WorkbenchTask, + type WorkbenchModelId, + type 
WorkbenchTranscriptEvent, } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; -import { requireImportedRepo } from "./helpers.js"; const RUN_WORKBENCH_LOAD_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E === "1"; const logger = createFoundryLogger({ service: "foundry-client-e2e", bindings: { - suite: "workspace-load", + suite: "workbench-load", }, }); @@ -26,9 +25,17 @@ function requiredEnv(name: string): string { return value; } -function workspaceModelEnv(name: string, fallback: WorkspaceModelId): WorkspaceModelId { +function workbenchModelEnv(name: string, fallback: WorkbenchModelId): WorkbenchModelId { const value = process.env[name]?.trim(); - return value && value.length > 0 ? value : fallback; + switch (value) { + case "claude-sonnet-4": + case "claude-opus-4": + case "gpt-4o": + case "o3": + return value; + default: + return fallback; + } } function intEnv(name: string, fallback: number): number { @@ -60,7 +67,7 @@ async function poll(label: string, timeoutMs: number, intervalMs: number, fn: } } -function findTask(snapshot: TaskWorkspaceSnapshot, taskId: string): WorkspaceTask { +function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTask { const task = snapshot.tasks.find((candidate) => candidate.id === taskId); if (!task) { throw new Error(`task ${taskId} missing from snapshot`); @@ -68,15 +75,15 @@ function findTask(snapshot: TaskWorkspaceSnapshot, taskId: string): WorkspaceTas return task; } -function findTab(task: WorkspaceTask, sessionId: string): WorkspaceSession { - const tab = task.sessions.find((candidate) => candidate.id === sessionId); +function findTab(task: WorkbenchTask, tabId: string): WorkbenchAgentTab { + const tab = task.tabs.find((candidate) => candidate.id === tabId); if (!tab) { - throw new Error(`tab ${sessionId} missing from task ${task.id}`); + throw new Error(`tab ${tabId} missing from task ${task.id}`); } return tab; } -function 
extractEventText(event: WorkspaceTranscriptEvent): string { +function extractEventText(event: WorkbenchTranscriptEvent): string { const payload = event.payload; if (!payload || typeof payload !== "object") { return String(payload ?? ""); @@ -126,7 +133,7 @@ function extractEventText(event: WorkspaceTranscriptEvent): string { return typeof envelope.method === "string" ? envelope.method : JSON.stringify(payload); } -function transcriptIncludesAgentText(transcript: WorkspaceTranscriptEvent[], expectedText: string): boolean { +function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], expectedText: string): boolean { return transcript .filter((event) => event.sender === "agent") .map((event) => extractEventText(event)) @@ -138,9 +145,9 @@ function average(values: number[]): number { return values.reduce((sum, value) => sum + value, 0) / Math.max(values.length, 1); } -async function measureWorkspaceSnapshot( +async function measureWorkbenchSnapshot( client: ReturnType, - organizationId: string, + workspaceId: string, iterations: number, ): Promise<{ avgMs: number; @@ -151,23 +158,23 @@ async function measureWorkspaceSnapshot( transcriptEventCount: number; }> { const durations: number[] = []; - let snapshot: TaskWorkspaceSnapshot | null = null; + let snapshot: TaskWorkbenchSnapshot | null = null; for (let index = 0; index < iterations; index += 1) { const startedAt = performance.now(); - snapshot = await client.getWorkspace(organizationId); + snapshot = await client.getWorkbench(workspaceId); durations.push(performance.now() - startedAt); } const finalSnapshot = snapshot ?? 
{ - organizationId, + workspaceId, repos: [], - repositories: [], + projects: [], tasks: [], }; const payloadBytes = Buffer.byteLength(JSON.stringify(finalSnapshot), "utf8"); - const tabCount = finalSnapshot.tasks.reduce((sum, task) => sum + task.sessions.length, 0); - const transcriptEventCount = finalSnapshot.tasks.reduce((sum, task) => sum + task.sessions.reduce((tabSum, tab) => tabSum + tab.transcript.length, 0), 0); + const tabCount = finalSnapshot.tasks.reduce((sum, task) => sum + task.tabs.length, 0); + const transcriptEventCount = finalSnapshot.tasks.reduce((sum, task) => sum + task.tabs.reduce((tabSum, tab) => tabSum + tab.transcript.length, 0), 0); return { avgMs: Math.round(average(durations)), @@ -179,22 +186,22 @@ async function measureWorkspaceSnapshot( }; } -describe("e2e(client): workspace load", () => { +describe("e2e(client): workbench load", () => { it.skipIf(!RUN_WORKBENCH_LOAD_E2E)("runs a simple sequential load profile against the real backend", { timeout: 30 * 60_000 }, async () => { const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; - const organizationId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; + const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); - const model = workspaceModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); + const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-4o"); const taskCount = intEnv("HF_LOAD_TASK_COUNT", 3); const extraSessionCount = intEnv("HF_LOAD_EXTRA_SESSION_COUNT", 2); const pollIntervalMs = intEnv("HF_LOAD_POLL_INTERVAL_MS", 2_000); const client = createBackendClient({ endpoint, - defaultOrganizationId: organizationId, + defaultWorkspaceId: workspaceId, }); - const repo = await requireImportedRepo(client, organizationId, repoRemote); + const repo = await client.addRepo(workspaceId, repoRemote); const createTaskLatencies: number[] = []; const provisionLatencies: number[] = []; const 
createSessionLatencies: number[] = []; @@ -208,16 +215,16 @@ describe("e2e(client): workspace load", () => { transcriptEventCount: number; }> = []; - snapshotSeries.push(await measureWorkspaceSnapshot(client, organizationId, 2)); + snapshotSeries.push(await measureWorkbenchSnapshot(client, workspaceId, 2)); for (let taskIndex = 0; taskIndex < taskCount; taskIndex += 1) { const runId = `load-${taskIndex}-${Date.now().toString(36)}`; const initialReply = `LOAD_INIT_${runId}`; const createStartedAt = performance.now(); - const created = await client.createWorkspaceTask(organizationId, { + const created = await client.createWorkbenchTask(workspaceId, { repoId: repo.repoId, - title: `Workspace Load ${runId}`, + title: `Workbench Load ${runId}`, branch: `load/${runId}`, model, task: `Reply with exactly: ${initialReply}`, @@ -229,32 +236,30 @@ describe("e2e(client): workspace load", () => { `task ${runId} provisioning`, 12 * 60_000, pollIntervalMs, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), (task) => { - const tab = task.sessions[0]; + const tab = task.tabs[0]; return Boolean(tab && task.status === "idle" && tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, initialReply)); }, ); provisionLatencies.push(performance.now() - provisionStartedAt); - expect(provisioned.sessions.length).toBeGreaterThan(0); - const primaryTab = provisioned.sessions[0]!; + expect(provisioned.tabs.length).toBeGreaterThan(0); + const primaryTab = provisioned.tabs[0]!; expect(transcriptIncludesAgentText(primaryTab.transcript, initialReply)).toBe(true); for (let sessionIndex = 0; sessionIndex < extraSessionCount; sessionIndex += 1) { const expectedReply = `LOAD_REPLY_${runId}_${sessionIndex}`; const createSessionStartedAt = performance.now(); - const createdSession = await client.createWorkspaceSession(organizationId, { - repoId: repo.repoId, + const createdSession 
= await client.createWorkbenchSession(workspaceId, { taskId: created.taskId, model, }); createSessionLatencies.push(performance.now() - createSessionStartedAt); - await client.sendWorkspaceMessage(organizationId, { - repoId: repo.repoId, + await client.sendWorkbenchMessage(workspaceId, { taskId: created.taskId, - sessionId: createdSession.sessionId, + tabId: createdSession.tabId, text: `Run pwd in the repo, then reply with exactly: ${expectedReply}`, attachments: [], }); @@ -264,25 +269,25 @@ describe("e2e(client): workspace load", () => { `task ${runId} session ${sessionIndex} reply`, 10 * 60_000, pollIntervalMs, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), + async () => findTask(await client.getWorkbench(workspaceId), created.taskId), (task) => { - const tab = findTab(task, createdSession.sessionId); + const tab = findTab(task, createdSession.tabId); return tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedReply); }, ); messageRoundTripLatencies.push(performance.now() - messageStartedAt); - expect(transcriptIncludesAgentText(findTab(withReply, createdSession.sessionId).transcript, expectedReply)).toBe(true); + expect(transcriptIncludesAgentText(findTab(withReply, createdSession.tabId).transcript, expectedReply)).toBe(true); } - const snapshotMetrics = await measureWorkspaceSnapshot(client, organizationId, 3); + const snapshotMetrics = await measureWorkbenchSnapshot(client, workspaceId, 3); snapshotSeries.push(snapshotMetrics); logger.info( { taskIndex: taskIndex + 1, ...snapshotMetrics, }, - "workspace_load_snapshot", + "workbench_load_snapshot", ); } @@ -304,7 +309,7 @@ describe("e2e(client): workspace load", () => { snapshotTranscriptFinalCount: lastSnapshot.transcriptEventCount, }; - logger.info(summary, "workspace_load_summary"); + logger.info(summary, "workbench_load_summary"); expect(createTaskLatencies.length).toBe(taskCount); expect(provisionLatencies.length).toBe(taskCount); diff --git 
a/foundry/packages/client/test/e2e/workspace-e2e.test.ts b/foundry/packages/client/test/e2e/workspace-e2e.test.ts deleted file mode 100644 index 1de2065..0000000 --- a/foundry/packages/client/test/e2e/workspace-e2e.test.ts +++ /dev/null @@ -1,307 +0,0 @@ -import { describe, expect, it } from "vitest"; -import type { TaskWorkspaceSnapshot, WorkspaceSession, WorkspaceTask, WorkspaceModelId, WorkspaceTranscriptEvent } from "@sandbox-agent/foundry-shared"; -import { createBackendClient } from "../../src/backend-client.js"; -import { requireImportedRepo } from "./helpers.js"; - -const RUN_WORKBENCH_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_E2E === "1"; - -function requiredEnv(name: string): string { - const value = process.env[name]?.trim(); - if (!value) { - throw new Error(`Missing required env var: ${name}`); - } - return value; -} - -function workspaceModelEnv(name: string, fallback: WorkspaceModelId): WorkspaceModelId { - const value = process.env[name]?.trim(); - return value && value.length > 0 ? 
value : fallback; -} - -async function sleep(ms: number): Promise { - await new Promise((resolve) => setTimeout(resolve, ms)); -} - -async function poll(label: string, timeoutMs: number, intervalMs: number, fn: () => Promise, isDone: (value: T) => boolean): Promise { - const startedAt = Date.now(); - let lastValue: T; - - for (;;) { - lastValue = await fn(); - if (isDone(lastValue)) { - return lastValue; - } - if (Date.now() - startedAt > timeoutMs) { - throw new Error(`timed out waiting for ${label}`); - } - await sleep(intervalMs); - } -} - -function findTask(snapshot: TaskWorkspaceSnapshot, taskId: string): WorkspaceTask { - const task = snapshot.tasks.find((candidate) => candidate.id === taskId); - if (!task) { - throw new Error(`task ${taskId} missing from snapshot`); - } - return task; -} - -function findTab(task: WorkspaceTask, sessionId: string): WorkspaceSession { - const tab = task.sessions.find((candidate) => candidate.id === sessionId); - if (!tab) { - throw new Error(`tab ${sessionId} missing from task ${task.id}`); - } - return tab; -} - -function extractEventText(event: WorkspaceTranscriptEvent): string { - const payload = event.payload; - if (!payload || typeof payload !== "object") { - return String(payload ?? 
""); - } - - const envelope = payload as { - method?: unknown; - params?: unknown; - result?: unknown; - error?: unknown; - }; - - const params = envelope.params; - if (params && typeof params === "object") { - const update = (params as { update?: unknown }).update; - if (update && typeof update === "object") { - const content = (update as { content?: unknown }).content; - if (content && typeof content === "object") { - const chunkText = (content as { text?: unknown }).text; - if (typeof chunkText === "string") { - return chunkText; - } - } - } - - const text = (params as { text?: unknown }).text; - if (typeof text === "string" && text.trim()) { - return text.trim(); - } - const prompt = (params as { prompt?: Array<{ text?: unknown }> }).prompt; - if (Array.isArray(prompt)) { - const value = prompt - .map((item) => (typeof item?.text === "string" ? item.text.trim() : "")) - .filter(Boolean) - .join("\n"); - if (value) { - return value; - } - } - } - - const result = envelope.result; - if (result && typeof result === "object") { - const text = (result as { text?: unknown }).text; - if (typeof text === "string" && text.trim()) { - return text.trim(); - } - } - - if (envelope.error) { - return JSON.stringify(envelope.error); - } - - if (typeof envelope.method === "string") { - return envelope.method; - } - - return JSON.stringify(payload); -} - -function transcriptIncludesAgentText(transcript: WorkspaceTranscriptEvent[], expectedText: string): boolean { - return transcript - .filter((event) => event.sender === "agent") - .map((event) => extractEventText(event)) - .join("") - .includes(expectedText); -} - -describe("e2e(client): workspace flows", () => { - it.skipIf(!RUN_WORKBENCH_E2E)( - "creates a task from an imported repo, adds sessions, exchanges messages, and manages workspace state", - { timeout: 20 * 60_000 }, - async () => { - const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; - const organizationId = 
process.env.HF_E2E_WORKSPACE?.trim() || "default"; - const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); - const model = workspaceModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); - const runId = `wb-${Date.now().toString(36)}`; - const expectedFile = `${runId}.txt`; - const expectedInitialReply = `WORKBENCH_READY_${runId}`; - const expectedReply = `WORKBENCH_ACK_${runId}`; - - const client = createBackendClient({ - endpoint, - defaultOrganizationId: organizationId, - }); - - const repo = await requireImportedRepo(client, organizationId, repoRemote); - const created = await client.createWorkspaceTask(organizationId, { - repoId: repo.repoId, - title: `Workspace E2E ${runId}`, - branch: `e2e/${runId}`, - model, - task: `Reply with exactly: ${expectedInitialReply}`, - }); - - const provisioned = await poll( - "task provisioning", - 12 * 60_000, - 2_000, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), - (task) => task.branch === `e2e/${runId}` && task.sessions.length > 0, - ); - - const primaryTab = provisioned.sessions[0]!; - - const initialCompleted = await poll( - "initial agent response", - 12 * 60_000, - 2_000, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), - (task) => { - const tab = findTab(task, primaryTab.id); - return task.status === "idle" && tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedInitialReply); - }, - ); - - expect(findTab(initialCompleted, primaryTab.id).sessionId).toBeTruthy(); - expect(transcriptIncludesAgentText(findTab(initialCompleted, primaryTab.id).transcript, expectedInitialReply)).toBe(true); - - await client.renameWorkspaceTask(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - value: `Workspace E2E ${runId} Renamed`, - }); - await client.renameWorkspaceSession(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: primaryTab.id, - title: "Primary Session", - }); - - const secondTab = await 
client.createWorkspaceSession(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - model, - }); - - await client.renameWorkspaceSession(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: secondTab.sessionId, - title: "Follow-up Session", - }); - - await client.updateWorkspaceDraft(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: secondTab.sessionId, - text: [ - `Create a file named ${expectedFile} in the repo root.`, - `Write exactly this single line into the file: ${runId}`, - `Then reply with exactly: ${expectedReply}`, - ].join("\n"), - attachments: [ - { - id: `${expectedFile}:1`, - filePath: expectedFile, - lineNumber: 1, - lineContent: runId, - }, - ], - }); - - const drafted = findTask(await client.getWorkspace(organizationId), created.taskId); - expect(findTab(drafted, secondTab.sessionId).draft.text).toContain(expectedReply); - expect(findTab(drafted, secondTab.sessionId).draft.attachments).toHaveLength(1); - - await client.sendWorkspaceMessage(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: secondTab.sessionId, - text: [ - `Create a file named ${expectedFile} in the repo root.`, - `Write exactly this single line into the file: ${runId}`, - `Then reply with exactly: ${expectedReply}`, - ].join("\n"), - attachments: [ - { - id: `${expectedFile}:1`, - filePath: expectedFile, - lineNumber: 1, - lineContent: runId, - }, - ], - }); - - const withSecondReply = await poll( - "follow-up session response", - 10 * 60_000, - 2_000, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), - (task) => { - const tab = findTab(task, secondTab.sessionId); - return ( - tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedReply) && task.fileChanges.some((file) => file.path === expectedFile) - ); - }, - ); - - const secondTranscript = findTab(withSecondReply, secondTab.sessionId).transcript; - 
expect(transcriptIncludesAgentText(secondTranscript, expectedReply)).toBe(true); - expect(withSecondReply.fileChanges.some((file) => file.path === expectedFile)).toBe(true); - - await client.setWorkspaceSessionUnread(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: secondTab.sessionId, - unread: false, - }); - await client.markWorkspaceUnread(organizationId, { repoId: repo.repoId, taskId: created.taskId }); - - const unreadSnapshot = findTask(await client.getWorkspace(organizationId), created.taskId); - expect(unreadSnapshot.sessions.some((tab) => tab.unread)).toBe(true); - - await client.closeWorkspaceSession(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - sessionId: secondTab.sessionId, - }); - - const closedSnapshot = await poll( - "secondary session closed", - 30_000, - 1_000, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), - (task) => !task.sessions.some((tab) => tab.id === secondTab.sessionId), - ); - expect(closedSnapshot.sessions).toHaveLength(1); - - await client.revertWorkspaceFile(organizationId, { - repoId: repo.repoId, - taskId: created.taskId, - path: expectedFile, - }); - - const revertedSnapshot = await poll( - "file revert reflected in workspace", - 30_000, - 1_000, - async () => findTask(await client.getWorkspace(organizationId), created.taskId), - (task) => !task.fileChanges.some((file) => file.path === expectedFile), - ); - - expect(revertedSnapshot.fileChanges.some((file) => file.path === expectedFile)).toBe(false); - expect(revertedSnapshot.title).toBe(`Workspace E2E ${runId} Renamed`); - expect(findTab(revertedSnapshot, primaryTab.id).sessionName).toBe("Primary Session"); - }, - ); -}); diff --git a/foundry/packages/client/test/interest-manager.test.ts b/foundry/packages/client/test/interest-manager.test.ts new file mode 100644 index 0000000..188195c --- /dev/null +++ b/foundry/packages/client/test/interest-manager.test.ts @@ -0,0 +1,171 @@ +import { 
afterEach, beforeEach, describe, expect, it, vi } from "vitest"; +import type { WorkspaceEvent, WorkspaceSummarySnapshot } from "@sandbox-agent/foundry-shared"; +import type { ActorConn, BackendClient } from "../src/backend-client.js"; +import { RemoteInterestManager } from "../src/interest/remote-manager.js"; + +class FakeActorConn implements ActorConn { + private readonly listeners = new Map void>>(); + private readonly errorListeners = new Set<(error: unknown) => void>(); + disposeCount = 0; + + on(event: string, listener: (payload: any) => void): () => void { + let current = this.listeners.get(event); + if (!current) { + current = new Set(); + this.listeners.set(event, current); + } + current.add(listener); + return () => { + current?.delete(listener); + if (current?.size === 0) { + this.listeners.delete(event); + } + }; + } + + onError(listener: (error: unknown) => void): () => void { + this.errorListeners.add(listener); + return () => { + this.errorListeners.delete(listener); + }; + } + + emit(event: string, payload: unknown): void { + for (const listener of this.listeners.get(event) ?? 
[]) { + listener(payload); + } + } + + emitError(error: unknown): void { + for (const listener of this.errorListeners) { + listener(error); + } + } + + async dispose(): Promise { + this.disposeCount += 1; + } +} + +function workspaceSnapshot(): WorkspaceSummarySnapshot { + return { + workspaceId: "ws-1", + repos: [{ id: "repo-1", label: "repo-1", taskCount: 1, latestActivityMs: 10 }], + taskSummaries: [ + { + id: "task-1", + repoId: "repo-1", + title: "Initial task", + status: "idle", + repoName: "repo-1", + updatedAtMs: 10, + branch: "main", + pullRequest: null, + sessionsSummary: [], + }, + ], + }; +} + +function createBackend(conn: FakeActorConn, snapshot: WorkspaceSummarySnapshot): BackendClient { + return { + connectWorkspace: vi.fn(async () => conn), + getWorkspaceSummary: vi.fn(async () => snapshot), + } as unknown as BackendClient; +} + +async function flushAsyncWork(): Promise { + await Promise.resolve(); + await Promise.resolve(); +} + +describe("RemoteInterestManager", () => { + beforeEach(() => { + vi.useFakeTimers(); + }); + + afterEach(() => { + vi.useRealTimers(); + }); + + it("shares one connection per topic key and applies incoming events", async () => { + const conn = new FakeActorConn(); + const backend = createBackend(conn, workspaceSnapshot()); + const manager = new RemoteInterestManager(backend); + const params = { workspaceId: "ws-1" } as const; + const listenerA = vi.fn(); + const listenerB = vi.fn(); + + const unsubscribeA = manager.subscribe("workspace", params, listenerA); + const unsubscribeB = manager.subscribe("workspace", params, listenerB); + await flushAsyncWork(); + + expect(backend.connectWorkspace).toHaveBeenCalledTimes(1); + expect(backend.getWorkspaceSummary).toHaveBeenCalledTimes(1); + expect(manager.getStatus("workspace", params)).toBe("connected"); + expect(manager.getSnapshot("workspace", params)?.taskSummaries[0]?.title).toBe("Initial task"); + + conn.emit("workspaceUpdated", { + type: "taskSummaryUpdated", + taskSummary: 
{ + id: "task-1", + repoId: "repo-1", + title: "Updated task", + status: "running", + repoName: "repo-1", + updatedAtMs: 20, + branch: "feature/live", + pullRequest: null, + sessionsSummary: [], + }, + } satisfies WorkspaceEvent); + + expect(manager.getSnapshot("workspace", params)?.taskSummaries[0]?.title).toBe("Updated task"); + expect(listenerA).toHaveBeenCalled(); + expect(listenerB).toHaveBeenCalled(); + + unsubscribeA(); + unsubscribeB(); + manager.dispose(); + }); + + it("keeps a topic warm during the grace period and tears it down afterwards", async () => { + const conn = new FakeActorConn(); + const backend = createBackend(conn, workspaceSnapshot()); + const manager = new RemoteInterestManager(backend); + const params = { workspaceId: "ws-1" } as const; + + const unsubscribeA = manager.subscribe("workspace", params, () => {}); + await flushAsyncWork(); + unsubscribeA(); + + vi.advanceTimersByTime(29_000); + + const unsubscribeB = manager.subscribe("workspace", params, () => {}); + await flushAsyncWork(); + + expect(backend.connectWorkspace).toHaveBeenCalledTimes(1); + expect(conn.disposeCount).toBe(0); + + unsubscribeB(); + vi.advanceTimersByTime(30_000); + + expect(conn.disposeCount).toBe(1); + expect(manager.getSnapshot("workspace", params)).toBeUndefined(); + }); + + it("surfaces connection errors to subscribers", async () => { + const conn = new FakeActorConn(); + const backend = createBackend(conn, workspaceSnapshot()); + const manager = new RemoteInterestManager(backend); + const params = { workspaceId: "ws-1" } as const; + + manager.subscribe("workspace", params, () => {}); + await flushAsyncWork(); + + conn.emitError(new Error("socket dropped")); + + expect(manager.getStatus("workspace", params)).toBe("error"); + expect(manager.getError("workspace", params)?.message).toBe("socket dropped"); + }); +}); diff --git a/foundry/packages/client/test/keys.test.ts b/foundry/packages/client/test/keys.test.ts index 6b93ec1..281d0a9 100644 --- 
a/foundry/packages/client/test/keys.test.ts +++ b/foundry/packages/client/test/keys.test.ts @@ -1,12 +1,21 @@ import { describe, expect, it } from "vitest"; -import { auditLogKey, organizationKey, taskKey, taskSandboxKey } from "../src/keys.js"; +import { taskKey, taskStatusSyncKey, historyKey, projectBranchSyncKey, projectKey, projectPrSyncKey, sandboxInstanceKey, workspaceKey } from "../src/keys.js"; describe("actor keys", () => { - it("prefixes every key with organization namespace", () => { - const keys = [organizationKey("default"), taskKey("default", "repo", "task"), taskSandboxKey("default", "sbx"), auditLogKey("default")]; + it("prefixes every key with workspace namespace", () => { + const keys = [ + workspaceKey("default"), + projectKey("default", "repo"), + taskKey("default", "repo", "task"), + sandboxInstanceKey("default", "daytona", "sbx"), + historyKey("default", "repo"), + projectPrSyncKey("default", "repo"), + projectBranchSyncKey("default", "repo"), + taskStatusSyncKey("default", "repo", "task", "sandbox-1", "session-1"), + ]; for (const key of keys) { - expect(key[0]).toBe("org"); + expect(key[0]).toBe("ws"); expect(key[1]).toBe("default"); } }); diff --git a/foundry/packages/client/test/subscription-manager.test.ts b/foundry/packages/client/test/subscription-manager.test.ts deleted file mode 100644 index f0a29c2..0000000 --- a/foundry/packages/client/test/subscription-manager.test.ts +++ /dev/null @@ -1,225 +0,0 @@ -import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; -import type { OrganizationEvent, OrganizationSummarySnapshot } from "@sandbox-agent/foundry-shared"; -import type { ActorConn, BackendClient } from "../src/backend-client.js"; -import { RemoteSubscriptionManager } from "../src/subscription/remote-manager.js"; - -class FakeActorConn implements ActorConn { - private readonly listeners = new Map<string, Set<(payload: any) => void>>(); - private readonly errorListeners = new Set<(error: unknown) => void>(); - disposeCount = 0; - - on(event:
string, listener: (payload: any) => void): () => void { - let current = this.listeners.get(event); - if (!current) { - current = new Set(); - this.listeners.set(event, current); - } - current.add(listener); - return () => { - current?.delete(listener); - if (current?.size === 0) { - this.listeners.delete(event); - } - }; - } - - onError(listener: (error: unknown) => void): () => void { - this.errorListeners.add(listener); - return () => { - this.errorListeners.delete(listener); - }; - } - - emit(event: string, payload: unknown): void { - for (const listener of this.listeners.get(event) ?? []) { - listener(payload); - } - } - - emitError(error: unknown): void { - for (const listener of this.errorListeners) { - listener(error); - } - } - - async dispose(): Promise<void> { - this.disposeCount += 1; - } -} - -function organizationSnapshot(): OrganizationSummarySnapshot { - return { - organizationId: "org-1", - github: { - connectedAccount: "octocat", - installationStatus: "connected", - syncStatus: "synced", - importedRepoCount: 1, - lastSyncLabel: "Synced just now", - lastSyncAt: 10, - lastWebhookAt: null, - lastWebhookEvent: "", - syncGeneration: 1, - syncPhase: null, - processedRepositoryCount: 1, - totalRepositoryCount: 1, - }, - repos: [{ id: "repo-1", label: "repo-1", taskCount: 1, latestActivityMs: 10 }], - taskSummaries: [ - { - id: "task-1", - repoId: "repo-1", - title: "Initial task", - status: "idle", - repoName: "repo-1", - updatedAtMs: 10, - branch: "main", - pullRequest: null, - activeSessionId: null, - sessionsSummary: [], - primaryUserLogin: null, - primaryUserAvatarUrl: null, - }, - ], - }; -} - -function createBackend(conn: FakeActorConn, snapshot: OrganizationSummarySnapshot): BackendClient { - return { - connectOrganization: vi.fn(async () => conn), - getOrganizationSummary: vi.fn(async () => snapshot), - } as unknown as BackendClient; -} - -async function flushAsyncWork(): Promise<void> { - await Promise.resolve(); - await Promise.resolve(); -} - 
-describe("RemoteSubscriptionManager", () => { - beforeEach(() => { - vi.useFakeTimers(); - }); - - afterEach(() => { - vi.useRealTimers(); - }); - - it("shares one connection per topic key and applies incoming events", async () => { - const conn = new FakeActorConn(); - const backend = createBackend(conn, organizationSnapshot()); - const manager = new RemoteSubscriptionManager(backend); - const params = { organizationId: "org-1" } as const; - const listenerA = vi.fn(); - const listenerB = vi.fn(); - - const unsubscribeA = manager.subscribe("organization", params, listenerA); - const unsubscribeB = manager.subscribe("organization", params, listenerB); - await flushAsyncWork(); - - expect(backend.connectOrganization).toHaveBeenCalledTimes(1); - expect(backend.getOrganizationSummary).toHaveBeenCalledTimes(1); - expect(manager.getStatus("organization", params)).toBe("connected"); - expect(manager.getSnapshot("organization", params)?.taskSummaries[0]?.title).toBe("Initial task"); - expect(manager.listDebugTopics()).toEqual([ - expect.objectContaining({ - topicKey: "organization", - cacheKey: "organization:org-1", - listenerCount: 2, - status: "connected", - }), - ]); - - conn.emit("organizationUpdated", { - type: "organizationUpdated", - snapshot: { - organizationId: "org-1", - github: { - connectedAccount: "octocat", - installationStatus: "connected", - syncStatus: "syncing", - importedRepoCount: 1, - lastSyncLabel: "Syncing repositories...", - lastSyncAt: 10, - lastWebhookAt: null, - lastWebhookEvent: "", - syncGeneration: 2, - syncPhase: "syncing_branches", - processedRepositoryCount: 1, - totalRepositoryCount: 3, - }, - repos: [], - taskSummaries: [ - { - id: "task-1", - repoId: "repo-1", - title: "Updated task", - status: "running", - repoName: "repo-1", - updatedAtMs: 20, - branch: "feature/live", - pullRequest: null, - activeSessionId: null, - sessionsSummary: [], - primaryUserLogin: null, - primaryUserAvatarUrl: null, - }, - ], - }, - } satisfies 
OrganizationEvent); - - // applyEvent chains onto an internal promise — flush the microtask queue - await flushAsyncWork(); - - expect(manager.getSnapshot("organization", params)?.taskSummaries[0]?.title).toBe("Updated task"); - expect(listenerA).toHaveBeenCalled(); - expect(listenerB).toHaveBeenCalled(); - expect(manager.listDebugTopics()[0]?.lastRefreshAt).toEqual(expect.any(Number)); - - unsubscribeA(); - unsubscribeB(); - manager.dispose(); - }); - - it("keeps a topic warm during the grace period and tears it down afterwards", async () => { - const conn = new FakeActorConn(); - const backend = createBackend(conn, organizationSnapshot()); - const manager = new RemoteSubscriptionManager(backend); - const params = { organizationId: "org-1" } as const; - - const unsubscribeA = manager.subscribe("organization", params, () => {}); - await flushAsyncWork(); - unsubscribeA(); - - vi.advanceTimersByTime(29_000); - expect(manager.listDebugTopics()).toEqual([]); - - const unsubscribeB = manager.subscribe("organization", params, () => {}); - await flushAsyncWork(); - - expect(backend.connectOrganization).toHaveBeenCalledTimes(1); - expect(conn.disposeCount).toBe(0); - - unsubscribeB(); - expect(manager.listDebugTopics()).toEqual([]); - vi.advanceTimersByTime(30_000); - - expect(conn.disposeCount).toBe(1); - expect(manager.getSnapshot("organization", params)).toBeUndefined(); - }); - - it("surfaces connection errors to subscribers", async () => { - const conn = new FakeActorConn(); - const backend = createBackend(conn, organizationSnapshot()); - const manager = new RemoteSubscriptionManager(backend); - const params = { organizationId: "org-1" } as const; - - manager.subscribe("organization", params, () => {}); - await flushAsyncWork(); - - conn.emitError(new Error("socket dropped")); - - expect(manager.getStatus("organization", params)).toBe("error"); - expect(manager.getError("organization", params)?.message).toBe("socket dropped"); - }); -}); diff --git 
a/foundry/packages/client/test/view-model.test.ts b/foundry/packages/client/test/view-model.test.ts index d418c2f..d80b5f1 100644 --- a/foundry/packages/client/test/view-model.test.ts +++ b/foundry/packages/client/test/view-model.test.ts @@ -3,28 +3,40 @@ import type { TaskRecord } from "@sandbox-agent/foundry-shared"; import { filterTasks, formatRelativeAge, fuzzyMatch, summarizeTasks } from "../src/view-model.js"; const sample: TaskRecord = { - organizationId: "default", + workspaceId: "default", repoId: "repo-a", repoRemote: "https://example.com/repo-a.git", taskId: "task-1", branchName: "feature/test", title: "Test Title", task: "Do test", - sandboxProviderId: "local", + providerId: "daytona", status: "running", + statusMessage: null, activeSandboxId: "sandbox-1", - pullRequest: null, + activeSessionId: "session-1", sandboxes: [ { sandboxId: "sandbox-1", - sandboxProviderId: "local", + providerId: "daytona", sandboxActorId: null, - switchTarget: "sandbox://local/sandbox-1", + switchTarget: "daytona://sandbox-1", cwd: null, createdAt: 1, updatedAt: 1, }, ], + agentType: null, + prSubmitted: false, + diffStat: null, + prUrl: null, + prAuthor: null, + ciStatus: null, + reviewStatus: null, + reviewer: null, + conflictsWithMain: null, + hasUnpushed: null, + parentBranch: null, createdAt: 1, updatedAt: 1, }; @@ -47,7 +59,7 @@ describe("search helpers", () => { }, ]; expect(filterTasks(rows, "doc")).toHaveLength(1); - expect(filterTasks(rows, "intro")).toHaveLength(1); + expect(filterTasks(rows, "h2")).toHaveLength(1); expect(filterTasks(rows, "test")).toHaveLength(2); }); }); @@ -61,8 +73,8 @@ describe("summary helpers", () => { it("summarizes by status and provider", () => { const rows: TaskRecord[] = [ sample, - { ...sample, taskId: "task-2", status: "idle", sandboxProviderId: "local" }, - { ...sample, taskId: "task-3", status: "error", sandboxProviderId: "local" }, + { ...sample, taskId: "task-2", status: "idle", providerId: "daytona" }, + { ...sample, taskId: 
"task-3", status: "error", providerId: "daytona" }, ]; const summary = summarizeTasks(rows); @@ -70,6 +82,6 @@ describe("summary helpers", () => { expect(summary.byStatus.running).toBe(1); expect(summary.byStatus.idle).toBe(1); expect(summary.byStatus.error).toBe(1); - expect(summary.byProvider.local).toBe(3); + expect(summary.byProvider.daytona).toBe(3); }); }); diff --git a/foundry/packages/client/tsconfig.build.json b/foundry/packages/client/tsconfig.build.json deleted file mode 100644 index 35bcdb2..0000000 --- a/foundry/packages/client/tsconfig.build.json +++ /dev/null @@ -1,6 +0,0 @@ -{ - "extends": "./tsconfig.json", - "compilerOptions": { - "ignoreDeprecations": "6.0" - } -} diff --git a/foundry/packages/desktop/src-tauri/gen/schemas/acl-manifests.json b/foundry/packages/desktop/src-tauri/gen/schemas/acl-manifests.json index 6844932..86cdb1f 100644 --- a/foundry/packages/desktop/src-tauri/gen/schemas/acl-manifests.json +++ b/foundry/packages/desktop/src-tauri/gen/schemas/acl-manifests.json @@ -1,1922 +1 @@ -{ - "core": { - "default_permission": { - "identifier": "default", - "description": "Default core plugins set.", - "permissions": [ - "core:path:default", - "core:event:default", - "core:window:default", - "core:webview:default", - "core:app:default", - "core:image:default", - "core:resources:default", - "core:menu:default", - "core:tray:default" - ] - }, - "permissions": {}, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:app": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin.", - "permissions": [ - "allow-version", - "allow-name", - "allow-tauri-version", - "allow-identifier", - "allow-bundle-type", - "allow-register-listener", - "allow-remove-listener" - ] - }, - "permissions": { - "allow-app-hide": { - "identifier": "allow-app-hide", - "description": "Enables the app_hide command without any pre-configured scope.", - "commands": { "allow": ["app_hide"], "deny": [] } - }, - 
"allow-app-show": { - "identifier": "allow-app-show", - "description": "Enables the app_show command without any pre-configured scope.", - "commands": { "allow": ["app_show"], "deny": [] } - }, - "allow-bundle-type": { - "identifier": "allow-bundle-type", - "description": "Enables the bundle_type command without any pre-configured scope.", - "commands": { "allow": ["bundle_type"], "deny": [] } - }, - "allow-default-window-icon": { - "identifier": "allow-default-window-icon", - "description": "Enables the default_window_icon command without any pre-configured scope.", - "commands": { "allow": ["default_window_icon"], "deny": [] } - }, - "allow-fetch-data-store-identifiers": { - "identifier": "allow-fetch-data-store-identifiers", - "description": "Enables the fetch_data_store_identifiers command without any pre-configured scope.", - "commands": { "allow": ["fetch_data_store_identifiers"], "deny": [] } - }, - "allow-identifier": { - "identifier": "allow-identifier", - "description": "Enables the identifier command without any pre-configured scope.", - "commands": { "allow": ["identifier"], "deny": [] } - }, - "allow-name": { - "identifier": "allow-name", - "description": "Enables the name command without any pre-configured scope.", - "commands": { "allow": ["name"], "deny": [] } - }, - "allow-register-listener": { - "identifier": "allow-register-listener", - "description": "Enables the register_listener command without any pre-configured scope.", - "commands": { "allow": ["register_listener"], "deny": [] } - }, - "allow-remove-data-store": { - "identifier": "allow-remove-data-store", - "description": "Enables the remove_data_store command without any pre-configured scope.", - "commands": { "allow": ["remove_data_store"], "deny": [] } - }, - "allow-remove-listener": { - "identifier": "allow-remove-listener", - "description": "Enables the remove_listener command without any pre-configured scope.", - "commands": { "allow": ["remove_listener"], "deny": [] } - }, - 
"allow-set-app-theme": { - "identifier": "allow-set-app-theme", - "description": "Enables the set_app_theme command without any pre-configured scope.", - "commands": { "allow": ["set_app_theme"], "deny": [] } - }, - "allow-set-dock-visibility": { - "identifier": "allow-set-dock-visibility", - "description": "Enables the set_dock_visibility command without any pre-configured scope.", - "commands": { "allow": ["set_dock_visibility"], "deny": [] } - }, - "allow-tauri-version": { - "identifier": "allow-tauri-version", - "description": "Enables the tauri_version command without any pre-configured scope.", - "commands": { "allow": ["tauri_version"], "deny": [] } - }, - "allow-version": { - "identifier": "allow-version", - "description": "Enables the version command without any pre-configured scope.", - "commands": { "allow": ["version"], "deny": [] } - }, - "deny-app-hide": { - "identifier": "deny-app-hide", - "description": "Denies the app_hide command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["app_hide"] } - }, - "deny-app-show": { - "identifier": "deny-app-show", - "description": "Denies the app_show command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["app_show"] } - }, - "deny-bundle-type": { - "identifier": "deny-bundle-type", - "description": "Denies the bundle_type command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["bundle_type"] } - }, - "deny-default-window-icon": { - "identifier": "deny-default-window-icon", - "description": "Denies the default_window_icon command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["default_window_icon"] } - }, - "deny-fetch-data-store-identifiers": { - "identifier": "deny-fetch-data-store-identifiers", - "description": "Denies the fetch_data_store_identifiers command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["fetch_data_store_identifiers"] } - }, - "deny-identifier": { - 
"identifier": "deny-identifier", - "description": "Denies the identifier command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["identifier"] } - }, - "deny-name": { - "identifier": "deny-name", - "description": "Denies the name command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["name"] } - }, - "deny-register-listener": { - "identifier": "deny-register-listener", - "description": "Denies the register_listener command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["register_listener"] } - }, - "deny-remove-data-store": { - "identifier": "deny-remove-data-store", - "description": "Denies the remove_data_store command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["remove_data_store"] } - }, - "deny-remove-listener": { - "identifier": "deny-remove-listener", - "description": "Denies the remove_listener command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["remove_listener"] } - }, - "deny-set-app-theme": { - "identifier": "deny-set-app-theme", - "description": "Denies the set_app_theme command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_app_theme"] } - }, - "deny-set-dock-visibility": { - "identifier": "deny-set-dock-visibility", - "description": "Denies the set_dock_visibility command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_dock_visibility"] } - }, - "deny-tauri-version": { - "identifier": "deny-tauri-version", - "description": "Denies the tauri_version command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["tauri_version"] } - }, - "deny-version": { - "identifier": "deny-version", - "description": "Denies the version command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["version"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:event": { - "default_permission": { - 
"identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": ["allow-listen", "allow-unlisten", "allow-emit", "allow-emit-to"] - }, - "permissions": { - "allow-emit": { - "identifier": "allow-emit", - "description": "Enables the emit command without any pre-configured scope.", - "commands": { "allow": ["emit"], "deny": [] } - }, - "allow-emit-to": { - "identifier": "allow-emit-to", - "description": "Enables the emit_to command without any pre-configured scope.", - "commands": { "allow": ["emit_to"], "deny": [] } - }, - "allow-listen": { - "identifier": "allow-listen", - "description": "Enables the listen command without any pre-configured scope.", - "commands": { "allow": ["listen"], "deny": [] } - }, - "allow-unlisten": { - "identifier": "allow-unlisten", - "description": "Enables the unlisten command without any pre-configured scope.", - "commands": { "allow": ["unlisten"], "deny": [] } - }, - "deny-emit": { - "identifier": "deny-emit", - "description": "Denies the emit command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["emit"] } - }, - "deny-emit-to": { - "identifier": "deny-emit-to", - "description": "Denies the emit_to command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["emit_to"] } - }, - "deny-listen": { - "identifier": "deny-listen", - "description": "Denies the listen command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["listen"] } - }, - "deny-unlisten": { - "identifier": "deny-unlisten", - "description": "Denies the unlisten command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["unlisten"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:image": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": ["allow-new", "allow-from-bytes", 
"allow-from-path", "allow-rgba", "allow-size"] - }, - "permissions": { - "allow-from-bytes": { - "identifier": "allow-from-bytes", - "description": "Enables the from_bytes command without any pre-configured scope.", - "commands": { "allow": ["from_bytes"], "deny": [] } - }, - "allow-from-path": { - "identifier": "allow-from-path", - "description": "Enables the from_path command without any pre-configured scope.", - "commands": { "allow": ["from_path"], "deny": [] } - }, - "allow-new": { - "identifier": "allow-new", - "description": "Enables the new command without any pre-configured scope.", - "commands": { "allow": ["new"], "deny": [] } - }, - "allow-rgba": { - "identifier": "allow-rgba", - "description": "Enables the rgba command without any pre-configured scope.", - "commands": { "allow": ["rgba"], "deny": [] } - }, - "allow-size": { - "identifier": "allow-size", - "description": "Enables the size command without any pre-configured scope.", - "commands": { "allow": ["size"], "deny": [] } - }, - "deny-from-bytes": { - "identifier": "deny-from-bytes", - "description": "Denies the from_bytes command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["from_bytes"] } - }, - "deny-from-path": { - "identifier": "deny-from-path", - "description": "Denies the from_path command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["from_path"] } - }, - "deny-new": { - "identifier": "deny-new", - "description": "Denies the new command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["new"] } - }, - "deny-rgba": { - "identifier": "deny-rgba", - "description": "Denies the rgba command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["rgba"] } - }, - "deny-size": { - "identifier": "deny-size", - "description": "Denies the size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["size"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, 
- "core:menu": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": [ - "allow-new", - "allow-append", - "allow-prepend", - "allow-insert", - "allow-remove", - "allow-remove-at", - "allow-items", - "allow-get", - "allow-popup", - "allow-create-default", - "allow-set-as-app-menu", - "allow-set-as-window-menu", - "allow-text", - "allow-set-text", - "allow-is-enabled", - "allow-set-enabled", - "allow-set-accelerator", - "allow-set-as-windows-menu-for-nsapp", - "allow-set-as-help-menu-for-nsapp", - "allow-is-checked", - "allow-set-checked", - "allow-set-icon" - ] - }, - "permissions": { - "allow-append": { - "identifier": "allow-append", - "description": "Enables the append command without any pre-configured scope.", - "commands": { "allow": ["append"], "deny": [] } - }, - "allow-create-default": { - "identifier": "allow-create-default", - "description": "Enables the create_default command without any pre-configured scope.", - "commands": { "allow": ["create_default"], "deny": [] } - }, - "allow-get": { - "identifier": "allow-get", - "description": "Enables the get command without any pre-configured scope.", - "commands": { "allow": ["get"], "deny": [] } - }, - "allow-insert": { - "identifier": "allow-insert", - "description": "Enables the insert command without any pre-configured scope.", - "commands": { "allow": ["insert"], "deny": [] } - }, - "allow-is-checked": { - "identifier": "allow-is-checked", - "description": "Enables the is_checked command without any pre-configured scope.", - "commands": { "allow": ["is_checked"], "deny": [] } - }, - "allow-is-enabled": { - "identifier": "allow-is-enabled", - "description": "Enables the is_enabled command without any pre-configured scope.", - "commands": { "allow": ["is_enabled"], "deny": [] } - }, - "allow-items": { - "identifier": "allow-items", - "description": "Enables the items command without any pre-configured 
scope.", - "commands": { "allow": ["items"], "deny": [] } - }, - "allow-new": { - "identifier": "allow-new", - "description": "Enables the new command without any pre-configured scope.", - "commands": { "allow": ["new"], "deny": [] } - }, - "allow-popup": { - "identifier": "allow-popup", - "description": "Enables the popup command without any pre-configured scope.", - "commands": { "allow": ["popup"], "deny": [] } - }, - "allow-prepend": { - "identifier": "allow-prepend", - "description": "Enables the prepend command without any pre-configured scope.", - "commands": { "allow": ["prepend"], "deny": [] } - }, - "allow-remove": { - "identifier": "allow-remove", - "description": "Enables the remove command without any pre-configured scope.", - "commands": { "allow": ["remove"], "deny": [] } - }, - "allow-remove-at": { - "identifier": "allow-remove-at", - "description": "Enables the remove_at command without any pre-configured scope.", - "commands": { "allow": ["remove_at"], "deny": [] } - }, - "allow-set-accelerator": { - "identifier": "allow-set-accelerator", - "description": "Enables the set_accelerator command without any pre-configured scope.", - "commands": { "allow": ["set_accelerator"], "deny": [] } - }, - "allow-set-as-app-menu": { - "identifier": "allow-set-as-app-menu", - "description": "Enables the set_as_app_menu command without any pre-configured scope.", - "commands": { "allow": ["set_as_app_menu"], "deny": [] } - }, - "allow-set-as-help-menu-for-nsapp": { - "identifier": "allow-set-as-help-menu-for-nsapp", - "description": "Enables the set_as_help_menu_for_nsapp command without any pre-configured scope.", - "commands": { "allow": ["set_as_help_menu_for_nsapp"], "deny": [] } - }, - "allow-set-as-window-menu": { - "identifier": "allow-set-as-window-menu", - "description": "Enables the set_as_window_menu command without any pre-configured scope.", - "commands": { "allow": ["set_as_window_menu"], "deny": [] } - }, - "allow-set-as-windows-menu-for-nsapp": { - 
"identifier": "allow-set-as-windows-menu-for-nsapp", - "description": "Enables the set_as_windows_menu_for_nsapp command without any pre-configured scope.", - "commands": { "allow": ["set_as_windows_menu_for_nsapp"], "deny": [] } - }, - "allow-set-checked": { - "identifier": "allow-set-checked", - "description": "Enables the set_checked command without any pre-configured scope.", - "commands": { "allow": ["set_checked"], "deny": [] } - }, - "allow-set-enabled": { - "identifier": "allow-set-enabled", - "description": "Enables the set_enabled command without any pre-configured scope.", - "commands": { "allow": ["set_enabled"], "deny": [] } - }, - "allow-set-icon": { - "identifier": "allow-set-icon", - "description": "Enables the set_icon command without any pre-configured scope.", - "commands": { "allow": ["set_icon"], "deny": [] } - }, - "allow-set-text": { - "identifier": "allow-set-text", - "description": "Enables the set_text command without any pre-configured scope.", - "commands": { "allow": ["set_text"], "deny": [] } - }, - "allow-text": { - "identifier": "allow-text", - "description": "Enables the text command without any pre-configured scope.", - "commands": { "allow": ["text"], "deny": [] } - }, - "deny-append": { - "identifier": "deny-append", - "description": "Denies the append command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["append"] } - }, - "deny-create-default": { - "identifier": "deny-create-default", - "description": "Denies the create_default command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["create_default"] } - }, - "deny-get": { - "identifier": "deny-get", - "description": "Denies the get command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["get"] } - }, - "deny-insert": { - "identifier": "deny-insert", - "description": "Denies the insert command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["insert"] } - }, - 
"deny-is-checked": { - "identifier": "deny-is-checked", - "description": "Denies the is_checked command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_checked"] } - }, - "deny-is-enabled": { - "identifier": "deny-is-enabled", - "description": "Denies the is_enabled command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_enabled"] } - }, - "deny-items": { - "identifier": "deny-items", - "description": "Denies the items command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["items"] } - }, - "deny-new": { - "identifier": "deny-new", - "description": "Denies the new command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["new"] } - }, - "deny-popup": { - "identifier": "deny-popup", - "description": "Denies the popup command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["popup"] } - }, - "deny-prepend": { - "identifier": "deny-prepend", - "description": "Denies the prepend command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["prepend"] } - }, - "deny-remove": { - "identifier": "deny-remove", - "description": "Denies the remove command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["remove"] } - }, - "deny-remove-at": { - "identifier": "deny-remove-at", - "description": "Denies the remove_at command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["remove_at"] } - }, - "deny-set-accelerator": { - "identifier": "deny-set-accelerator", - "description": "Denies the set_accelerator command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_accelerator"] } - }, - "deny-set-as-app-menu": { - "identifier": "deny-set-as-app-menu", - "description": "Denies the set_as_app_menu command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_as_app_menu"] } - }, - "deny-set-as-help-menu-for-nsapp": { - 
"identifier": "deny-set-as-help-menu-for-nsapp", - "description": "Denies the set_as_help_menu_for_nsapp command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_as_help_menu_for_nsapp"] } - }, - "deny-set-as-window-menu": { - "identifier": "deny-set-as-window-menu", - "description": "Denies the set_as_window_menu command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_as_window_menu"] } - }, - "deny-set-as-windows-menu-for-nsapp": { - "identifier": "deny-set-as-windows-menu-for-nsapp", - "description": "Denies the set_as_windows_menu_for_nsapp command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_as_windows_menu_for_nsapp"] } - }, - "deny-set-checked": { - "identifier": "deny-set-checked", - "description": "Denies the set_checked command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_checked"] } - }, - "deny-set-enabled": { - "identifier": "deny-set-enabled", - "description": "Denies the set_enabled command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_enabled"] } - }, - "deny-set-icon": { - "identifier": "deny-set-icon", - "description": "Denies the set_icon command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_icon"] } - }, - "deny-set-text": { - "identifier": "deny-set-text", - "description": "Denies the set_text command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_text"] } - }, - "deny-text": { - "identifier": "deny-text", - "description": "Denies the text command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["text"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:path": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": [ - "allow-resolve-directory", - "allow-resolve", - 
"allow-normalize", - "allow-join", - "allow-dirname", - "allow-extname", - "allow-basename", - "allow-is-absolute" - ] - }, - "permissions": { - "allow-basename": { - "identifier": "allow-basename", - "description": "Enables the basename command without any pre-configured scope.", - "commands": { "allow": ["basename"], "deny": [] } - }, - "allow-dirname": { - "identifier": "allow-dirname", - "description": "Enables the dirname command without any pre-configured scope.", - "commands": { "allow": ["dirname"], "deny": [] } - }, - "allow-extname": { - "identifier": "allow-extname", - "description": "Enables the extname command without any pre-configured scope.", - "commands": { "allow": ["extname"], "deny": [] } - }, - "allow-is-absolute": { - "identifier": "allow-is-absolute", - "description": "Enables the is_absolute command without any pre-configured scope.", - "commands": { "allow": ["is_absolute"], "deny": [] } - }, - "allow-join": { - "identifier": "allow-join", - "description": "Enables the join command without any pre-configured scope.", - "commands": { "allow": ["join"], "deny": [] } - }, - "allow-normalize": { - "identifier": "allow-normalize", - "description": "Enables the normalize command without any pre-configured scope.", - "commands": { "allow": ["normalize"], "deny": [] } - }, - "allow-resolve": { - "identifier": "allow-resolve", - "description": "Enables the resolve command without any pre-configured scope.", - "commands": { "allow": ["resolve"], "deny": [] } - }, - "allow-resolve-directory": { - "identifier": "allow-resolve-directory", - "description": "Enables the resolve_directory command without any pre-configured scope.", - "commands": { "allow": ["resolve_directory"], "deny": [] } - }, - "deny-basename": { - "identifier": "deny-basename", - "description": "Denies the basename command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["basename"] } - }, - "deny-dirname": { - "identifier": "deny-dirname", - "description": 
"Denies the dirname command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["dirname"] } - }, - "deny-extname": { - "identifier": "deny-extname", - "description": "Denies the extname command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["extname"] } - }, - "deny-is-absolute": { - "identifier": "deny-is-absolute", - "description": "Denies the is_absolute command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_absolute"] } - }, - "deny-join": { - "identifier": "deny-join", - "description": "Denies the join command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["join"] } - }, - "deny-normalize": { - "identifier": "deny-normalize", - "description": "Denies the normalize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["normalize"] } - }, - "deny-resolve": { - "identifier": "deny-resolve", - "description": "Denies the resolve command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["resolve"] } - }, - "deny-resolve-directory": { - "identifier": "deny-resolve-directory", - "description": "Denies the resolve_directory command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["resolve_directory"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:resources": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": ["allow-close"] - }, - "permissions": { - "allow-close": { - "identifier": "allow-close", - "description": "Enables the close command without any pre-configured scope.", - "commands": { "allow": ["close"], "deny": [] } - }, - "deny-close": { - "identifier": "deny-close", - "description": "Denies the close command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["close"] } - } - }, - "permission_sets": {}, - 
"global_scope_schema": null - }, - "core:tray": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin, which enables all commands.", - "permissions": [ - "allow-new", - "allow-get-by-id", - "allow-remove-by-id", - "allow-set-icon", - "allow-set-menu", - "allow-set-tooltip", - "allow-set-title", - "allow-set-visible", - "allow-set-temp-dir-path", - "allow-set-icon-as-template", - "allow-set-show-menu-on-left-click" - ] - }, - "permissions": { - "allow-get-by-id": { - "identifier": "allow-get-by-id", - "description": "Enables the get_by_id command without any pre-configured scope.", - "commands": { "allow": ["get_by_id"], "deny": [] } - }, - "allow-new": { - "identifier": "allow-new", - "description": "Enables the new command without any pre-configured scope.", - "commands": { "allow": ["new"], "deny": [] } - }, - "allow-remove-by-id": { - "identifier": "allow-remove-by-id", - "description": "Enables the remove_by_id command without any pre-configured scope.", - "commands": { "allow": ["remove_by_id"], "deny": [] } - }, - "allow-set-icon": { - "identifier": "allow-set-icon", - "description": "Enables the set_icon command without any pre-configured scope.", - "commands": { "allow": ["set_icon"], "deny": [] } - }, - "allow-set-icon-as-template": { - "identifier": "allow-set-icon-as-template", - "description": "Enables the set_icon_as_template command without any pre-configured scope.", - "commands": { "allow": ["set_icon_as_template"], "deny": [] } - }, - "allow-set-menu": { - "identifier": "allow-set-menu", - "description": "Enables the set_menu command without any pre-configured scope.", - "commands": { "allow": ["set_menu"], "deny": [] } - }, - "allow-set-show-menu-on-left-click": { - "identifier": "allow-set-show-menu-on-left-click", - "description": "Enables the set_show_menu_on_left_click command without any pre-configured scope.", - "commands": { "allow": ["set_show_menu_on_left_click"], "deny": [] } - }, - 
"allow-set-temp-dir-path": { - "identifier": "allow-set-temp-dir-path", - "description": "Enables the set_temp_dir_path command without any pre-configured scope.", - "commands": { "allow": ["set_temp_dir_path"], "deny": [] } - }, - "allow-set-title": { - "identifier": "allow-set-title", - "description": "Enables the set_title command without any pre-configured scope.", - "commands": { "allow": ["set_title"], "deny": [] } - }, - "allow-set-tooltip": { - "identifier": "allow-set-tooltip", - "description": "Enables the set_tooltip command without any pre-configured scope.", - "commands": { "allow": ["set_tooltip"], "deny": [] } - }, - "allow-set-visible": { - "identifier": "allow-set-visible", - "description": "Enables the set_visible command without any pre-configured scope.", - "commands": { "allow": ["set_visible"], "deny": [] } - }, - "deny-get-by-id": { - "identifier": "deny-get-by-id", - "description": "Denies the get_by_id command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["get_by_id"] } - }, - "deny-new": { - "identifier": "deny-new", - "description": "Denies the new command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["new"] } - }, - "deny-remove-by-id": { - "identifier": "deny-remove-by-id", - "description": "Denies the remove_by_id command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["remove_by_id"] } - }, - "deny-set-icon": { - "identifier": "deny-set-icon", - "description": "Denies the set_icon command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_icon"] } - }, - "deny-set-icon-as-template": { - "identifier": "deny-set-icon-as-template", - "description": "Denies the set_icon_as_template command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_icon_as_template"] } - }, - "deny-set-menu": { - "identifier": "deny-set-menu", - "description": "Denies the set_menu command without any pre-configured scope.", - 
"commands": { "allow": [], "deny": ["set_menu"] } - }, - "deny-set-show-menu-on-left-click": { - "identifier": "deny-set-show-menu-on-left-click", - "description": "Denies the set_show_menu_on_left_click command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_show_menu_on_left_click"] } - }, - "deny-set-temp-dir-path": { - "identifier": "deny-set-temp-dir-path", - "description": "Denies the set_temp_dir_path command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_temp_dir_path"] } - }, - "deny-set-title": { - "identifier": "deny-set-title", - "description": "Denies the set_title command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_title"] } - }, - "deny-set-tooltip": { - "identifier": "deny-set-tooltip", - "description": "Denies the set_tooltip command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_tooltip"] } - }, - "deny-set-visible": { - "identifier": "deny-set-visible", - "description": "Denies the set_visible command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_visible"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:webview": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin.", - "permissions": ["allow-get-all-webviews", "allow-webview-position", "allow-webview-size", "allow-internal-toggle-devtools"] - }, - "permissions": { - "allow-clear-all-browsing-data": { - "identifier": "allow-clear-all-browsing-data", - "description": "Enables the clear_all_browsing_data command without any pre-configured scope.", - "commands": { "allow": ["clear_all_browsing_data"], "deny": [] } - }, - "allow-create-webview": { - "identifier": "allow-create-webview", - "description": "Enables the create_webview command without any pre-configured scope.", - "commands": { "allow": ["create_webview"], "deny": [] } - }, - 
"allow-create-webview-window": { - "identifier": "allow-create-webview-window", - "description": "Enables the create_webview_window command without any pre-configured scope.", - "commands": { "allow": ["create_webview_window"], "deny": [] } - }, - "allow-get-all-webviews": { - "identifier": "allow-get-all-webviews", - "description": "Enables the get_all_webviews command without any pre-configured scope.", - "commands": { "allow": ["get_all_webviews"], "deny": [] } - }, - "allow-internal-toggle-devtools": { - "identifier": "allow-internal-toggle-devtools", - "description": "Enables the internal_toggle_devtools command without any pre-configured scope.", - "commands": { "allow": ["internal_toggle_devtools"], "deny": [] } - }, - "allow-print": { - "identifier": "allow-print", - "description": "Enables the print command without any pre-configured scope.", - "commands": { "allow": ["print"], "deny": [] } - }, - "allow-reparent": { - "identifier": "allow-reparent", - "description": "Enables the reparent command without any pre-configured scope.", - "commands": { "allow": ["reparent"], "deny": [] } - }, - "allow-set-webview-auto-resize": { - "identifier": "allow-set-webview-auto-resize", - "description": "Enables the set_webview_auto_resize command without any pre-configured scope.", - "commands": { "allow": ["set_webview_auto_resize"], "deny": [] } - }, - "allow-set-webview-background-color": { - "identifier": "allow-set-webview-background-color", - "description": "Enables the set_webview_background_color command without any pre-configured scope.", - "commands": { "allow": ["set_webview_background_color"], "deny": [] } - }, - "allow-set-webview-focus": { - "identifier": "allow-set-webview-focus", - "description": "Enables the set_webview_focus command without any pre-configured scope.", - "commands": { "allow": ["set_webview_focus"], "deny": [] } - }, - "allow-set-webview-position": { - "identifier": "allow-set-webview-position", - "description": "Enables the 
set_webview_position command without any pre-configured scope.", - "commands": { "allow": ["set_webview_position"], "deny": [] } - }, - "allow-set-webview-size": { - "identifier": "allow-set-webview-size", - "description": "Enables the set_webview_size command without any pre-configured scope.", - "commands": { "allow": ["set_webview_size"], "deny": [] } - }, - "allow-set-webview-zoom": { - "identifier": "allow-set-webview-zoom", - "description": "Enables the set_webview_zoom command without any pre-configured scope.", - "commands": { "allow": ["set_webview_zoom"], "deny": [] } - }, - "allow-webview-close": { - "identifier": "allow-webview-close", - "description": "Enables the webview_close command without any pre-configured scope.", - "commands": { "allow": ["webview_close"], "deny": [] } - }, - "allow-webview-hide": { - "identifier": "allow-webview-hide", - "description": "Enables the webview_hide command without any pre-configured scope.", - "commands": { "allow": ["webview_hide"], "deny": [] } - }, - "allow-webview-position": { - "identifier": "allow-webview-position", - "description": "Enables the webview_position command without any pre-configured scope.", - "commands": { "allow": ["webview_position"], "deny": [] } - }, - "allow-webview-show": { - "identifier": "allow-webview-show", - "description": "Enables the webview_show command without any pre-configured scope.", - "commands": { "allow": ["webview_show"], "deny": [] } - }, - "allow-webview-size": { - "identifier": "allow-webview-size", - "description": "Enables the webview_size command without any pre-configured scope.", - "commands": { "allow": ["webview_size"], "deny": [] } - }, - "deny-clear-all-browsing-data": { - "identifier": "deny-clear-all-browsing-data", - "description": "Denies the clear_all_browsing_data command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["clear_all_browsing_data"] } - }, - "deny-create-webview": { - "identifier": "deny-create-webview", - 
"description": "Denies the create_webview command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["create_webview"] } - }, - "deny-create-webview-window": { - "identifier": "deny-create-webview-window", - "description": "Denies the create_webview_window command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["create_webview_window"] } - }, - "deny-get-all-webviews": { - "identifier": "deny-get-all-webviews", - "description": "Denies the get_all_webviews command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["get_all_webviews"] } - }, - "deny-internal-toggle-devtools": { - "identifier": "deny-internal-toggle-devtools", - "description": "Denies the internal_toggle_devtools command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["internal_toggle_devtools"] } - }, - "deny-print": { - "identifier": "deny-print", - "description": "Denies the print command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["print"] } - }, - "deny-reparent": { - "identifier": "deny-reparent", - "description": "Denies the reparent command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["reparent"] } - }, - "deny-set-webview-auto-resize": { - "identifier": "deny-set-webview-auto-resize", - "description": "Denies the set_webview_auto_resize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_auto_resize"] } - }, - "deny-set-webview-background-color": { - "identifier": "deny-set-webview-background-color", - "description": "Denies the set_webview_background_color command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_background_color"] } - }, - "deny-set-webview-focus": { - "identifier": "deny-set-webview-focus", - "description": "Denies the set_webview_focus command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_focus"] } 
- }, - "deny-set-webview-position": { - "identifier": "deny-set-webview-position", - "description": "Denies the set_webview_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_position"] } - }, - "deny-set-webview-size": { - "identifier": "deny-set-webview-size", - "description": "Denies the set_webview_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_size"] } - }, - "deny-set-webview-zoom": { - "identifier": "deny-set-webview-zoom", - "description": "Denies the set_webview_zoom command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_webview_zoom"] } - }, - "deny-webview-close": { - "identifier": "deny-webview-close", - "description": "Denies the webview_close command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["webview_close"] } - }, - "deny-webview-hide": { - "identifier": "deny-webview-hide", - "description": "Denies the webview_hide command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["webview_hide"] } - }, - "deny-webview-position": { - "identifier": "deny-webview-position", - "description": "Denies the webview_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["webview_position"] } - }, - "deny-webview-show": { - "identifier": "deny-webview-show", - "description": "Denies the webview_show command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["webview_show"] } - }, - "deny-webview-size": { - "identifier": "deny-webview-size", - "description": "Denies the webview_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["webview_size"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "core:window": { - "default_permission": { - "identifier": "default", - "description": "Default permissions for the plugin.", - "permissions": [ - "allow-get-all-windows", 
- "allow-scale-factor", - "allow-inner-position", - "allow-outer-position", - "allow-inner-size", - "allow-outer-size", - "allow-is-fullscreen", - "allow-is-minimized", - "allow-is-maximized", - "allow-is-focused", - "allow-is-decorated", - "allow-is-resizable", - "allow-is-maximizable", - "allow-is-minimizable", - "allow-is-closable", - "allow-is-visible", - "allow-is-enabled", - "allow-title", - "allow-current-monitor", - "allow-primary-monitor", - "allow-monitor-from-point", - "allow-available-monitors", - "allow-cursor-position", - "allow-theme", - "allow-is-always-on-top", - "allow-internal-toggle-maximize" - ] - }, - "permissions": { - "allow-available-monitors": { - "identifier": "allow-available-monitors", - "description": "Enables the available_monitors command without any pre-configured scope.", - "commands": { "allow": ["available_monitors"], "deny": [] } - }, - "allow-center": { - "identifier": "allow-center", - "description": "Enables the center command without any pre-configured scope.", - "commands": { "allow": ["center"], "deny": [] } - }, - "allow-close": { - "identifier": "allow-close", - "description": "Enables the close command without any pre-configured scope.", - "commands": { "allow": ["close"], "deny": [] } - }, - "allow-create": { - "identifier": "allow-create", - "description": "Enables the create command without any pre-configured scope.", - "commands": { "allow": ["create"], "deny": [] } - }, - "allow-current-monitor": { - "identifier": "allow-current-monitor", - "description": "Enables the current_monitor command without any pre-configured scope.", - "commands": { "allow": ["current_monitor"], "deny": [] } - }, - "allow-cursor-position": { - "identifier": "allow-cursor-position", - "description": "Enables the cursor_position command without any pre-configured scope.", - "commands": { "allow": ["cursor_position"], "deny": [] } - }, - "allow-destroy": { - "identifier": "allow-destroy", - "description": "Enables the destroy command without 
any pre-configured scope.", - "commands": { "allow": ["destroy"], "deny": [] } - }, - "allow-get-all-windows": { - "identifier": "allow-get-all-windows", - "description": "Enables the get_all_windows command without any pre-configured scope.", - "commands": { "allow": ["get_all_windows"], "deny": [] } - }, - "allow-hide": { - "identifier": "allow-hide", - "description": "Enables the hide command without any pre-configured scope.", - "commands": { "allow": ["hide"], "deny": [] } - }, - "allow-inner-position": { - "identifier": "allow-inner-position", - "description": "Enables the inner_position command without any pre-configured scope.", - "commands": { "allow": ["inner_position"], "deny": [] } - }, - "allow-inner-size": { - "identifier": "allow-inner-size", - "description": "Enables the inner_size command without any pre-configured scope.", - "commands": { "allow": ["inner_size"], "deny": [] } - }, - "allow-internal-toggle-maximize": { - "identifier": "allow-internal-toggle-maximize", - "description": "Enables the internal_toggle_maximize command without any pre-configured scope.", - "commands": { "allow": ["internal_toggle_maximize"], "deny": [] } - }, - "allow-is-always-on-top": { - "identifier": "allow-is-always-on-top", - "description": "Enables the is_always_on_top command without any pre-configured scope.", - "commands": { "allow": ["is_always_on_top"], "deny": [] } - }, - "allow-is-closable": { - "identifier": "allow-is-closable", - "description": "Enables the is_closable command without any pre-configured scope.", - "commands": { "allow": ["is_closable"], "deny": [] } - }, - "allow-is-decorated": { - "identifier": "allow-is-decorated", - "description": "Enables the is_decorated command without any pre-configured scope.", - "commands": { "allow": ["is_decorated"], "deny": [] } - }, - "allow-is-enabled": { - "identifier": "allow-is-enabled", - "description": "Enables the is_enabled command without any pre-configured scope.", - "commands": { "allow": 
["is_enabled"], "deny": [] } - }, - "allow-is-focused": { - "identifier": "allow-is-focused", - "description": "Enables the is_focused command without any pre-configured scope.", - "commands": { "allow": ["is_focused"], "deny": [] } - }, - "allow-is-fullscreen": { - "identifier": "allow-is-fullscreen", - "description": "Enables the is_fullscreen command without any pre-configured scope.", - "commands": { "allow": ["is_fullscreen"], "deny": [] } - }, - "allow-is-maximizable": { - "identifier": "allow-is-maximizable", - "description": "Enables the is_maximizable command without any pre-configured scope.", - "commands": { "allow": ["is_maximizable"], "deny": [] } - }, - "allow-is-maximized": { - "identifier": "allow-is-maximized", - "description": "Enables the is_maximized command without any pre-configured scope.", - "commands": { "allow": ["is_maximized"], "deny": [] } - }, - "allow-is-minimizable": { - "identifier": "allow-is-minimizable", - "description": "Enables the is_minimizable command without any pre-configured scope.", - "commands": { "allow": ["is_minimizable"], "deny": [] } - }, - "allow-is-minimized": { - "identifier": "allow-is-minimized", - "description": "Enables the is_minimized command without any pre-configured scope.", - "commands": { "allow": ["is_minimized"], "deny": [] } - }, - "allow-is-resizable": { - "identifier": "allow-is-resizable", - "description": "Enables the is_resizable command without any pre-configured scope.", - "commands": { "allow": ["is_resizable"], "deny": [] } - }, - "allow-is-visible": { - "identifier": "allow-is-visible", - "description": "Enables the is_visible command without any pre-configured scope.", - "commands": { "allow": ["is_visible"], "deny": [] } - }, - "allow-maximize": { - "identifier": "allow-maximize", - "description": "Enables the maximize command without any pre-configured scope.", - "commands": { "allow": ["maximize"], "deny": [] } - }, - "allow-minimize": { - "identifier": "allow-minimize", - 
"description": "Enables the minimize command without any pre-configured scope.", - "commands": { "allow": ["minimize"], "deny": [] } - }, - "allow-monitor-from-point": { - "identifier": "allow-monitor-from-point", - "description": "Enables the monitor_from_point command without any pre-configured scope.", - "commands": { "allow": ["monitor_from_point"], "deny": [] } - }, - "allow-outer-position": { - "identifier": "allow-outer-position", - "description": "Enables the outer_position command without any pre-configured scope.", - "commands": { "allow": ["outer_position"], "deny": [] } - }, - "allow-outer-size": { - "identifier": "allow-outer-size", - "description": "Enables the outer_size command without any pre-configured scope.", - "commands": { "allow": ["outer_size"], "deny": [] } - }, - "allow-primary-monitor": { - "identifier": "allow-primary-monitor", - "description": "Enables the primary_monitor command without any pre-configured scope.", - "commands": { "allow": ["primary_monitor"], "deny": [] } - }, - "allow-request-user-attention": { - "identifier": "allow-request-user-attention", - "description": "Enables the request_user_attention command without any pre-configured scope.", - "commands": { "allow": ["request_user_attention"], "deny": [] } - }, - "allow-scale-factor": { - "identifier": "allow-scale-factor", - "description": "Enables the scale_factor command without any pre-configured scope.", - "commands": { "allow": ["scale_factor"], "deny": [] } - }, - "allow-set-always-on-bottom": { - "identifier": "allow-set-always-on-bottom", - "description": "Enables the set_always_on_bottom command without any pre-configured scope.", - "commands": { "allow": ["set_always_on_bottom"], "deny": [] } - }, - "allow-set-always-on-top": { - "identifier": "allow-set-always-on-top", - "description": "Enables the set_always_on_top command without any pre-configured scope.", - "commands": { "allow": ["set_always_on_top"], "deny": [] } - }, - "allow-set-background-color": { - 
"identifier": "allow-set-background-color", - "description": "Enables the set_background_color command without any pre-configured scope.", - "commands": { "allow": ["set_background_color"], "deny": [] } - }, - "allow-set-badge-count": { - "identifier": "allow-set-badge-count", - "description": "Enables the set_badge_count command without any pre-configured scope.", - "commands": { "allow": ["set_badge_count"], "deny": [] } - }, - "allow-set-badge-label": { - "identifier": "allow-set-badge-label", - "description": "Enables the set_badge_label command without any pre-configured scope.", - "commands": { "allow": ["set_badge_label"], "deny": [] } - }, - "allow-set-closable": { - "identifier": "allow-set-closable", - "description": "Enables the set_closable command without any pre-configured scope.", - "commands": { "allow": ["set_closable"], "deny": [] } - }, - "allow-set-content-protected": { - "identifier": "allow-set-content-protected", - "description": "Enables the set_content_protected command without any pre-configured scope.", - "commands": { "allow": ["set_content_protected"], "deny": [] } - }, - "allow-set-cursor-grab": { - "identifier": "allow-set-cursor-grab", - "description": "Enables the set_cursor_grab command without any pre-configured scope.", - "commands": { "allow": ["set_cursor_grab"], "deny": [] } - }, - "allow-set-cursor-icon": { - "identifier": "allow-set-cursor-icon", - "description": "Enables the set_cursor_icon command without any pre-configured scope.", - "commands": { "allow": ["set_cursor_icon"], "deny": [] } - }, - "allow-set-cursor-position": { - "identifier": "allow-set-cursor-position", - "description": "Enables the set_cursor_position command without any pre-configured scope.", - "commands": { "allow": ["set_cursor_position"], "deny": [] } - }, - "allow-set-cursor-visible": { - "identifier": "allow-set-cursor-visible", - "description": "Enables the set_cursor_visible command without any pre-configured scope.", - "commands": { "allow": 
["set_cursor_visible"], "deny": [] } - }, - "allow-set-decorations": { - "identifier": "allow-set-decorations", - "description": "Enables the set_decorations command without any pre-configured scope.", - "commands": { "allow": ["set_decorations"], "deny": [] } - }, - "allow-set-effects": { - "identifier": "allow-set-effects", - "description": "Enables the set_effects command without any pre-configured scope.", - "commands": { "allow": ["set_effects"], "deny": [] } - }, - "allow-set-enabled": { - "identifier": "allow-set-enabled", - "description": "Enables the set_enabled command without any pre-configured scope.", - "commands": { "allow": ["set_enabled"], "deny": [] } - }, - "allow-set-focus": { - "identifier": "allow-set-focus", - "description": "Enables the set_focus command without any pre-configured scope.", - "commands": { "allow": ["set_focus"], "deny": [] } - }, - "allow-set-focusable": { - "identifier": "allow-set-focusable", - "description": "Enables the set_focusable command without any pre-configured scope.", - "commands": { "allow": ["set_focusable"], "deny": [] } - }, - "allow-set-fullscreen": { - "identifier": "allow-set-fullscreen", - "description": "Enables the set_fullscreen command without any pre-configured scope.", - "commands": { "allow": ["set_fullscreen"], "deny": [] } - }, - "allow-set-icon": { - "identifier": "allow-set-icon", - "description": "Enables the set_icon command without any pre-configured scope.", - "commands": { "allow": ["set_icon"], "deny": [] } - }, - "allow-set-ignore-cursor-events": { - "identifier": "allow-set-ignore-cursor-events", - "description": "Enables the set_ignore_cursor_events command without any pre-configured scope.", - "commands": { "allow": ["set_ignore_cursor_events"], "deny": [] } - }, - "allow-set-max-size": { - "identifier": "allow-set-max-size", - "description": "Enables the set_max_size command without any pre-configured scope.", - "commands": { "allow": ["set_max_size"], "deny": [] } - }, - 
"allow-set-maximizable": { - "identifier": "allow-set-maximizable", - "description": "Enables the set_maximizable command without any pre-configured scope.", - "commands": { "allow": ["set_maximizable"], "deny": [] } - }, - "allow-set-min-size": { - "identifier": "allow-set-min-size", - "description": "Enables the set_min_size command without any pre-configured scope.", - "commands": { "allow": ["set_min_size"], "deny": [] } - }, - "allow-set-minimizable": { - "identifier": "allow-set-minimizable", - "description": "Enables the set_minimizable command without any pre-configured scope.", - "commands": { "allow": ["set_minimizable"], "deny": [] } - }, - "allow-set-overlay-icon": { - "identifier": "allow-set-overlay-icon", - "description": "Enables the set_overlay_icon command without any pre-configured scope.", - "commands": { "allow": ["set_overlay_icon"], "deny": [] } - }, - "allow-set-position": { - "identifier": "allow-set-position", - "description": "Enables the set_position command without any pre-configured scope.", - "commands": { "allow": ["set_position"], "deny": [] } - }, - "allow-set-progress-bar": { - "identifier": "allow-set-progress-bar", - "description": "Enables the set_progress_bar command without any pre-configured scope.", - "commands": { "allow": ["set_progress_bar"], "deny": [] } - }, - "allow-set-resizable": { - "identifier": "allow-set-resizable", - "description": "Enables the set_resizable command without any pre-configured scope.", - "commands": { "allow": ["set_resizable"], "deny": [] } - }, - "allow-set-shadow": { - "identifier": "allow-set-shadow", - "description": "Enables the set_shadow command without any pre-configured scope.", - "commands": { "allow": ["set_shadow"], "deny": [] } - }, - "allow-set-simple-fullscreen": { - "identifier": "allow-set-simple-fullscreen", - "description": "Enables the set_simple_fullscreen command without any pre-configured scope.", - "commands": { "allow": ["set_simple_fullscreen"], "deny": [] } - }, - 
"allow-set-size": { - "identifier": "allow-set-size", - "description": "Enables the set_size command without any pre-configured scope.", - "commands": { "allow": ["set_size"], "deny": [] } - }, - "allow-set-size-constraints": { - "identifier": "allow-set-size-constraints", - "description": "Enables the set_size_constraints command without any pre-configured scope.", - "commands": { "allow": ["set_size_constraints"], "deny": [] } - }, - "allow-set-skip-taskbar": { - "identifier": "allow-set-skip-taskbar", - "description": "Enables the set_skip_taskbar command without any pre-configured scope.", - "commands": { "allow": ["set_skip_taskbar"], "deny": [] } - }, - "allow-set-theme": { - "identifier": "allow-set-theme", - "description": "Enables the set_theme command without any pre-configured scope.", - "commands": { "allow": ["set_theme"], "deny": [] } - }, - "allow-set-title": { - "identifier": "allow-set-title", - "description": "Enables the set_title command without any pre-configured scope.", - "commands": { "allow": ["set_title"], "deny": [] } - }, - "allow-set-title-bar-style": { - "identifier": "allow-set-title-bar-style", - "description": "Enables the set_title_bar_style command without any pre-configured scope.", - "commands": { "allow": ["set_title_bar_style"], "deny": [] } - }, - "allow-set-visible-on-all-organizations": { - "identifier": "allow-set-visible-on-all-organizations", - "description": "Enables the set_visible_on_all_organizations command without any pre-configured scope.", - "commands": { "allow": ["set_visible_on_all_organizations"], "deny": [] } - }, - "allow-show": { - "identifier": "allow-show", - "description": "Enables the show command without any pre-configured scope.", - "commands": { "allow": ["show"], "deny": [] } - }, - "allow-start-dragging": { - "identifier": "allow-start-dragging", - "description": "Enables the start_dragging command without any pre-configured scope.", - "commands": { "allow": ["start_dragging"], "deny": [] } - }, - 
"allow-start-resize-dragging": { - "identifier": "allow-start-resize-dragging", - "description": "Enables the start_resize_dragging command without any pre-configured scope.", - "commands": { "allow": ["start_resize_dragging"], "deny": [] } - }, - "allow-theme": { - "identifier": "allow-theme", - "description": "Enables the theme command without any pre-configured scope.", - "commands": { "allow": ["theme"], "deny": [] } - }, - "allow-title": { - "identifier": "allow-title", - "description": "Enables the title command without any pre-configured scope.", - "commands": { "allow": ["title"], "deny": [] } - }, - "allow-toggle-maximize": { - "identifier": "allow-toggle-maximize", - "description": "Enables the toggle_maximize command without any pre-configured scope.", - "commands": { "allow": ["toggle_maximize"], "deny": [] } - }, - "allow-unmaximize": { - "identifier": "allow-unmaximize", - "description": "Enables the unmaximize command without any pre-configured scope.", - "commands": { "allow": ["unmaximize"], "deny": [] } - }, - "allow-unminimize": { - "identifier": "allow-unminimize", - "description": "Enables the unminimize command without any pre-configured scope.", - "commands": { "allow": ["unminimize"], "deny": [] } - }, - "deny-available-monitors": { - "identifier": "deny-available-monitors", - "description": "Denies the available_monitors command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["available_monitors"] } - }, - "deny-center": { - "identifier": "deny-center", - "description": "Denies the center command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["center"] } - }, - "deny-close": { - "identifier": "deny-close", - "description": "Denies the close command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["close"] } - }, - "deny-create": { - "identifier": "deny-create", - "description": "Denies the create command without any pre-configured scope.", - "commands": { "allow": 
[], "deny": ["create"] } - }, - "deny-current-monitor": { - "identifier": "deny-current-monitor", - "description": "Denies the current_monitor command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["current_monitor"] } - }, - "deny-cursor-position": { - "identifier": "deny-cursor-position", - "description": "Denies the cursor_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["cursor_position"] } - }, - "deny-destroy": { - "identifier": "deny-destroy", - "description": "Denies the destroy command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["destroy"] } - }, - "deny-get-all-windows": { - "identifier": "deny-get-all-windows", - "description": "Denies the get_all_windows command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["get_all_windows"] } - }, - "deny-hide": { - "identifier": "deny-hide", - "description": "Denies the hide command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["hide"] } - }, - "deny-inner-position": { - "identifier": "deny-inner-position", - "description": "Denies the inner_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["inner_position"] } - }, - "deny-inner-size": { - "identifier": "deny-inner-size", - "description": "Denies the inner_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["inner_size"] } - }, - "deny-internal-toggle-maximize": { - "identifier": "deny-internal-toggle-maximize", - "description": "Denies the internal_toggle_maximize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["internal_toggle_maximize"] } - }, - "deny-is-always-on-top": { - "identifier": "deny-is-always-on-top", - "description": "Denies the is_always_on_top command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_always_on_top"] } - }, - "deny-is-closable": { - "identifier": 
"deny-is-closable", - "description": "Denies the is_closable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_closable"] } - }, - "deny-is-decorated": { - "identifier": "deny-is-decorated", - "description": "Denies the is_decorated command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_decorated"] } - }, - "deny-is-enabled": { - "identifier": "deny-is-enabled", - "description": "Denies the is_enabled command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_enabled"] } - }, - "deny-is-focused": { - "identifier": "deny-is-focused", - "description": "Denies the is_focused command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_focused"] } - }, - "deny-is-fullscreen": { - "identifier": "deny-is-fullscreen", - "description": "Denies the is_fullscreen command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_fullscreen"] } - }, - "deny-is-maximizable": { - "identifier": "deny-is-maximizable", - "description": "Denies the is_maximizable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_maximizable"] } - }, - "deny-is-maximized": { - "identifier": "deny-is-maximized", - "description": "Denies the is_maximized command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_maximized"] } - }, - "deny-is-minimizable": { - "identifier": "deny-is-minimizable", - "description": "Denies the is_minimizable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_minimizable"] } - }, - "deny-is-minimized": { - "identifier": "deny-is-minimized", - "description": "Denies the is_minimized command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_minimized"] } - }, - "deny-is-resizable": { - "identifier": "deny-is-resizable", - "description": "Denies the is_resizable command without any pre-configured scope.", - 
"commands": { "allow": [], "deny": ["is_resizable"] } - }, - "deny-is-visible": { - "identifier": "deny-is-visible", - "description": "Denies the is_visible command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["is_visible"] } - }, - "deny-maximize": { - "identifier": "deny-maximize", - "description": "Denies the maximize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["maximize"] } - }, - "deny-minimize": { - "identifier": "deny-minimize", - "description": "Denies the minimize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["minimize"] } - }, - "deny-monitor-from-point": { - "identifier": "deny-monitor-from-point", - "description": "Denies the monitor_from_point command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["monitor_from_point"] } - }, - "deny-outer-position": { - "identifier": "deny-outer-position", - "description": "Denies the outer_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["outer_position"] } - }, - "deny-outer-size": { - "identifier": "deny-outer-size", - "description": "Denies the outer_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["outer_size"] } - }, - "deny-primary-monitor": { - "identifier": "deny-primary-monitor", - "description": "Denies the primary_monitor command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["primary_monitor"] } - }, - "deny-request-user-attention": { - "identifier": "deny-request-user-attention", - "description": "Denies the request_user_attention command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["request_user_attention"] } - }, - "deny-scale-factor": { - "identifier": "deny-scale-factor", - "description": "Denies the scale_factor command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["scale_factor"] } - }, - 
"deny-set-always-on-bottom": { - "identifier": "deny-set-always-on-bottom", - "description": "Denies the set_always_on_bottom command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_always_on_bottom"] } - }, - "deny-set-always-on-top": { - "identifier": "deny-set-always-on-top", - "description": "Denies the set_always_on_top command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_always_on_top"] } - }, - "deny-set-background-color": { - "identifier": "deny-set-background-color", - "description": "Denies the set_background_color command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_background_color"] } - }, - "deny-set-badge-count": { - "identifier": "deny-set-badge-count", - "description": "Denies the set_badge_count command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_badge_count"] } - }, - "deny-set-badge-label": { - "identifier": "deny-set-badge-label", - "description": "Denies the set_badge_label command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_badge_label"] } - }, - "deny-set-closable": { - "identifier": "deny-set-closable", - "description": "Denies the set_closable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_closable"] } - }, - "deny-set-content-protected": { - "identifier": "deny-set-content-protected", - "description": "Denies the set_content_protected command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_content_protected"] } - }, - "deny-set-cursor-grab": { - "identifier": "deny-set-cursor-grab", - "description": "Denies the set_cursor_grab command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_cursor_grab"] } - }, - "deny-set-cursor-icon": { - "identifier": "deny-set-cursor-icon", - "description": "Denies the set_cursor_icon command without any pre-configured scope.", - "commands": { 
"allow": [], "deny": ["set_cursor_icon"] } - }, - "deny-set-cursor-position": { - "identifier": "deny-set-cursor-position", - "description": "Denies the set_cursor_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_cursor_position"] } - }, - "deny-set-cursor-visible": { - "identifier": "deny-set-cursor-visible", - "description": "Denies the set_cursor_visible command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_cursor_visible"] } - }, - "deny-set-decorations": { - "identifier": "deny-set-decorations", - "description": "Denies the set_decorations command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_decorations"] } - }, - "deny-set-effects": { - "identifier": "deny-set-effects", - "description": "Denies the set_effects command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_effects"] } - }, - "deny-set-enabled": { - "identifier": "deny-set-enabled", - "description": "Denies the set_enabled command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_enabled"] } - }, - "deny-set-focus": { - "identifier": "deny-set-focus", - "description": "Denies the set_focus command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_focus"] } - }, - "deny-set-focusable": { - "identifier": "deny-set-focusable", - "description": "Denies the set_focusable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_focusable"] } - }, - "deny-set-fullscreen": { - "identifier": "deny-set-fullscreen", - "description": "Denies the set_fullscreen command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_fullscreen"] } - }, - "deny-set-icon": { - "identifier": "deny-set-icon", - "description": "Denies the set_icon command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_icon"] } - }, - 
"deny-set-ignore-cursor-events": { - "identifier": "deny-set-ignore-cursor-events", - "description": "Denies the set_ignore_cursor_events command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_ignore_cursor_events"] } - }, - "deny-set-max-size": { - "identifier": "deny-set-max-size", - "description": "Denies the set_max_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_max_size"] } - }, - "deny-set-maximizable": { - "identifier": "deny-set-maximizable", - "description": "Denies the set_maximizable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_maximizable"] } - }, - "deny-set-min-size": { - "identifier": "deny-set-min-size", - "description": "Denies the set_min_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_min_size"] } - }, - "deny-set-minimizable": { - "identifier": "deny-set-minimizable", - "description": "Denies the set_minimizable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_minimizable"] } - }, - "deny-set-overlay-icon": { - "identifier": "deny-set-overlay-icon", - "description": "Denies the set_overlay_icon command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_overlay_icon"] } - }, - "deny-set-position": { - "identifier": "deny-set-position", - "description": "Denies the set_position command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_position"] } - }, - "deny-set-progress-bar": { - "identifier": "deny-set-progress-bar", - "description": "Denies the set_progress_bar command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_progress_bar"] } - }, - "deny-set-resizable": { - "identifier": "deny-set-resizable", - "description": "Denies the set_resizable command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_resizable"] } - }, - 
"deny-set-shadow": { - "identifier": "deny-set-shadow", - "description": "Denies the set_shadow command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_shadow"] } - }, - "deny-set-simple-fullscreen": { - "identifier": "deny-set-simple-fullscreen", - "description": "Denies the set_simple_fullscreen command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_simple_fullscreen"] } - }, - "deny-set-size": { - "identifier": "deny-set-size", - "description": "Denies the set_size command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_size"] } - }, - "deny-set-size-constraints": { - "identifier": "deny-set-size-constraints", - "description": "Denies the set_size_constraints command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_size_constraints"] } - }, - "deny-set-skip-taskbar": { - "identifier": "deny-set-skip-taskbar", - "description": "Denies the set_skip_taskbar command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_skip_taskbar"] } - }, - "deny-set-theme": { - "identifier": "deny-set-theme", - "description": "Denies the set_theme command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_theme"] } - }, - "deny-set-title": { - "identifier": "deny-set-title", - "description": "Denies the set_title command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_title"] } - }, - "deny-set-title-bar-style": { - "identifier": "deny-set-title-bar-style", - "description": "Denies the set_title_bar_style command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["set_title_bar_style"] } - }, - "deny-set-visible-on-all-organizations": { - "identifier": "deny-set-visible-on-all-organizations", - "description": "Denies the set_visible_on_all_organizations command without any pre-configured scope.", - "commands": { "allow": [], "deny": 
["set_visible_on_all_organizations"] } - }, - "deny-show": { - "identifier": "deny-show", - "description": "Denies the show command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["show"] } - }, - "deny-start-dragging": { - "identifier": "deny-start-dragging", - "description": "Denies the start_dragging command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["start_dragging"] } - }, - "deny-start-resize-dragging": { - "identifier": "deny-start-resize-dragging", - "description": "Denies the start_resize_dragging command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["start_resize_dragging"] } - }, - "deny-theme": { - "identifier": "deny-theme", - "description": "Denies the theme command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["theme"] } - }, - "deny-title": { - "identifier": "deny-title", - "description": "Denies the title command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["title"] } - }, - "deny-toggle-maximize": { - "identifier": "deny-toggle-maximize", - "description": "Denies the toggle_maximize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["toggle_maximize"] } - }, - "deny-unmaximize": { - "identifier": "deny-unmaximize", - "description": "Denies the unmaximize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["unmaximize"] } - }, - "deny-unminimize": { - "identifier": "deny-unminimize", - "description": "Denies the unminimize command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["unminimize"] } - } - }, - "permission_sets": {}, - "global_scope_schema": null - }, - "shell": { - "default_permission": { - "identifier": "default", - "description": "This permission set configures which\nshell functionality is exposed by default.\n\n#### Granted Permissions\n\nIt allows to use the `open` functionality with a reasonable\nscope 
pre-configured. It will allow opening `http(s)://`,\n`tel:` and `mailto:` links.\n", - "permissions": ["allow-open"] - }, - "permissions": { - "allow-execute": { - "identifier": "allow-execute", - "description": "Enables the execute command without any pre-configured scope.", - "commands": { "allow": ["execute"], "deny": [] } - }, - "allow-kill": { - "identifier": "allow-kill", - "description": "Enables the kill command without any pre-configured scope.", - "commands": { "allow": ["kill"], "deny": [] } - }, - "allow-open": { - "identifier": "allow-open", - "description": "Enables the open command without any pre-configured scope.", - "commands": { "allow": ["open"], "deny": [] } - }, - "allow-spawn": { - "identifier": "allow-spawn", - "description": "Enables the spawn command without any pre-configured scope.", - "commands": { "allow": ["spawn"], "deny": [] } - }, - "allow-stdin-write": { - "identifier": "allow-stdin-write", - "description": "Enables the stdin_write command without any pre-configured scope.", - "commands": { "allow": ["stdin_write"], "deny": [] } - }, - "deny-execute": { - "identifier": "deny-execute", - "description": "Denies the execute command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["execute"] } - }, - "deny-kill": { - "identifier": "deny-kill", - "description": "Denies the kill command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["kill"] } - }, - "deny-open": { - "identifier": "deny-open", - "description": "Denies the open command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["open"] } - }, - "deny-spawn": { - "identifier": "deny-spawn", - "description": "Denies the spawn command without any pre-configured scope.", - "commands": { "allow": [], "deny": ["spawn"] } - }, - "deny-stdin-write": { - "identifier": "deny-stdin-write", - "description": "Denies the stdin_write command without any pre-configured scope.", - "commands": { "allow": [], "deny": 
["stdin_write"] } - } - }, - "permission_sets": {}, - "global_scope_schema": { - "$schema": "http://json-schema.org/draft-07/schema#", - "anyOf": [ - { - "additionalProperties": false, - "properties": { - "args": { "allOf": [{ "$ref": "#/definitions/ShellScopeEntryAllowedArgs" }], "description": "The allowed arguments for the command execution." }, - "cmd": { - "description": "The command name. It can start with a variable that resolves to a system base directory. The variables are: `$AUDIO`, `$CACHE`, `$CONFIG`, `$DATA`, `$LOCALDATA`, `$DESKTOP`, `$DOCUMENT`, `$DOWNLOAD`, `$EXE`, `$FONT`, `$HOME`, `$PICTURE`, `$PUBLIC`, `$RUNTIME`, `$TEMPLATE`, `$VIDEO`, `$RESOURCE`, `$LOG`, `$TEMP`, `$APPCONFIG`, `$APPDATA`, `$APPLOCALDATA`, `$APPCACHE`, `$APPLOG`.", - "type": "string" - }, - "name": { - "description": "The name for this allowed shell command configuration.\n\nThis name will be used inside of the webview API to call this command along with any specified arguments.", - "type": "string" - } - }, - "required": ["cmd", "name"], - "type": "object" - }, - { - "additionalProperties": false, - "properties": { - "args": { "allOf": [{ "$ref": "#/definitions/ShellScopeEntryAllowedArgs" }], "description": "The allowed arguments for the command execution." 
}, - "name": { - "description": "The name for this allowed shell command configuration.\n\nThis name will be used inside of the webview API to call this command along with any specified arguments.", - "type": "string" - }, - "sidecar": { "description": "If this command is a sidecar command.", "type": "boolean" } - }, - "required": ["name", "sidecar"], - "type": "object" - } - ], - "definitions": { - "ShellScopeEntryAllowedArg": { - "anyOf": [ - { "description": "A non-configurable argument that is passed to the command in the order it was specified.", "type": "string" }, - { - "additionalProperties": false, - "description": "A variable that is set while calling the command from the webview API.", - "properties": { - "raw": { - "default": false, - "description": "Marks the validator as a raw regex, meaning the plugin should not make any modification at runtime.\n\nThis means the regex will not match on the entire string by default, which might be exploited if your regex allow unexpected input to be considered valid. When using this option, make sure your regex is correct.", - "type": "boolean" - }, - "validator": { - "description": "[regex] validator to require passed values to conform to an expected input.\n\nThis will require the argument value passed to this variable to match the `validator` regex before it will be executed.\n\nThe regex string is by default surrounded by `^...$` to match the full string. For example the `https?://\\w+` regex would be registered as `^https?://\\w+$`.\n\n[regex]: ", - "type": "string" - } - }, - "required": ["validator"], - "type": "object" - } - ], - "description": "A command argument allowed to be executed by the webview API." 
- }, - "ShellScopeEntryAllowedArgs": { - "anyOf": [ - { "description": "Use a simple boolean to allow all or disable all arguments to this command configuration.", "type": "boolean" }, - { - "description": "A specific set of [`ShellScopeEntryAllowedArg`] that are valid to call for the command configuration.", - "items": { "$ref": "#/definitions/ShellScopeEntryAllowedArg" }, - "type": "array" - } - ], - "description": "A set of command arguments allowed to be executed by the webview API.\n\nA value of `true` will allow any arguments to be passed to the command. `false` will disable all arguments. A list of [`ShellScopeEntryAllowedArg`] will set those arguments as the only valid arguments to be passed to the attached command configuration." - } - }, - "description": "Shell scope entry.", - "title": "ShellScopeEntry" - } - } -} +{"core":{"default_permission":{"identifier":"default","description":"Default core plugins set.","permissions":["core:path:default","core:event:default","core:window:default","core:webview:default","core:app:default","core:image:default","core:resources:default","core:menu:default","core:tray:default"]},"permissions":{},"permission_sets":{},"global_scope_schema":null},"core:app":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin.","permissions":["allow-version","allow-name","allow-tauri-version","allow-identifier","allow-bundle-type","allow-register-listener","allow-remove-listener"]},"permissions":{"allow-app-hide":{"identifier":"allow-app-hide","description":"Enables the app_hide command without any pre-configured scope.","commands":{"allow":["app_hide"],"deny":[]}},"allow-app-show":{"identifier":"allow-app-show","description":"Enables the app_show command without any pre-configured scope.","commands":{"allow":["app_show"],"deny":[]}},"allow-bundle-type":{"identifier":"allow-bundle-type","description":"Enables the bundle_type command without any pre-configured 
scope.","commands":{"allow":["bundle_type"],"deny":[]}},"allow-default-window-icon":{"identifier":"allow-default-window-icon","description":"Enables the default_window_icon command without any pre-configured scope.","commands":{"allow":["default_window_icon"],"deny":[]}},"allow-fetch-data-store-identifiers":{"identifier":"allow-fetch-data-store-identifiers","description":"Enables the fetch_data_store_identifiers command without any pre-configured scope.","commands":{"allow":["fetch_data_store_identifiers"],"deny":[]}},"allow-identifier":{"identifier":"allow-identifier","description":"Enables the identifier command without any pre-configured scope.","commands":{"allow":["identifier"],"deny":[]}},"allow-name":{"identifier":"allow-name","description":"Enables the name command without any pre-configured scope.","commands":{"allow":["name"],"deny":[]}},"allow-register-listener":{"identifier":"allow-register-listener","description":"Enables the register_listener command without any pre-configured scope.","commands":{"allow":["register_listener"],"deny":[]}},"allow-remove-data-store":{"identifier":"allow-remove-data-store","description":"Enables the remove_data_store command without any pre-configured scope.","commands":{"allow":["remove_data_store"],"deny":[]}},"allow-remove-listener":{"identifier":"allow-remove-listener","description":"Enables the remove_listener command without any pre-configured scope.","commands":{"allow":["remove_listener"],"deny":[]}},"allow-set-app-theme":{"identifier":"allow-set-app-theme","description":"Enables the set_app_theme command without any pre-configured scope.","commands":{"allow":["set_app_theme"],"deny":[]}},"allow-set-dock-visibility":{"identifier":"allow-set-dock-visibility","description":"Enables the set_dock_visibility command without any pre-configured scope.","commands":{"allow":["set_dock_visibility"],"deny":[]}},"allow-tauri-version":{"identifier":"allow-tauri-version","description":"Enables the tauri_version command without 
any pre-configured scope.","commands":{"allow":["tauri_version"],"deny":[]}},"allow-version":{"identifier":"allow-version","description":"Enables the version command without any pre-configured scope.","commands":{"allow":["version"],"deny":[]}},"deny-app-hide":{"identifier":"deny-app-hide","description":"Denies the app_hide command without any pre-configured scope.","commands":{"allow":[],"deny":["app_hide"]}},"deny-app-show":{"identifier":"deny-app-show","description":"Denies the app_show command without any pre-configured scope.","commands":{"allow":[],"deny":["app_show"]}},"deny-bundle-type":{"identifier":"deny-bundle-type","description":"Denies the bundle_type command without any pre-configured scope.","commands":{"allow":[],"deny":["bundle_type"]}},"deny-default-window-icon":{"identifier":"deny-default-window-icon","description":"Denies the default_window_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["default_window_icon"]}},"deny-fetch-data-store-identifiers":{"identifier":"deny-fetch-data-store-identifiers","description":"Denies the fetch_data_store_identifiers command without any pre-configured scope.","commands":{"allow":[],"deny":["fetch_data_store_identifiers"]}},"deny-identifier":{"identifier":"deny-identifier","description":"Denies the identifier command without any pre-configured scope.","commands":{"allow":[],"deny":["identifier"]}},"deny-name":{"identifier":"deny-name","description":"Denies the name command without any pre-configured scope.","commands":{"allow":[],"deny":["name"]}},"deny-register-listener":{"identifier":"deny-register-listener","description":"Denies the register_listener command without any pre-configured scope.","commands":{"allow":[],"deny":["register_listener"]}},"deny-remove-data-store":{"identifier":"deny-remove-data-store","description":"Denies the remove_data_store command without any pre-configured 
scope.","commands":{"allow":[],"deny":["remove_data_store"]}},"deny-remove-listener":{"identifier":"deny-remove-listener","description":"Denies the remove_listener command without any pre-configured scope.","commands":{"allow":[],"deny":["remove_listener"]}},"deny-set-app-theme":{"identifier":"deny-set-app-theme","description":"Denies the set_app_theme command without any pre-configured scope.","commands":{"allow":[],"deny":["set_app_theme"]}},"deny-set-dock-visibility":{"identifier":"deny-set-dock-visibility","description":"Denies the set_dock_visibility command without any pre-configured scope.","commands":{"allow":[],"deny":["set_dock_visibility"]}},"deny-tauri-version":{"identifier":"deny-tauri-version","description":"Denies the tauri_version command without any pre-configured scope.","commands":{"allow":[],"deny":["tauri_version"]}},"deny-version":{"identifier":"deny-version","description":"Denies the version command without any pre-configured scope.","commands":{"allow":[],"deny":["version"]}}},"permission_sets":{},"global_scope_schema":null},"core:event":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-listen","allow-unlisten","allow-emit","allow-emit-to"]},"permissions":{"allow-emit":{"identifier":"allow-emit","description":"Enables the emit command without any pre-configured scope.","commands":{"allow":["emit"],"deny":[]}},"allow-emit-to":{"identifier":"allow-emit-to","description":"Enables the emit_to command without any pre-configured scope.","commands":{"allow":["emit_to"],"deny":[]}},"allow-listen":{"identifier":"allow-listen","description":"Enables the listen command without any pre-configured scope.","commands":{"allow":["listen"],"deny":[]}},"allow-unlisten":{"identifier":"allow-unlisten","description":"Enables the unlisten command without any pre-configured 
scope.","commands":{"allow":["unlisten"],"deny":[]}},"deny-emit":{"identifier":"deny-emit","description":"Denies the emit command without any pre-configured scope.","commands":{"allow":[],"deny":["emit"]}},"deny-emit-to":{"identifier":"deny-emit-to","description":"Denies the emit_to command without any pre-configured scope.","commands":{"allow":[],"deny":["emit_to"]}},"deny-listen":{"identifier":"deny-listen","description":"Denies the listen command without any pre-configured scope.","commands":{"allow":[],"deny":["listen"]}},"deny-unlisten":{"identifier":"deny-unlisten","description":"Denies the unlisten command without any pre-configured scope.","commands":{"allow":[],"deny":["unlisten"]}}},"permission_sets":{},"global_scope_schema":null},"core:image":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-new","allow-from-bytes","allow-from-path","allow-rgba","allow-size"]},"permissions":{"allow-from-bytes":{"identifier":"allow-from-bytes","description":"Enables the from_bytes command without any pre-configured scope.","commands":{"allow":["from_bytes"],"deny":[]}},"allow-from-path":{"identifier":"allow-from-path","description":"Enables the from_path command without any pre-configured scope.","commands":{"allow":["from_path"],"deny":[]}},"allow-new":{"identifier":"allow-new","description":"Enables the new command without any pre-configured scope.","commands":{"allow":["new"],"deny":[]}},"allow-rgba":{"identifier":"allow-rgba","description":"Enables the rgba command without any pre-configured scope.","commands":{"allow":["rgba"],"deny":[]}},"allow-size":{"identifier":"allow-size","description":"Enables the size command without any pre-configured scope.","commands":{"allow":["size"],"deny":[]}},"deny-from-bytes":{"identifier":"deny-from-bytes","description":"Denies the from_bytes command without any pre-configured 
scope.","commands":{"allow":[],"deny":["from_bytes"]}},"deny-from-path":{"identifier":"deny-from-path","description":"Denies the from_path command without any pre-configured scope.","commands":{"allow":[],"deny":["from_path"]}},"deny-new":{"identifier":"deny-new","description":"Denies the new command without any pre-configured scope.","commands":{"allow":[],"deny":["new"]}},"deny-rgba":{"identifier":"deny-rgba","description":"Denies the rgba command without any pre-configured scope.","commands":{"allow":[],"deny":["rgba"]}},"deny-size":{"identifier":"deny-size","description":"Denies the size command without any pre-configured scope.","commands":{"allow":[],"deny":["size"]}}},"permission_sets":{},"global_scope_schema":null},"core:menu":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-new","allow-append","allow-prepend","allow-insert","allow-remove","allow-remove-at","allow-items","allow-get","allow-popup","allow-create-default","allow-set-as-app-menu","allow-set-as-window-menu","allow-text","allow-set-text","allow-is-enabled","allow-set-enabled","allow-set-accelerator","allow-set-as-windows-menu-for-nsapp","allow-set-as-help-menu-for-nsapp","allow-is-checked","allow-set-checked","allow-set-icon"]},"permissions":{"allow-append":{"identifier":"allow-append","description":"Enables the append command without any pre-configured scope.","commands":{"allow":["append"],"deny":[]}},"allow-create-default":{"identifier":"allow-create-default","description":"Enables the create_default command without any pre-configured scope.","commands":{"allow":["create_default"],"deny":[]}},"allow-get":{"identifier":"allow-get","description":"Enables the get command without any pre-configured scope.","commands":{"allow":["get"],"deny":[]}},"allow-insert":{"identifier":"allow-insert","description":"Enables the insert command without any pre-configured 
scope.","commands":{"allow":["insert"],"deny":[]}},"allow-is-checked":{"identifier":"allow-is-checked","description":"Enables the is_checked command without any pre-configured scope.","commands":{"allow":["is_checked"],"deny":[]}},"allow-is-enabled":{"identifier":"allow-is-enabled","description":"Enables the is_enabled command without any pre-configured scope.","commands":{"allow":["is_enabled"],"deny":[]}},"allow-items":{"identifier":"allow-items","description":"Enables the items command without any pre-configured scope.","commands":{"allow":["items"],"deny":[]}},"allow-new":{"identifier":"allow-new","description":"Enables the new command without any pre-configured scope.","commands":{"allow":["new"],"deny":[]}},"allow-popup":{"identifier":"allow-popup","description":"Enables the popup command without any pre-configured scope.","commands":{"allow":["popup"],"deny":[]}},"allow-prepend":{"identifier":"allow-prepend","description":"Enables the prepend command without any pre-configured scope.","commands":{"allow":["prepend"],"deny":[]}},"allow-remove":{"identifier":"allow-remove","description":"Enables the remove command without any pre-configured scope.","commands":{"allow":["remove"],"deny":[]}},"allow-remove-at":{"identifier":"allow-remove-at","description":"Enables the remove_at command without any pre-configured scope.","commands":{"allow":["remove_at"],"deny":[]}},"allow-set-accelerator":{"identifier":"allow-set-accelerator","description":"Enables the set_accelerator command without any pre-configured scope.","commands":{"allow":["set_accelerator"],"deny":[]}},"allow-set-as-app-menu":{"identifier":"allow-set-as-app-menu","description":"Enables the set_as_app_menu command without any pre-configured scope.","commands":{"allow":["set_as_app_menu"],"deny":[]}},"allow-set-as-help-menu-for-nsapp":{"identifier":"allow-set-as-help-menu-for-nsapp","description":"Enables the set_as_help_menu_for_nsapp command without any pre-configured 
scope.","commands":{"allow":["set_as_help_menu_for_nsapp"],"deny":[]}},"allow-set-as-window-menu":{"identifier":"allow-set-as-window-menu","description":"Enables the set_as_window_menu command without any pre-configured scope.","commands":{"allow":["set_as_window_menu"],"deny":[]}},"allow-set-as-windows-menu-for-nsapp":{"identifier":"allow-set-as-windows-menu-for-nsapp","description":"Enables the set_as_windows_menu_for_nsapp command without any pre-configured scope.","commands":{"allow":["set_as_windows_menu_for_nsapp"],"deny":[]}},"allow-set-checked":{"identifier":"allow-set-checked","description":"Enables the set_checked command without any pre-configured scope.","commands":{"allow":["set_checked"],"deny":[]}},"allow-set-enabled":{"identifier":"allow-set-enabled","description":"Enables the set_enabled command without any pre-configured scope.","commands":{"allow":["set_enabled"],"deny":[]}},"allow-set-icon":{"identifier":"allow-set-icon","description":"Enables the set_icon command without any pre-configured scope.","commands":{"allow":["set_icon"],"deny":[]}},"allow-set-text":{"identifier":"allow-set-text","description":"Enables the set_text command without any pre-configured scope.","commands":{"allow":["set_text"],"deny":[]}},"allow-text":{"identifier":"allow-text","description":"Enables the text command without any pre-configured scope.","commands":{"allow":["text"],"deny":[]}},"deny-append":{"identifier":"deny-append","description":"Denies the append command without any pre-configured scope.","commands":{"allow":[],"deny":["append"]}},"deny-create-default":{"identifier":"deny-create-default","description":"Denies the create_default command without any pre-configured scope.","commands":{"allow":[],"deny":["create_default"]}},"deny-get":{"identifier":"deny-get","description":"Denies the get command without any pre-configured scope.","commands":{"allow":[],"deny":["get"]}},"deny-insert":{"identifier":"deny-insert","description":"Denies the insert command 
without any pre-configured scope.","commands":{"allow":[],"deny":["insert"]}},"deny-is-checked":{"identifier":"deny-is-checked","description":"Denies the is_checked command without any pre-configured scope.","commands":{"allow":[],"deny":["is_checked"]}},"deny-is-enabled":{"identifier":"deny-is-enabled","description":"Denies the is_enabled command without any pre-configured scope.","commands":{"allow":[],"deny":["is_enabled"]}},"deny-items":{"identifier":"deny-items","description":"Denies the items command without any pre-configured scope.","commands":{"allow":[],"deny":["items"]}},"deny-new":{"identifier":"deny-new","description":"Denies the new command without any pre-configured scope.","commands":{"allow":[],"deny":["new"]}},"deny-popup":{"identifier":"deny-popup","description":"Denies the popup command without any pre-configured scope.","commands":{"allow":[],"deny":["popup"]}},"deny-prepend":{"identifier":"deny-prepend","description":"Denies the prepend command without any pre-configured scope.","commands":{"allow":[],"deny":["prepend"]}},"deny-remove":{"identifier":"deny-remove","description":"Denies the remove command without any pre-configured scope.","commands":{"allow":[],"deny":["remove"]}},"deny-remove-at":{"identifier":"deny-remove-at","description":"Denies the remove_at command without any pre-configured scope.","commands":{"allow":[],"deny":["remove_at"]}},"deny-set-accelerator":{"identifier":"deny-set-accelerator","description":"Denies the set_accelerator command without any pre-configured scope.","commands":{"allow":[],"deny":["set_accelerator"]}},"deny-set-as-app-menu":{"identifier":"deny-set-as-app-menu","description":"Denies the set_as_app_menu command without any pre-configured scope.","commands":{"allow":[],"deny":["set_as_app_menu"]}},"deny-set-as-help-menu-for-nsapp":{"identifier":"deny-set-as-help-menu-for-nsapp","description":"Denies the set_as_help_menu_for_nsapp command without any pre-configured 
scope.","commands":{"allow":[],"deny":["set_as_help_menu_for_nsapp"]}},"deny-set-as-window-menu":{"identifier":"deny-set-as-window-menu","description":"Denies the set_as_window_menu command without any pre-configured scope.","commands":{"allow":[],"deny":["set_as_window_menu"]}},"deny-set-as-windows-menu-for-nsapp":{"identifier":"deny-set-as-windows-menu-for-nsapp","description":"Denies the set_as_windows_menu_for_nsapp command without any pre-configured scope.","commands":{"allow":[],"deny":["set_as_windows_menu_for_nsapp"]}},"deny-set-checked":{"identifier":"deny-set-checked","description":"Denies the set_checked command without any pre-configured scope.","commands":{"allow":[],"deny":["set_checked"]}},"deny-set-enabled":{"identifier":"deny-set-enabled","description":"Denies the set_enabled command without any pre-configured scope.","commands":{"allow":[],"deny":["set_enabled"]}},"deny-set-icon":{"identifier":"deny-set-icon","description":"Denies the set_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["set_icon"]}},"deny-set-text":{"identifier":"deny-set-text","description":"Denies the set_text command without any pre-configured scope.","commands":{"allow":[],"deny":["set_text"]}},"deny-text":{"identifier":"deny-text","description":"Denies the text command without any pre-configured scope.","commands":{"allow":[],"deny":["text"]}}},"permission_sets":{},"global_scope_schema":null},"core:path":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-resolve-directory","allow-resolve","allow-normalize","allow-join","allow-dirname","allow-extname","allow-basename","allow-is-absolute"]},"permissions":{"allow-basename":{"identifier":"allow-basename","description":"Enables the basename command without any pre-configured scope.","commands":{"allow":["basename"],"deny":[]}},"allow-dirname":{"identifier":"allow-dirname","description":"Enables the 
dirname command without any pre-configured scope.","commands":{"allow":["dirname"],"deny":[]}},"allow-extname":{"identifier":"allow-extname","description":"Enables the extname command without any pre-configured scope.","commands":{"allow":["extname"],"deny":[]}},"allow-is-absolute":{"identifier":"allow-is-absolute","description":"Enables the is_absolute command without any pre-configured scope.","commands":{"allow":["is_absolute"],"deny":[]}},"allow-join":{"identifier":"allow-join","description":"Enables the join command without any pre-configured scope.","commands":{"allow":["join"],"deny":[]}},"allow-normalize":{"identifier":"allow-normalize","description":"Enables the normalize command without any pre-configured scope.","commands":{"allow":["normalize"],"deny":[]}},"allow-resolve":{"identifier":"allow-resolve","description":"Enables the resolve command without any pre-configured scope.","commands":{"allow":["resolve"],"deny":[]}},"allow-resolve-directory":{"identifier":"allow-resolve-directory","description":"Enables the resolve_directory command without any pre-configured scope.","commands":{"allow":["resolve_directory"],"deny":[]}},"deny-basename":{"identifier":"deny-basename","description":"Denies the basename command without any pre-configured scope.","commands":{"allow":[],"deny":["basename"]}},"deny-dirname":{"identifier":"deny-dirname","description":"Denies the dirname command without any pre-configured scope.","commands":{"allow":[],"deny":["dirname"]}},"deny-extname":{"identifier":"deny-extname","description":"Denies the extname command without any pre-configured scope.","commands":{"allow":[],"deny":["extname"]}},"deny-is-absolute":{"identifier":"deny-is-absolute","description":"Denies the is_absolute command without any pre-configured scope.","commands":{"allow":[],"deny":["is_absolute"]}},"deny-join":{"identifier":"deny-join","description":"Denies the join command without any pre-configured 
scope.","commands":{"allow":[],"deny":["join"]}},"deny-normalize":{"identifier":"deny-normalize","description":"Denies the normalize command without any pre-configured scope.","commands":{"allow":[],"deny":["normalize"]}},"deny-resolve":{"identifier":"deny-resolve","description":"Denies the resolve command without any pre-configured scope.","commands":{"allow":[],"deny":["resolve"]}},"deny-resolve-directory":{"identifier":"deny-resolve-directory","description":"Denies the resolve_directory command without any pre-configured scope.","commands":{"allow":[],"deny":["resolve_directory"]}}},"permission_sets":{},"global_scope_schema":null},"core:resources":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-close"]},"permissions":{"allow-close":{"identifier":"allow-close","description":"Enables the close command without any pre-configured scope.","commands":{"allow":["close"],"deny":[]}},"deny-close":{"identifier":"deny-close","description":"Denies the close command without any pre-configured scope.","commands":{"allow":[],"deny":["close"]}}},"permission_sets":{},"global_scope_schema":null},"core:tray":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin, which enables all commands.","permissions":["allow-new","allow-get-by-id","allow-remove-by-id","allow-set-icon","allow-set-menu","allow-set-tooltip","allow-set-title","allow-set-visible","allow-set-temp-dir-path","allow-set-icon-as-template","allow-set-show-menu-on-left-click"]},"permissions":{"allow-get-by-id":{"identifier":"allow-get-by-id","description":"Enables the get_by_id command without any pre-configured scope.","commands":{"allow":["get_by_id"],"deny":[]}},"allow-new":{"identifier":"allow-new","description":"Enables the new command without any pre-configured 
scope.","commands":{"allow":["new"],"deny":[]}},"allow-remove-by-id":{"identifier":"allow-remove-by-id","description":"Enables the remove_by_id command without any pre-configured scope.","commands":{"allow":["remove_by_id"],"deny":[]}},"allow-set-icon":{"identifier":"allow-set-icon","description":"Enables the set_icon command without any pre-configured scope.","commands":{"allow":["set_icon"],"deny":[]}},"allow-set-icon-as-template":{"identifier":"allow-set-icon-as-template","description":"Enables the set_icon_as_template command without any pre-configured scope.","commands":{"allow":["set_icon_as_template"],"deny":[]}},"allow-set-menu":{"identifier":"allow-set-menu","description":"Enables the set_menu command without any pre-configured scope.","commands":{"allow":["set_menu"],"deny":[]}},"allow-set-show-menu-on-left-click":{"identifier":"allow-set-show-menu-on-left-click","description":"Enables the set_show_menu_on_left_click command without any pre-configured scope.","commands":{"allow":["set_show_menu_on_left_click"],"deny":[]}},"allow-set-temp-dir-path":{"identifier":"allow-set-temp-dir-path","description":"Enables the set_temp_dir_path command without any pre-configured scope.","commands":{"allow":["set_temp_dir_path"],"deny":[]}},"allow-set-title":{"identifier":"allow-set-title","description":"Enables the set_title command without any pre-configured scope.","commands":{"allow":["set_title"],"deny":[]}},"allow-set-tooltip":{"identifier":"allow-set-tooltip","description":"Enables the set_tooltip command without any pre-configured scope.","commands":{"allow":["set_tooltip"],"deny":[]}},"allow-set-visible":{"identifier":"allow-set-visible","description":"Enables the set_visible command without any pre-configured scope.","commands":{"allow":["set_visible"],"deny":[]}},"deny-get-by-id":{"identifier":"deny-get-by-id","description":"Denies the get_by_id command without any pre-configured 
scope.","commands":{"allow":[],"deny":["get_by_id"]}},"deny-new":{"identifier":"deny-new","description":"Denies the new command without any pre-configured scope.","commands":{"allow":[],"deny":["new"]}},"deny-remove-by-id":{"identifier":"deny-remove-by-id","description":"Denies the remove_by_id command without any pre-configured scope.","commands":{"allow":[],"deny":["remove_by_id"]}},"deny-set-icon":{"identifier":"deny-set-icon","description":"Denies the set_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["set_icon"]}},"deny-set-icon-as-template":{"identifier":"deny-set-icon-as-template","description":"Denies the set_icon_as_template command without any pre-configured scope.","commands":{"allow":[],"deny":["set_icon_as_template"]}},"deny-set-menu":{"identifier":"deny-set-menu","description":"Denies the set_menu command without any pre-configured scope.","commands":{"allow":[],"deny":["set_menu"]}},"deny-set-show-menu-on-left-click":{"identifier":"deny-set-show-menu-on-left-click","description":"Denies the set_show_menu_on_left_click command without any pre-configured scope.","commands":{"allow":[],"deny":["set_show_menu_on_left_click"]}},"deny-set-temp-dir-path":{"identifier":"deny-set-temp-dir-path","description":"Denies the set_temp_dir_path command without any pre-configured scope.","commands":{"allow":[],"deny":["set_temp_dir_path"]}},"deny-set-title":{"identifier":"deny-set-title","description":"Denies the set_title command without any pre-configured scope.","commands":{"allow":[],"deny":["set_title"]}},"deny-set-tooltip":{"identifier":"deny-set-tooltip","description":"Denies the set_tooltip command without any pre-configured scope.","commands":{"allow":[],"deny":["set_tooltip"]}},"deny-set-visible":{"identifier":"deny-set-visible","description":"Denies the set_visible command without any pre-configured 
scope.","commands":{"allow":[],"deny":["set_visible"]}}},"permission_sets":{},"global_scope_schema":null},"core:webview":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin.","permissions":["allow-get-all-webviews","allow-webview-position","allow-webview-size","allow-internal-toggle-devtools"]},"permissions":{"allow-clear-all-browsing-data":{"identifier":"allow-clear-all-browsing-data","description":"Enables the clear_all_browsing_data command without any pre-configured scope.","commands":{"allow":["clear_all_browsing_data"],"deny":[]}},"allow-create-webview":{"identifier":"allow-create-webview","description":"Enables the create_webview command without any pre-configured scope.","commands":{"allow":["create_webview"],"deny":[]}},"allow-create-webview-window":{"identifier":"allow-create-webview-window","description":"Enables the create_webview_window command without any pre-configured scope.","commands":{"allow":["create_webview_window"],"deny":[]}},"allow-get-all-webviews":{"identifier":"allow-get-all-webviews","description":"Enables the get_all_webviews command without any pre-configured scope.","commands":{"allow":["get_all_webviews"],"deny":[]}},"allow-internal-toggle-devtools":{"identifier":"allow-internal-toggle-devtools","description":"Enables the internal_toggle_devtools command without any pre-configured scope.","commands":{"allow":["internal_toggle_devtools"],"deny":[]}},"allow-print":{"identifier":"allow-print","description":"Enables the print command without any pre-configured scope.","commands":{"allow":["print"],"deny":[]}},"allow-reparent":{"identifier":"allow-reparent","description":"Enables the reparent command without any pre-configured scope.","commands":{"allow":["reparent"],"deny":[]}},"allow-set-webview-auto-resize":{"identifier":"allow-set-webview-auto-resize","description":"Enables the set_webview_auto_resize command without any pre-configured 
scope.","commands":{"allow":["set_webview_auto_resize"],"deny":[]}},"allow-set-webview-background-color":{"identifier":"allow-set-webview-background-color","description":"Enables the set_webview_background_color command without any pre-configured scope.","commands":{"allow":["set_webview_background_color"],"deny":[]}},"allow-set-webview-focus":{"identifier":"allow-set-webview-focus","description":"Enables the set_webview_focus command without any pre-configured scope.","commands":{"allow":["set_webview_focus"],"deny":[]}},"allow-set-webview-position":{"identifier":"allow-set-webview-position","description":"Enables the set_webview_position command without any pre-configured scope.","commands":{"allow":["set_webview_position"],"deny":[]}},"allow-set-webview-size":{"identifier":"allow-set-webview-size","description":"Enables the set_webview_size command without any pre-configured scope.","commands":{"allow":["set_webview_size"],"deny":[]}},"allow-set-webview-zoom":{"identifier":"allow-set-webview-zoom","description":"Enables the set_webview_zoom command without any pre-configured scope.","commands":{"allow":["set_webview_zoom"],"deny":[]}},"allow-webview-close":{"identifier":"allow-webview-close","description":"Enables the webview_close command without any pre-configured scope.","commands":{"allow":["webview_close"],"deny":[]}},"allow-webview-hide":{"identifier":"allow-webview-hide","description":"Enables the webview_hide command without any pre-configured scope.","commands":{"allow":["webview_hide"],"deny":[]}},"allow-webview-position":{"identifier":"allow-webview-position","description":"Enables the webview_position command without any pre-configured scope.","commands":{"allow":["webview_position"],"deny":[]}},"allow-webview-show":{"identifier":"allow-webview-show","description":"Enables the webview_show command without any pre-configured 
scope.","commands":{"allow":["webview_show"],"deny":[]}},"allow-webview-size":{"identifier":"allow-webview-size","description":"Enables the webview_size command without any pre-configured scope.","commands":{"allow":["webview_size"],"deny":[]}},"deny-clear-all-browsing-data":{"identifier":"deny-clear-all-browsing-data","description":"Denies the clear_all_browsing_data command without any pre-configured scope.","commands":{"allow":[],"deny":["clear_all_browsing_data"]}},"deny-create-webview":{"identifier":"deny-create-webview","description":"Denies the create_webview command without any pre-configured scope.","commands":{"allow":[],"deny":["create_webview"]}},"deny-create-webview-window":{"identifier":"deny-create-webview-window","description":"Denies the create_webview_window command without any pre-configured scope.","commands":{"allow":[],"deny":["create_webview_window"]}},"deny-get-all-webviews":{"identifier":"deny-get-all-webviews","description":"Denies the get_all_webviews command without any pre-configured scope.","commands":{"allow":[],"deny":["get_all_webviews"]}},"deny-internal-toggle-devtools":{"identifier":"deny-internal-toggle-devtools","description":"Denies the internal_toggle_devtools command without any pre-configured scope.","commands":{"allow":[],"deny":["internal_toggle_devtools"]}},"deny-print":{"identifier":"deny-print","description":"Denies the print command without any pre-configured scope.","commands":{"allow":[],"deny":["print"]}},"deny-reparent":{"identifier":"deny-reparent","description":"Denies the reparent command without any pre-configured scope.","commands":{"allow":[],"deny":["reparent"]}},"deny-set-webview-auto-resize":{"identifier":"deny-set-webview-auto-resize","description":"Denies the set_webview_auto_resize command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_auto_resize"]}},"deny-set-webview-background-color":{"identifier":"deny-set-webview-background-color","description":"Denies the 
set_webview_background_color command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_background_color"]}},"deny-set-webview-focus":{"identifier":"deny-set-webview-focus","description":"Denies the set_webview_focus command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_focus"]}},"deny-set-webview-position":{"identifier":"deny-set-webview-position","description":"Denies the set_webview_position command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_position"]}},"deny-set-webview-size":{"identifier":"deny-set-webview-size","description":"Denies the set_webview_size command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_size"]}},"deny-set-webview-zoom":{"identifier":"deny-set-webview-zoom","description":"Denies the set_webview_zoom command without any pre-configured scope.","commands":{"allow":[],"deny":["set_webview_zoom"]}},"deny-webview-close":{"identifier":"deny-webview-close","description":"Denies the webview_close command without any pre-configured scope.","commands":{"allow":[],"deny":["webview_close"]}},"deny-webview-hide":{"identifier":"deny-webview-hide","description":"Denies the webview_hide command without any pre-configured scope.","commands":{"allow":[],"deny":["webview_hide"]}},"deny-webview-position":{"identifier":"deny-webview-position","description":"Denies the webview_position command without any pre-configured scope.","commands":{"allow":[],"deny":["webview_position"]}},"deny-webview-show":{"identifier":"deny-webview-show","description":"Denies the webview_show command without any pre-configured scope.","commands":{"allow":[],"deny":["webview_show"]}},"deny-webview-size":{"identifier":"deny-webview-size","description":"Denies the webview_size command without any pre-configured 
scope.","commands":{"allow":[],"deny":["webview_size"]}}},"permission_sets":{},"global_scope_schema":null},"core:window":{"default_permission":{"identifier":"default","description":"Default permissions for the plugin.","permissions":["allow-get-all-windows","allow-scale-factor","allow-inner-position","allow-outer-position","allow-inner-size","allow-outer-size","allow-is-fullscreen","allow-is-minimized","allow-is-maximized","allow-is-focused","allow-is-decorated","allow-is-resizable","allow-is-maximizable","allow-is-minimizable","allow-is-closable","allow-is-visible","allow-is-enabled","allow-title","allow-current-monitor","allow-primary-monitor","allow-monitor-from-point","allow-available-monitors","allow-cursor-position","allow-theme","allow-is-always-on-top","allow-internal-toggle-maximize"]},"permissions":{"allow-available-monitors":{"identifier":"allow-available-monitors","description":"Enables the available_monitors command without any pre-configured scope.","commands":{"allow":["available_monitors"],"deny":[]}},"allow-center":{"identifier":"allow-center","description":"Enables the center command without any pre-configured scope.","commands":{"allow":["center"],"deny":[]}},"allow-close":{"identifier":"allow-close","description":"Enables the close command without any pre-configured scope.","commands":{"allow":["close"],"deny":[]}},"allow-create":{"identifier":"allow-create","description":"Enables the create command without any pre-configured scope.","commands":{"allow":["create"],"deny":[]}},"allow-current-monitor":{"identifier":"allow-current-monitor","description":"Enables the current_monitor command without any pre-configured scope.","commands":{"allow":["current_monitor"],"deny":[]}},"allow-cursor-position":{"identifier":"allow-cursor-position","description":"Enables the cursor_position command without any pre-configured scope.","commands":{"allow":["cursor_position"],"deny":[]}},"allow-destroy":{"identifier":"allow-destroy","description":"Enables the 
destroy command without any pre-configured scope.","commands":{"allow":["destroy"],"deny":[]}},"allow-get-all-windows":{"identifier":"allow-get-all-windows","description":"Enables the get_all_windows command without any pre-configured scope.","commands":{"allow":["get_all_windows"],"deny":[]}},"allow-hide":{"identifier":"allow-hide","description":"Enables the hide command without any pre-configured scope.","commands":{"allow":["hide"],"deny":[]}},"allow-inner-position":{"identifier":"allow-inner-position","description":"Enables the inner_position command without any pre-configured scope.","commands":{"allow":["inner_position"],"deny":[]}},"allow-inner-size":{"identifier":"allow-inner-size","description":"Enables the inner_size command without any pre-configured scope.","commands":{"allow":["inner_size"],"deny":[]}},"allow-internal-toggle-maximize":{"identifier":"allow-internal-toggle-maximize","description":"Enables the internal_toggle_maximize command without any pre-configured scope.","commands":{"allow":["internal_toggle_maximize"],"deny":[]}},"allow-is-always-on-top":{"identifier":"allow-is-always-on-top","description":"Enables the is_always_on_top command without any pre-configured scope.","commands":{"allow":["is_always_on_top"],"deny":[]}},"allow-is-closable":{"identifier":"allow-is-closable","description":"Enables the is_closable command without any pre-configured scope.","commands":{"allow":["is_closable"],"deny":[]}},"allow-is-decorated":{"identifier":"allow-is-decorated","description":"Enables the is_decorated command without any pre-configured scope.","commands":{"allow":["is_decorated"],"deny":[]}},"allow-is-enabled":{"identifier":"allow-is-enabled","description":"Enables the is_enabled command without any pre-configured scope.","commands":{"allow":["is_enabled"],"deny":[]}},"allow-is-focused":{"identifier":"allow-is-focused","description":"Enables the is_focused command without any pre-configured 
scope.","commands":{"allow":["is_focused"],"deny":[]}},"allow-is-fullscreen":{"identifier":"allow-is-fullscreen","description":"Enables the is_fullscreen command without any pre-configured scope.","commands":{"allow":["is_fullscreen"],"deny":[]}},"allow-is-maximizable":{"identifier":"allow-is-maximizable","description":"Enables the is_maximizable command without any pre-configured scope.","commands":{"allow":["is_maximizable"],"deny":[]}},"allow-is-maximized":{"identifier":"allow-is-maximized","description":"Enables the is_maximized command without any pre-configured scope.","commands":{"allow":["is_maximized"],"deny":[]}},"allow-is-minimizable":{"identifier":"allow-is-minimizable","description":"Enables the is_minimizable command without any pre-configured scope.","commands":{"allow":["is_minimizable"],"deny":[]}},"allow-is-minimized":{"identifier":"allow-is-minimized","description":"Enables the is_minimized command without any pre-configured scope.","commands":{"allow":["is_minimized"],"deny":[]}},"allow-is-resizable":{"identifier":"allow-is-resizable","description":"Enables the is_resizable command without any pre-configured scope.","commands":{"allow":["is_resizable"],"deny":[]}},"allow-is-visible":{"identifier":"allow-is-visible","description":"Enables the is_visible command without any pre-configured scope.","commands":{"allow":["is_visible"],"deny":[]}},"allow-maximize":{"identifier":"allow-maximize","description":"Enables the maximize command without any pre-configured scope.","commands":{"allow":["maximize"],"deny":[]}},"allow-minimize":{"identifier":"allow-minimize","description":"Enables the minimize command without any pre-configured scope.","commands":{"allow":["minimize"],"deny":[]}},"allow-monitor-from-point":{"identifier":"allow-monitor-from-point","description":"Enables the monitor_from_point command without any pre-configured 
scope.","commands":{"allow":["monitor_from_point"],"deny":[]}},"allow-outer-position":{"identifier":"allow-outer-position","description":"Enables the outer_position command without any pre-configured scope.","commands":{"allow":["outer_position"],"deny":[]}},"allow-outer-size":{"identifier":"allow-outer-size","description":"Enables the outer_size command without any pre-configured scope.","commands":{"allow":["outer_size"],"deny":[]}},"allow-primary-monitor":{"identifier":"allow-primary-monitor","description":"Enables the primary_monitor command without any pre-configured scope.","commands":{"allow":["primary_monitor"],"deny":[]}},"allow-request-user-attention":{"identifier":"allow-request-user-attention","description":"Enables the request_user_attention command without any pre-configured scope.","commands":{"allow":["request_user_attention"],"deny":[]}},"allow-scale-factor":{"identifier":"allow-scale-factor","description":"Enables the scale_factor command without any pre-configured scope.","commands":{"allow":["scale_factor"],"deny":[]}},"allow-set-always-on-bottom":{"identifier":"allow-set-always-on-bottom","description":"Enables the set_always_on_bottom command without any pre-configured scope.","commands":{"allow":["set_always_on_bottom"],"deny":[]}},"allow-set-always-on-top":{"identifier":"allow-set-always-on-top","description":"Enables the set_always_on_top command without any pre-configured scope.","commands":{"allow":["set_always_on_top"],"deny":[]}},"allow-set-background-color":{"identifier":"allow-set-background-color","description":"Enables the set_background_color command without any pre-configured scope.","commands":{"allow":["set_background_color"],"deny":[]}},"allow-set-badge-count":{"identifier":"allow-set-badge-count","description":"Enables the set_badge_count command without any pre-configured scope.","commands":{"allow":["set_badge_count"],"deny":[]}},"allow-set-badge-label":{"identifier":"allow-set-badge-label","description":"Enables the 
set_badge_label command without any pre-configured scope.","commands":{"allow":["set_badge_label"],"deny":[]}},"allow-set-closable":{"identifier":"allow-set-closable","description":"Enables the set_closable command without any pre-configured scope.","commands":{"allow":["set_closable"],"deny":[]}},"allow-set-content-protected":{"identifier":"allow-set-content-protected","description":"Enables the set_content_protected command without any pre-configured scope.","commands":{"allow":["set_content_protected"],"deny":[]}},"allow-set-cursor-grab":{"identifier":"allow-set-cursor-grab","description":"Enables the set_cursor_grab command without any pre-configured scope.","commands":{"allow":["set_cursor_grab"],"deny":[]}},"allow-set-cursor-icon":{"identifier":"allow-set-cursor-icon","description":"Enables the set_cursor_icon command without any pre-configured scope.","commands":{"allow":["set_cursor_icon"],"deny":[]}},"allow-set-cursor-position":{"identifier":"allow-set-cursor-position","description":"Enables the set_cursor_position command without any pre-configured scope.","commands":{"allow":["set_cursor_position"],"deny":[]}},"allow-set-cursor-visible":{"identifier":"allow-set-cursor-visible","description":"Enables the set_cursor_visible command without any pre-configured scope.","commands":{"allow":["set_cursor_visible"],"deny":[]}},"allow-set-decorations":{"identifier":"allow-set-decorations","description":"Enables the set_decorations command without any pre-configured scope.","commands":{"allow":["set_decorations"],"deny":[]}},"allow-set-effects":{"identifier":"allow-set-effects","description":"Enables the set_effects command without any pre-configured scope.","commands":{"allow":["set_effects"],"deny":[]}},"allow-set-enabled":{"identifier":"allow-set-enabled","description":"Enables the set_enabled command without any pre-configured scope.","commands":{"allow":["set_enabled"],"deny":[]}},"allow-set-focus":{"identifier":"allow-set-focus","description":"Enables the 
set_focus command without any pre-configured scope.","commands":{"allow":["set_focus"],"deny":[]}},"allow-set-focusable":{"identifier":"allow-set-focusable","description":"Enables the set_focusable command without any pre-configured scope.","commands":{"allow":["set_focusable"],"deny":[]}},"allow-set-fullscreen":{"identifier":"allow-set-fullscreen","description":"Enables the set_fullscreen command without any pre-configured scope.","commands":{"allow":["set_fullscreen"],"deny":[]}},"allow-set-icon":{"identifier":"allow-set-icon","description":"Enables the set_icon command without any pre-configured scope.","commands":{"allow":["set_icon"],"deny":[]}},"allow-set-ignore-cursor-events":{"identifier":"allow-set-ignore-cursor-events","description":"Enables the set_ignore_cursor_events command without any pre-configured scope.","commands":{"allow":["set_ignore_cursor_events"],"deny":[]}},"allow-set-max-size":{"identifier":"allow-set-max-size","description":"Enables the set_max_size command without any pre-configured scope.","commands":{"allow":["set_max_size"],"deny":[]}},"allow-set-maximizable":{"identifier":"allow-set-maximizable","description":"Enables the set_maximizable command without any pre-configured scope.","commands":{"allow":["set_maximizable"],"deny":[]}},"allow-set-min-size":{"identifier":"allow-set-min-size","description":"Enables the set_min_size command without any pre-configured scope.","commands":{"allow":["set_min_size"],"deny":[]}},"allow-set-minimizable":{"identifier":"allow-set-minimizable","description":"Enables the set_minimizable command without any pre-configured scope.","commands":{"allow":["set_minimizable"],"deny":[]}},"allow-set-overlay-icon":{"identifier":"allow-set-overlay-icon","description":"Enables the set_overlay_icon command without any pre-configured scope.","commands":{"allow":["set_overlay_icon"],"deny":[]}},"allow-set-position":{"identifier":"allow-set-position","description":"Enables the set_position command without any 
pre-configured scope.","commands":{"allow":["set_position"],"deny":[]}},"allow-set-progress-bar":{"identifier":"allow-set-progress-bar","description":"Enables the set_progress_bar command without any pre-configured scope.","commands":{"allow":["set_progress_bar"],"deny":[]}},"allow-set-resizable":{"identifier":"allow-set-resizable","description":"Enables the set_resizable command without any pre-configured scope.","commands":{"allow":["set_resizable"],"deny":[]}},"allow-set-shadow":{"identifier":"allow-set-shadow","description":"Enables the set_shadow command without any pre-configured scope.","commands":{"allow":["set_shadow"],"deny":[]}},"allow-set-simple-fullscreen":{"identifier":"allow-set-simple-fullscreen","description":"Enables the set_simple_fullscreen command without any pre-configured scope.","commands":{"allow":["set_simple_fullscreen"],"deny":[]}},"allow-set-size":{"identifier":"allow-set-size","description":"Enables the set_size command without any pre-configured scope.","commands":{"allow":["set_size"],"deny":[]}},"allow-set-size-constraints":{"identifier":"allow-set-size-constraints","description":"Enables the set_size_constraints command without any pre-configured scope.","commands":{"allow":["set_size_constraints"],"deny":[]}},"allow-set-skip-taskbar":{"identifier":"allow-set-skip-taskbar","description":"Enables the set_skip_taskbar command without any pre-configured scope.","commands":{"allow":["set_skip_taskbar"],"deny":[]}},"allow-set-theme":{"identifier":"allow-set-theme","description":"Enables the set_theme command without any pre-configured scope.","commands":{"allow":["set_theme"],"deny":[]}},"allow-set-title":{"identifier":"allow-set-title","description":"Enables the set_title command without any pre-configured scope.","commands":{"allow":["set_title"],"deny":[]}},"allow-set-title-bar-style":{"identifier":"allow-set-title-bar-style","description":"Enables the set_title_bar_style command without any pre-configured 
scope.","commands":{"allow":["set_title_bar_style"],"deny":[]}},"allow-set-visible-on-all-workspaces":{"identifier":"allow-set-visible-on-all-workspaces","description":"Enables the set_visible_on_all_workspaces command without any pre-configured scope.","commands":{"allow":["set_visible_on_all_workspaces"],"deny":[]}},"allow-show":{"identifier":"allow-show","description":"Enables the show command without any pre-configured scope.","commands":{"allow":["show"],"deny":[]}},"allow-start-dragging":{"identifier":"allow-start-dragging","description":"Enables the start_dragging command without any pre-configured scope.","commands":{"allow":["start_dragging"],"deny":[]}},"allow-start-resize-dragging":{"identifier":"allow-start-resize-dragging","description":"Enables the start_resize_dragging command without any pre-configured scope.","commands":{"allow":["start_resize_dragging"],"deny":[]}},"allow-theme":{"identifier":"allow-theme","description":"Enables the theme command without any pre-configured scope.","commands":{"allow":["theme"],"deny":[]}},"allow-title":{"identifier":"allow-title","description":"Enables the title command without any pre-configured scope.","commands":{"allow":["title"],"deny":[]}},"allow-toggle-maximize":{"identifier":"allow-toggle-maximize","description":"Enables the toggle_maximize command without any pre-configured scope.","commands":{"allow":["toggle_maximize"],"deny":[]}},"allow-unmaximize":{"identifier":"allow-unmaximize","description":"Enables the unmaximize command without any pre-configured scope.","commands":{"allow":["unmaximize"],"deny":[]}},"allow-unminimize":{"identifier":"allow-unminimize","description":"Enables the unminimize command without any pre-configured scope.","commands":{"allow":["unminimize"],"deny":[]}},"deny-available-monitors":{"identifier":"deny-available-monitors","description":"Denies the available_monitors command without any pre-configured 
scope.","commands":{"allow":[],"deny":["available_monitors"]}},"deny-center":{"identifier":"deny-center","description":"Denies the center command without any pre-configured scope.","commands":{"allow":[],"deny":["center"]}},"deny-close":{"identifier":"deny-close","description":"Denies the close command without any pre-configured scope.","commands":{"allow":[],"deny":["close"]}},"deny-create":{"identifier":"deny-create","description":"Denies the create command without any pre-configured scope.","commands":{"allow":[],"deny":["create"]}},"deny-current-monitor":{"identifier":"deny-current-monitor","description":"Denies the current_monitor command without any pre-configured scope.","commands":{"allow":[],"deny":["current_monitor"]}},"deny-cursor-position":{"identifier":"deny-cursor-position","description":"Denies the cursor_position command without any pre-configured scope.","commands":{"allow":[],"deny":["cursor_position"]}},"deny-destroy":{"identifier":"deny-destroy","description":"Denies the destroy command without any pre-configured scope.","commands":{"allow":[],"deny":["destroy"]}},"deny-get-all-windows":{"identifier":"deny-get-all-windows","description":"Denies the get_all_windows command without any pre-configured scope.","commands":{"allow":[],"deny":["get_all_windows"]}},"deny-hide":{"identifier":"deny-hide","description":"Denies the hide command without any pre-configured scope.","commands":{"allow":[],"deny":["hide"]}},"deny-inner-position":{"identifier":"deny-inner-position","description":"Denies the inner_position command without any pre-configured scope.","commands":{"allow":[],"deny":["inner_position"]}},"deny-inner-size":{"identifier":"deny-inner-size","description":"Denies the inner_size command without any pre-configured scope.","commands":{"allow":[],"deny":["inner_size"]}},"deny-internal-toggle-maximize":{"identifier":"deny-internal-toggle-maximize","description":"Denies the internal_toggle_maximize command without any pre-configured 
scope.","commands":{"allow":[],"deny":["internal_toggle_maximize"]}},"deny-is-always-on-top":{"identifier":"deny-is-always-on-top","description":"Denies the is_always_on_top command without any pre-configured scope.","commands":{"allow":[],"deny":["is_always_on_top"]}},"deny-is-closable":{"identifier":"deny-is-closable","description":"Denies the is_closable command without any pre-configured scope.","commands":{"allow":[],"deny":["is_closable"]}},"deny-is-decorated":{"identifier":"deny-is-decorated","description":"Denies the is_decorated command without any pre-configured scope.","commands":{"allow":[],"deny":["is_decorated"]}},"deny-is-enabled":{"identifier":"deny-is-enabled","description":"Denies the is_enabled command without any pre-configured scope.","commands":{"allow":[],"deny":["is_enabled"]}},"deny-is-focused":{"identifier":"deny-is-focused","description":"Denies the is_focused command without any pre-configured scope.","commands":{"allow":[],"deny":["is_focused"]}},"deny-is-fullscreen":{"identifier":"deny-is-fullscreen","description":"Denies the is_fullscreen command without any pre-configured scope.","commands":{"allow":[],"deny":["is_fullscreen"]}},"deny-is-maximizable":{"identifier":"deny-is-maximizable","description":"Denies the is_maximizable command without any pre-configured scope.","commands":{"allow":[],"deny":["is_maximizable"]}},"deny-is-maximized":{"identifier":"deny-is-maximized","description":"Denies the is_maximized command without any pre-configured scope.","commands":{"allow":[],"deny":["is_maximized"]}},"deny-is-minimizable":{"identifier":"deny-is-minimizable","description":"Denies the is_minimizable command without any pre-configured scope.","commands":{"allow":[],"deny":["is_minimizable"]}},"deny-is-minimized":{"identifier":"deny-is-minimized","description":"Denies the is_minimized command without any pre-configured 
scope.","commands":{"allow":[],"deny":["is_minimized"]}},"deny-is-resizable":{"identifier":"deny-is-resizable","description":"Denies the is_resizable command without any pre-configured scope.","commands":{"allow":[],"deny":["is_resizable"]}},"deny-is-visible":{"identifier":"deny-is-visible","description":"Denies the is_visible command without any pre-configured scope.","commands":{"allow":[],"deny":["is_visible"]}},"deny-maximize":{"identifier":"deny-maximize","description":"Denies the maximize command without any pre-configured scope.","commands":{"allow":[],"deny":["maximize"]}},"deny-minimize":{"identifier":"deny-minimize","description":"Denies the minimize command without any pre-configured scope.","commands":{"allow":[],"deny":["minimize"]}},"deny-monitor-from-point":{"identifier":"deny-monitor-from-point","description":"Denies the monitor_from_point command without any pre-configured scope.","commands":{"allow":[],"deny":["monitor_from_point"]}},"deny-outer-position":{"identifier":"deny-outer-position","description":"Denies the outer_position command without any pre-configured scope.","commands":{"allow":[],"deny":["outer_position"]}},"deny-outer-size":{"identifier":"deny-outer-size","description":"Denies the outer_size command without any pre-configured scope.","commands":{"allow":[],"deny":["outer_size"]}},"deny-primary-monitor":{"identifier":"deny-primary-monitor","description":"Denies the primary_monitor command without any pre-configured scope.","commands":{"allow":[],"deny":["primary_monitor"]}},"deny-request-user-attention":{"identifier":"deny-request-user-attention","description":"Denies the request_user_attention command without any pre-configured scope.","commands":{"allow":[],"deny":["request_user_attention"]}},"deny-scale-factor":{"identifier":"deny-scale-factor","description":"Denies the scale_factor command without any pre-configured 
scope.","commands":{"allow":[],"deny":["scale_factor"]}},"deny-set-always-on-bottom":{"identifier":"deny-set-always-on-bottom","description":"Denies the set_always_on_bottom command without any pre-configured scope.","commands":{"allow":[],"deny":["set_always_on_bottom"]}},"deny-set-always-on-top":{"identifier":"deny-set-always-on-top","description":"Denies the set_always_on_top command without any pre-configured scope.","commands":{"allow":[],"deny":["set_always_on_top"]}},"deny-set-background-color":{"identifier":"deny-set-background-color","description":"Denies the set_background_color command without any pre-configured scope.","commands":{"allow":[],"deny":["set_background_color"]}},"deny-set-badge-count":{"identifier":"deny-set-badge-count","description":"Denies the set_badge_count command without any pre-configured scope.","commands":{"allow":[],"deny":["set_badge_count"]}},"deny-set-badge-label":{"identifier":"deny-set-badge-label","description":"Denies the set_badge_label command without any pre-configured scope.","commands":{"allow":[],"deny":["set_badge_label"]}},"deny-set-closable":{"identifier":"deny-set-closable","description":"Denies the set_closable command without any pre-configured scope.","commands":{"allow":[],"deny":["set_closable"]}},"deny-set-content-protected":{"identifier":"deny-set-content-protected","description":"Denies the set_content_protected command without any pre-configured scope.","commands":{"allow":[],"deny":["set_content_protected"]}},"deny-set-cursor-grab":{"identifier":"deny-set-cursor-grab","description":"Denies the set_cursor_grab command without any pre-configured scope.","commands":{"allow":[],"deny":["set_cursor_grab"]}},"deny-set-cursor-icon":{"identifier":"deny-set-cursor-icon","description":"Denies the set_cursor_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["set_cursor_icon"]}},"deny-set-cursor-position":{"identifier":"deny-set-cursor-position","description":"Denies the 
set_cursor_position command without any pre-configured scope.","commands":{"allow":[],"deny":["set_cursor_position"]}},"deny-set-cursor-visible":{"identifier":"deny-set-cursor-visible","description":"Denies the set_cursor_visible command without any pre-configured scope.","commands":{"allow":[],"deny":["set_cursor_visible"]}},"deny-set-decorations":{"identifier":"deny-set-decorations","description":"Denies the set_decorations command without any pre-configured scope.","commands":{"allow":[],"deny":["set_decorations"]}},"deny-set-effects":{"identifier":"deny-set-effects","description":"Denies the set_effects command without any pre-configured scope.","commands":{"allow":[],"deny":["set_effects"]}},"deny-set-enabled":{"identifier":"deny-set-enabled","description":"Denies the set_enabled command without any pre-configured scope.","commands":{"allow":[],"deny":["set_enabled"]}},"deny-set-focus":{"identifier":"deny-set-focus","description":"Denies the set_focus command without any pre-configured scope.","commands":{"allow":[],"deny":["set_focus"]}},"deny-set-focusable":{"identifier":"deny-set-focusable","description":"Denies the set_focusable command without any pre-configured scope.","commands":{"allow":[],"deny":["set_focusable"]}},"deny-set-fullscreen":{"identifier":"deny-set-fullscreen","description":"Denies the set_fullscreen command without any pre-configured scope.","commands":{"allow":[],"deny":["set_fullscreen"]}},"deny-set-icon":{"identifier":"deny-set-icon","description":"Denies the set_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["set_icon"]}},"deny-set-ignore-cursor-events":{"identifier":"deny-set-ignore-cursor-events","description":"Denies the set_ignore_cursor_events command without any pre-configured scope.","commands":{"allow":[],"deny":["set_ignore_cursor_events"]}},"deny-set-max-size":{"identifier":"deny-set-max-size","description":"Denies the set_max_size command without any pre-configured 
scope.","commands":{"allow":[],"deny":["set_max_size"]}},"deny-set-maximizable":{"identifier":"deny-set-maximizable","description":"Denies the set_maximizable command without any pre-configured scope.","commands":{"allow":[],"deny":["set_maximizable"]}},"deny-set-min-size":{"identifier":"deny-set-min-size","description":"Denies the set_min_size command without any pre-configured scope.","commands":{"allow":[],"deny":["set_min_size"]}},"deny-set-minimizable":{"identifier":"deny-set-minimizable","description":"Denies the set_minimizable command without any pre-configured scope.","commands":{"allow":[],"deny":["set_minimizable"]}},"deny-set-overlay-icon":{"identifier":"deny-set-overlay-icon","description":"Denies the set_overlay_icon command without any pre-configured scope.","commands":{"allow":[],"deny":["set_overlay_icon"]}},"deny-set-position":{"identifier":"deny-set-position","description":"Denies the set_position command without any pre-configured scope.","commands":{"allow":[],"deny":["set_position"]}},"deny-set-progress-bar":{"identifier":"deny-set-progress-bar","description":"Denies the set_progress_bar command without any pre-configured scope.","commands":{"allow":[],"deny":["set_progress_bar"]}},"deny-set-resizable":{"identifier":"deny-set-resizable","description":"Denies the set_resizable command without any pre-configured scope.","commands":{"allow":[],"deny":["set_resizable"]}},"deny-set-shadow":{"identifier":"deny-set-shadow","description":"Denies the set_shadow command without any pre-configured scope.","commands":{"allow":[],"deny":["set_shadow"]}},"deny-set-simple-fullscreen":{"identifier":"deny-set-simple-fullscreen","description":"Denies the set_simple_fullscreen command without any pre-configured scope.","commands":{"allow":[],"deny":["set_simple_fullscreen"]}},"deny-set-size":{"identifier":"deny-set-size","description":"Denies the set_size command without any pre-configured 
scope.","commands":{"allow":[],"deny":["set_size"]}},"deny-set-size-constraints":{"identifier":"deny-set-size-constraints","description":"Denies the set_size_constraints command without any pre-configured scope.","commands":{"allow":[],"deny":["set_size_constraints"]}},"deny-set-skip-taskbar":{"identifier":"deny-set-skip-taskbar","description":"Denies the set_skip_taskbar command without any pre-configured scope.","commands":{"allow":[],"deny":["set_skip_taskbar"]}},"deny-set-theme":{"identifier":"deny-set-theme","description":"Denies the set_theme command without any pre-configured scope.","commands":{"allow":[],"deny":["set_theme"]}},"deny-set-title":{"identifier":"deny-set-title","description":"Denies the set_title command without any pre-configured scope.","commands":{"allow":[],"deny":["set_title"]}},"deny-set-title-bar-style":{"identifier":"deny-set-title-bar-style","description":"Denies the set_title_bar_style command without any pre-configured scope.","commands":{"allow":[],"deny":["set_title_bar_style"]}},"deny-set-visible-on-all-workspaces":{"identifier":"deny-set-visible-on-all-workspaces","description":"Denies the set_visible_on_all_workspaces command without any pre-configured scope.","commands":{"allow":[],"deny":["set_visible_on_all_workspaces"]}},"deny-show":{"identifier":"deny-show","description":"Denies the show command without any pre-configured scope.","commands":{"allow":[],"deny":["show"]}},"deny-start-dragging":{"identifier":"deny-start-dragging","description":"Denies the start_dragging command without any pre-configured scope.","commands":{"allow":[],"deny":["start_dragging"]}},"deny-start-resize-dragging":{"identifier":"deny-start-resize-dragging","description":"Denies the start_resize_dragging command without any pre-configured scope.","commands":{"allow":[],"deny":["start_resize_dragging"]}},"deny-theme":{"identifier":"deny-theme","description":"Denies the theme command without any pre-configured 
scope.","commands":{"allow":[],"deny":["theme"]}},"deny-title":{"identifier":"deny-title","description":"Denies the title command without any pre-configured scope.","commands":{"allow":[],"deny":["title"]}},"deny-toggle-maximize":{"identifier":"deny-toggle-maximize","description":"Denies the toggle_maximize command without any pre-configured scope.","commands":{"allow":[],"deny":["toggle_maximize"]}},"deny-unmaximize":{"identifier":"deny-unmaximize","description":"Denies the unmaximize command without any pre-configured scope.","commands":{"allow":[],"deny":["unmaximize"]}},"deny-unminimize":{"identifier":"deny-unminimize","description":"Denies the unminimize command without any pre-configured scope.","commands":{"allow":[],"deny":["unminimize"]}}},"permission_sets":{},"global_scope_schema":null},"shell":{"default_permission":{"identifier":"default","description":"This permission set configures which\nshell functionality is exposed by default.\n\n#### Granted Permissions\n\nIt allows to use the `open` functionality with a reasonable\nscope pre-configured. 
It will allow opening `http(s)://`,\n`tel:` and `mailto:` links.\n","permissions":["allow-open"]},"permissions":{"allow-execute":{"identifier":"allow-execute","description":"Enables the execute command without any pre-configured scope.","commands":{"allow":["execute"],"deny":[]}},"allow-kill":{"identifier":"allow-kill","description":"Enables the kill command without any pre-configured scope.","commands":{"allow":["kill"],"deny":[]}},"allow-open":{"identifier":"allow-open","description":"Enables the open command without any pre-configured scope.","commands":{"allow":["open"],"deny":[]}},"allow-spawn":{"identifier":"allow-spawn","description":"Enables the spawn command without any pre-configured scope.","commands":{"allow":["spawn"],"deny":[]}},"allow-stdin-write":{"identifier":"allow-stdin-write","description":"Enables the stdin_write command without any pre-configured scope.","commands":{"allow":["stdin_write"],"deny":[]}},"deny-execute":{"identifier":"deny-execute","description":"Denies the execute command without any pre-configured scope.","commands":{"allow":[],"deny":["execute"]}},"deny-kill":{"identifier":"deny-kill","description":"Denies the kill command without any pre-configured scope.","commands":{"allow":[],"deny":["kill"]}},"deny-open":{"identifier":"deny-open","description":"Denies the open command without any pre-configured scope.","commands":{"allow":[],"deny":["open"]}},"deny-spawn":{"identifier":"deny-spawn","description":"Denies the spawn command without any pre-configured scope.","commands":{"allow":[],"deny":["spawn"]}},"deny-stdin-write":{"identifier":"deny-stdin-write","description":"Denies the stdin_write command without any pre-configured scope.","commands":{"allow":[],"deny":["stdin_write"]}}},"permission_sets":{},"global_scope_schema":{"$schema":"http://json-schema.org/draft-07/schema#","anyOf":[{"additionalProperties":false,"properties":{"args":{"allOf":[{"$ref":"#/definitions/ShellScopeEntryAllowedArgs"}],"description":"The allowed 
arguments for the command execution."},"cmd":{"description":"The command name. It can start with a variable that resolves to a system base directory. The variables are: `$AUDIO`, `$CACHE`, `$CONFIG`, `$DATA`, `$LOCALDATA`, `$DESKTOP`, `$DOCUMENT`, `$DOWNLOAD`, `$EXE`, `$FONT`, `$HOME`, `$PICTURE`, `$PUBLIC`, `$RUNTIME`, `$TEMPLATE`, `$VIDEO`, `$RESOURCE`, `$LOG`, `$TEMP`, `$APPCONFIG`, `$APPDATA`, `$APPLOCALDATA`, `$APPCACHE`, `$APPLOG`.","type":"string"},"name":{"description":"The name for this allowed shell command configuration.\n\nThis name will be used inside of the webview API to call this command along with any specified arguments.","type":"string"}},"required":["cmd","name"],"type":"object"},{"additionalProperties":false,"properties":{"args":{"allOf":[{"$ref":"#/definitions/ShellScopeEntryAllowedArgs"}],"description":"The allowed arguments for the command execution."},"name":{"description":"The name for this allowed shell command configuration.\n\nThis name will be used inside of the webview API to call this command along with any specified arguments.","type":"string"},"sidecar":{"description":"If this command is a sidecar command.","type":"boolean"}},"required":["name","sidecar"],"type":"object"}],"definitions":{"ShellScopeEntryAllowedArg":{"anyOf":[{"description":"A non-configurable argument that is passed to the command in the order it was specified.","type":"string"},{"additionalProperties":false,"description":"A variable that is set while calling the command from the webview API.","properties":{"raw":{"default":false,"description":"Marks the validator as a raw regex, meaning the plugin should not make any modification at runtime.\n\nThis means the regex will not match on the entire string by default, which might be exploited if your regex allow unexpected input to be considered valid. 
When using this option, make sure your regex is correct.","type":"boolean"},"validator":{"description":"[regex] validator to require passed values to conform to an expected input.\n\nThis will require the argument value passed to this variable to match the `validator` regex before it will be executed.\n\nThe regex string is by default surrounded by `^...$` to match the full string. For example the `https?://\\w+` regex would be registered as `^https?://\\w+$`.\n\n[regex]: ","type":"string"}},"required":["validator"],"type":"object"}],"description":"A command argument allowed to be executed by the webview API."},"ShellScopeEntryAllowedArgs":{"anyOf":[{"description":"Use a simple boolean to allow all or disable all arguments to this command configuration.","type":"boolean"},{"description":"A specific set of [`ShellScopeEntryAllowedArg`] that are valid to call for the command configuration.","items":{"$ref":"#/definitions/ShellScopeEntryAllowedArg"},"type":"array"}],"description":"A set of command arguments allowed to be executed by the webview API.\n\nA value of `true` will allow any arguments to be passed to the command. `false` will disable all arguments. 
A list of [`ShellScopeEntryAllowedArg`] will set those arguments as the only valid arguments to be passed to the attached command configuration."}},"description":"Shell scope entry.","title":"ShellScopeEntry"}}} \ No newline at end of file diff --git a/foundry/packages/desktop/src-tauri/gen/schemas/desktop-schema.json b/foundry/packages/desktop/src-tauri/gen/schemas/desktop-schema.json index 34f0a61..f827fe1 100644 --- a/foundry/packages/desktop/src-tauri/gen/schemas/desktop-schema.json +++ b/foundry/packages/desktop/src-tauri/gen/schemas/desktop-schema.json @@ -21,7 +21,9 @@ { "description": "A list of capabilities.", "type": "object", - "required": ["capabilities"], + "required": [ + "capabilities" + ], "properties": { "capabilities": { "description": "The list of capabilities.", @@ -37,7 +39,10 @@ "Capability": { "description": "A grouping and boundary mechanism developers can use to isolate access to the IPC layer.\n\nIt controls application windows' and webviews' fine grained access to the Tauri core, application, or plugin commands. If a webview or its window is not matching any capability then it has no access to the IPC layer at all.\n\nThis can be done to create groups of windows, based on their required system access, which can reduce impact of frontend vulnerabilities in less privileged windows. Windows can be added to a capability by exact name (e.g. `main-window`) or glob patterns like `*` or `admin-*`. 
A Window can have none, one, or multiple associated capabilities.\n\n## Example\n\n```json { \"identifier\": \"main-user-files-write\", \"description\": \"This capability allows the `main` window on macOS and Windows access to `filesystem` write related commands and `dialog` commands to enable programmatic access to files selected by the user.\", \"windows\": [ \"main\" ], \"permissions\": [ \"core:default\", \"dialog:open\", { \"identifier\": \"fs:allow-write-text-file\", \"allow\": [{ \"path\": \"$HOME/test.txt\" }] }, ], \"platforms\": [\"macOS\",\"windows\"] } ```", "type": "object", - "required": ["identifier", "permissions"], + "required": [ + "identifier", + "permissions" + ], "properties": { "identifier": { "description": "Identifier of the capability.\n\n## Example\n\n`main-user-files-write`", @@ -88,7 +93,10 @@ }, "platforms": { "description": "Limit which target platforms this capability applies to.\n\nBy default all platforms are targeted.\n\n## Example\n\n`[\"macOS\",\"windows\"]`", - "type": ["array", "null"], + "type": [ + "array", + "null" + ], "items": { "$ref": "#/definitions/Target" } @@ -98,7 +106,9 @@ "CapabilityRemote": { "description": "Configuration for remote URLs that are associated with the capability.", "type": "object", - "required": ["urls"], + "required": [ + "urls" + ], "properties": { "urls": { "description": "Remote domains this capability refers to using the [URLPattern standard](https://urlpattern.spec.whatwg.org/).\n\n## Examples\n\n- \"https://*.mydomain.dev\": allows subdomains of mydomain.dev - \"https://mydomain.dev/api/*\": allows any subpath of mydomain.dev/api", @@ -208,7 +218,10 @@ "anyOf": [ { "type": "object", - "required": ["cmd", "name"], + "required": [ + "cmd", + "name" + ], "properties": { "args": { "description": "The allowed arguments for the command execution.", @@ -231,7 +244,10 @@ }, { "type": "object", - "required": ["name", "sidecar"], + "required": [ + "name", + "sidecar" + ], "properties": { "args": { 
"description": "The allowed arguments for the command execution.", @@ -262,7 +278,10 @@ "anyOf": [ { "type": "object", - "required": ["cmd", "name"], + "required": [ + "cmd", + "name" + ], "properties": { "args": { "description": "The allowed arguments for the command execution.", @@ -285,7 +304,10 @@ }, { "type": "object", - "required": ["name", "sidecar"], + "required": [ + "name", + "sidecar" + ], "properties": { "args": { "description": "The allowed arguments for the command execution.", @@ -334,14 +356,20 @@ }, "allow": { "description": "Data that defines what is allowed by the scope.", - "type": ["array", "null"], + "type": [ + "array", + "null" + ], "items": { "$ref": "#/definitions/Value" } }, "deny": { "description": "Data that defines what is denied by the scope. This should be prioritized by validation logic.", - "type": ["array", "null"], + "type": [ + "array", + "null" + ], "items": { "$ref": "#/definitions/Value" } @@ -349,7 +377,9 @@ } } ], - "required": ["identifier"] + "required": [ + "identifier" + ] } ] }, @@ -1815,10 +1845,10 @@ "markdownDescription": "Enables the set_title_bar_style command without any pre-configured scope." }, { - "description": "Enables the set_visible_on_all_organizations command without any pre-configured scope.", + "description": "Enables the set_visible_on_all_workspaces command without any pre-configured scope.", "type": "string", - "const": "core:window:allow-set-visible-on-all-organizations", - "markdownDescription": "Enables the set_visible_on_all_organizations command without any pre-configured scope." + "const": "core:window:allow-set-visible-on-all-workspaces", + "markdownDescription": "Enables the set_visible_on_all_workspaces command without any pre-configured scope." }, { "description": "Enables the show command without any pre-configured scope.", @@ -2271,10 +2301,10 @@ "markdownDescription": "Denies the set_title_bar_style command without any pre-configured scope." 
     },
     {
-      "description": "Denies the set_visible_on_all_organizations command without any pre-configured scope.",
+      "description": "Denies the set_visible_on_all_workspaces command without any pre-configured scope.",
       "type": "string",
-      "const": "core:window:deny-set-visible-on-all-organizations",
-      "markdownDescription": "Denies the set_visible_on_all_organizations command without any pre-configured scope."
+      "const": "core:window:deny-set-visible-on-all-workspaces",
+      "markdownDescription": "Denies the set_visible_on_all_workspaces command without any pre-configured scope."
     },
     {
       "description": "Denies the show command without any pre-configured scope.",
@@ -2452,27 +2482,37 @@
     {
       "description": "MacOS.",
       "type": "string",
-      "enum": ["macOS"]
+      "enum": [
+        "macOS"
+      ]
     },
     {
       "description": "Windows.",
       "type": "string",
-      "enum": ["windows"]
+      "enum": [
+        "windows"
+      ]
     },
     {
       "description": "Linux.",
       "type": "string",
-      "enum": ["linux"]
+      "enum": [
+        "linux"
+      ]
     },
     {
       "description": "Android.",
       "type": "string",
-      "enum": ["android"]
+      "enum": [
+        "android"
+      ]
     },
     {
       "description": "iOS.",
       "type": "string",
-      "enum": ["iOS"]
+      "enum": [
+        "iOS"
+      ]
     }
   ]
 },
@@ -2486,7 +2526,9 @@
   {
     "description": "A variable that is set while calling the command from the webview API.",
     "type": "object",
-    "required": ["validator"],
+    "required": [
+      "validator"
+    ],
     "properties": {
       "raw": {
         "description": "Marks the validator as a raw regex, meaning the plugin should not make any modification at runtime.\n\nThis means the regex will not match on the entire string by default, which might be exploited if your regex allow unexpected input to be considered valid. When using this option, make sure your regex is correct.",
@@ -2519,4 +2561,4 @@
     ]
   }
 }
-}
+}
\ No newline at end of file
diff --git a/foundry/packages/desktop/src-tauri/gen/schemas/macOS-schema.json b/foundry/packages/desktop/src-tauri/gen/schemas/macOS-schema.json
index 34f0a61..f827fe1 100644
--- a/foundry/packages/desktop/src-tauri/gen/schemas/macOS-schema.json
+++ b/foundry/packages/desktop/src-tauri/gen/schemas/macOS-schema.json
@@ -21,7 +21,9 @@
     {
       "description": "A list of capabilities.",
       "type": "object",
-      "required": ["capabilities"],
+      "required": [
+        "capabilities"
+      ],
       "properties": {
         "capabilities": {
           "description": "The list of capabilities.",
@@ -37,7 +39,10 @@
     "Capability": {
       "description": "A grouping and boundary mechanism developers can use to isolate access to the IPC layer.\n\nIt controls application windows' and webviews' fine grained access to the Tauri core, application, or plugin commands. If a webview or its window is not matching any capability then it has no access to the IPC layer at all.\n\nThis can be done to create groups of windows, based on their required system access, which can reduce impact of frontend vulnerabilities in less privileged windows. Windows can be added to a capability by exact name (e.g. `main-window`) or glob patterns like `*` or `admin-*`. A Window can have none, one, or multiple associated capabilities.\n\n## Example\n\n```json { \"identifier\": \"main-user-files-write\", \"description\": \"This capability allows the `main` window on macOS and Windows access to `filesystem` write related commands and `dialog` commands to enable programmatic access to files selected by the user.\", \"windows\": [ \"main\" ], \"permissions\": [ \"core:default\", \"dialog:open\", { \"identifier\": \"fs:allow-write-text-file\", \"allow\": [{ \"path\": \"$HOME/test.txt\" }] }, ], \"platforms\": [\"macOS\",\"windows\"] } ```",
       "type": "object",
-      "required": ["identifier", "permissions"],
+      "required": [
+        "identifier",
+        "permissions"
+      ],
       "properties": {
         "identifier": {
           "description": "Identifier of the capability.\n\n## Example\n\n`main-user-files-write`",
@@ -88,7 +93,10 @@
         },
         "platforms": {
           "description": "Limit which target platforms this capability applies to.\n\nBy default all platforms are targeted.\n\n## Example\n\n`[\"macOS\",\"windows\"]`",
-          "type": ["array", "null"],
+          "type": [
+            "array",
+            "null"
+          ],
           "items": {
             "$ref": "#/definitions/Target"
           }
@@ -98,7 +106,9 @@
     "CapabilityRemote": {
       "description": "Configuration for remote URLs that are associated with the capability.",
       "type": "object",
-      "required": ["urls"],
+      "required": [
+        "urls"
+      ],
       "properties": {
         "urls": {
           "description": "Remote domains this capability refers to using the [URLPattern standard](https://urlpattern.spec.whatwg.org/).\n\n## Examples\n\n- \"https://*.mydomain.dev\": allows subdomains of mydomain.dev - \"https://mydomain.dev/api/*\": allows any subpath of mydomain.dev/api",
@@ -208,7 +218,10 @@
         "anyOf": [
           {
             "type": "object",
-            "required": ["cmd", "name"],
+            "required": [
+              "cmd",
+              "name"
+            ],
             "properties": {
               "args": {
                 "description": "The allowed arguments for the command execution.",
@@ -231,7 +244,10 @@
           },
           {
             "type": "object",
-            "required": ["name", "sidecar"],
+            "required": [
+              "name",
+              "sidecar"
+            ],
             "properties": {
               "args": {
                "description": "The allowed arguments for the command execution.",
@@ -262,7 +278,10 @@
         "anyOf": [
           {
             "type": "object",
-            "required": ["cmd", "name"],
+            "required": [
+              "cmd",
+              "name"
+            ],
             "properties": {
               "args": {
                 "description": "The allowed arguments for the command execution.",
@@ -285,7 +304,10 @@
           },
           {
             "type": "object",
-            "required": ["name", "sidecar"],
+            "required": [
+              "name",
+              "sidecar"
+            ],
             "properties": {
               "args": {
                 "description": "The allowed arguments for the command execution.",
@@ -334,14 +356,20 @@
           },
           "allow": {
             "description": "Data that defines what is allowed by the scope.",
-            "type": ["array", "null"],
+            "type": [
+              "array",
+              "null"
+            ],
             "items": {
               "$ref": "#/definitions/Value"
             }
           },
           "deny": {
             "description": "Data that defines what is denied by the scope. This should be prioritized by validation logic.",
-            "type": ["array", "null"],
+            "type": [
+              "array",
+              "null"
+            ],
             "items": {
               "$ref": "#/definitions/Value"
             }
@@ -349,7 +377,9 @@
           }
         }
       ],
-      "required": ["identifier"]
+      "required": [
+        "identifier"
+      ]
     }
   ]
 },
@@ -1815,10 +1845,10 @@
       "markdownDescription": "Enables the set_title_bar_style command without any pre-configured scope."
     },
     {
-      "description": "Enables the set_visible_on_all_organizations command without any pre-configured scope.",
+      "description": "Enables the set_visible_on_all_workspaces command without any pre-configured scope.",
       "type": "string",
-      "const": "core:window:allow-set-visible-on-all-organizations",
-      "markdownDescription": "Enables the set_visible_on_all_organizations command without any pre-configured scope."
+      "const": "core:window:allow-set-visible-on-all-workspaces",
+      "markdownDescription": "Enables the set_visible_on_all_workspaces command without any pre-configured scope."
     },
     {
       "description": "Enables the show command without any pre-configured scope.",
@@ -2271,10 +2301,10 @@
       "markdownDescription": "Denies the set_title_bar_style command without any pre-configured scope."
     },
     {
-      "description": "Denies the set_visible_on_all_organizations command without any pre-configured scope.",
+      "description": "Denies the set_visible_on_all_workspaces command without any pre-configured scope.",
       "type": "string",
-      "const": "core:window:deny-set-visible-on-all-organizations",
-      "markdownDescription": "Denies the set_visible_on_all_organizations command without any pre-configured scope."
+      "const": "core:window:deny-set-visible-on-all-workspaces",
+      "markdownDescription": "Denies the set_visible_on_all_workspaces command without any pre-configured scope."
     },
     {
       "description": "Denies the show command without any pre-configured scope.",
@@ -2452,27 +2482,37 @@
     {
       "description": "MacOS.",
       "type": "string",
-      "enum": ["macOS"]
+      "enum": [
+        "macOS"
+      ]
     },
     {
       "description": "Windows.",
       "type": "string",
-      "enum": ["windows"]
+      "enum": [
+        "windows"
+      ]
     },
     {
       "description": "Linux.",
       "type": "string",
-      "enum": ["linux"]
+      "enum": [
+        "linux"
+      ]
     },
     {
       "description": "Android.",
       "type": "string",
-      "enum": ["android"]
+      "enum": [
+        "android"
+      ]
    },
     {
       "description": "iOS.",
       "type": "string",
-      "enum": ["iOS"]
+      "enum": [
+        "iOS"
+      ]
     }
   ]
 },
@@ -2486,7 +2526,9 @@
   {
     "description": "A variable that is set while calling the command from the webview API.",
     "type": "object",
-    "required": ["validator"],
+    "required": [
+      "validator"
+    ],
     "properties": {
       "raw": {
         "description": "Marks the validator as a raw regex, meaning the plugin should not make any modification at runtime.\n\nThis means the regex will not match on the entire string by default, which might be exploited if your regex allow unexpected input to be considered valid. When using this option, make sure your regex is correct.",
@@ -2519,4 +2561,4 @@
     ]
   }
 }
-}
+}
\ No newline at end of file
diff --git a/foundry/packages/frontend/package.json b/foundry/packages/frontend/package.json
index 793a12d..6a2e3c4 100644
--- a/foundry/packages/frontend/package.json
+++ b/foundry/packages/frontend/package.json
@@ -10,12 +10,11 @@
     "test": "vitest run"
   },
   "dependencies": {
+    "@sandbox-agent/react": "workspace:*",
     "@sandbox-agent/foundry-client": "workspace:*",
     "@sandbox-agent/foundry-shared": "workspace:*",
-    "@sandbox-agent/react": "workspace:*",
     "@tanstack/react-query": "^5.85.5",
     "@tanstack/react-router": "^1.132.23",
-    "@tanstack/react-virtual": "^3.13.22",
     "baseui": "^16.1.1",
     "lucide-react": "^0.542.0",
     "react": "^19.1.1",
diff --git a/foundry/packages/frontend/src/app/router.tsx b/foundry/packages/frontend/src/app/router.tsx
index dd22724..8ee0855 100644
--- a/foundry/packages/frontend/src/app/router.tsx
+++ b/foundry/packages/frontend/src/app/router.tsx
@@ -1,6 +1,6 @@
 import { type ReactNode, useEffect } from "react";
 import type { FoundryBillingPlanId } from "@sandbox-agent/foundry-shared";
-import { useSubscription } from "@sandbox-agent/foundry-client";
+import { useInterest } from "@sandbox-agent/foundry-client";
 import { Navigate, Outlet, createRootRoute, createRoute, createRouter } from "@tanstack/react-router";
 import { MockLayout } from "../components/mock-layout";
 import {
@@ -11,8 +11,8 @@
   MockOrganizationSettingsPage,
   MockSignInPage,
 } from "../components/mock-onboarding";
-import { defaultOrganizationId, isMockFrontendClient } from "../lib/env";
-import { subscriptionManager } from "../lib/subscription";
+import { defaultWorkspaceId, isMockFrontendClient } from "../lib/env";
+import { interestManager } from "../lib/interest";
 import { activeMockOrganization, getMockOrganizationById, isAppSnapshotBootstrapping, useMockAppClient, useMockAppSnapshot } from "../lib/mock-app";
 
 const rootRoute = createRootRoute({
@@
-61,20 +61,20 @@ const organizationCheckoutRoute = createRoute({ component: OrganizationCheckoutRoute, }); -const organizationRoute = createRoute({ +const workspaceRoute = createRoute({ getParentRoute: () => rootRoute, - path: "/organizations/$organizationId", - component: OrganizationLayoutRoute, + path: "/workspaces/$workspaceId", + component: WorkspaceLayoutRoute, }); -const organizationIndexRoute = createRoute({ - getParentRoute: () => organizationRoute, +const workspaceIndexRoute = createRoute({ + getParentRoute: () => workspaceRoute, path: "/", - component: OrganizationRoute, + component: WorkspaceRoute, }); const taskRoute = createRoute({ - getParentRoute: () => organizationRoute, + getParentRoute: () => workspaceRoute, path: "tasks/$taskId", validateSearch: (search: Record) => ({ sessionId: typeof search.sessionId === "string" && search.sessionId.trim().length > 0 ? search.sessionId : undefined, @@ -83,7 +83,7 @@ const taskRoute = createRoute({ }); const repoRoute = createRoute({ - getParentRoute: () => organizationRoute, + getParentRoute: () => workspaceRoute, path: "repos/$repoId", component: RepoRoute, }); @@ -96,7 +96,7 @@ const routeTree = rootRoute.addChildren([ organizationSettingsRoute, organizationBillingRoute, organizationCheckoutRoute, - organizationRoute.addChildren([organizationIndexRoute, taskRoute, repoRoute]), + workspaceRoute.addChildren([workspaceIndexRoute, taskRoute, repoRoute]), ]); export const router = createRouter({ routeTree }); @@ -107,7 +107,7 @@ declare module "@tanstack/react-router" { } } -function OrganizationLayoutRoute() { +function WorkspaceLayoutRoute() { return ; } @@ -142,7 +142,7 @@ function IndexRoute() { const activeOrganization = activeMockOrganization(snapshot); if (activeOrganization) { - return ; + return ; } return ; @@ -238,54 +238,54 @@ function OrganizationCheckoutRoute() { return ; } -function OrganizationRoute() { - const { organizationId } = organizationRoute.useParams(); +function WorkspaceRoute() { + 
const { workspaceId } = workspaceRoute.useParams(); return ( - - - + + + ); } -function OrganizationView({ - organizationId, +function WorkspaceView({ + workspaceId, selectedTaskId, selectedSessionId, }: { - organizationId: string; + workspaceId: string; selectedTaskId: string | null; selectedSessionId: string | null; }) { - return ; + return ; } function TaskRoute() { - const { organizationId, taskId } = taskRoute.useParams(); + const { workspaceId, taskId } = taskRoute.useParams(); const { sessionId } = taskRoute.useSearch(); return ( - - - + + + ); } -function TaskView({ organizationId, taskId, sessionId }: { organizationId: string; taskId: string; sessionId: string | null }) { - return ; +function TaskView({ workspaceId, taskId, sessionId }: { workspaceId: string; taskId: string; sessionId: string | null }) { + return ; } function RepoRoute() { - const { organizationId, repoId } = repoRoute.useParams(); + const { workspaceId, repoId } = repoRoute.useParams(); return ( - - - + + + ); } -function AppOrganizationGate({ organizationId, children }: { organizationId: string; children: ReactNode }) { +function AppWorkspaceGate({ workspaceId, children }: { workspaceId: string; children: ReactNode }) { const client = useMockAppClient(); const snapshot = useMockAppSnapshot(); - const organization = snapshot.organizations.find((candidate) => candidate.organizationId === organizationId) ?? null; + const organization = snapshot.organizations.find((candidate) => candidate.workspaceId === workspaceId) ?? 
null; useEffect(() => { if (organization && snapshot.activeOrganizationId !== organization.id) { @@ -294,7 +294,7 @@ function AppOrganizationGate({ organizationId, children }: { organizationId: str }, [client, organization, snapshot.activeOrganizationId]); if (!isMockFrontendClient && isAppSnapshotBootstrapping(snapshot)) { - return ; + return ; } if (snapshot.auth.status === "signed_out") { @@ -308,15 +308,13 @@ function AppOrganizationGate({ organizationId, children }: { organizationId: str return <>{children}; } -function RepoRouteInner({ organizationId, repoId }: { organizationId: string; repoId: string }) { - const organizationState = useSubscription(subscriptionManager, "organization", { organizationId }); - const activeTaskId = organizationState.data?.taskSummaries.find((task) => task.repoId === repoId)?.id; +function RepoRouteInner({ workspaceId, repoId }: { workspaceId: string; repoId: string }) { + const workspaceState = useInterest(interestManager, "workspace", { workspaceId }); + const activeTaskId = workspaceState.data?.taskSummaries.find((task) => task.repoId === repoId)?.id; if (!activeTaskId) { - return ; + return ; } - return ( - - ); + return ; } function RootLayout() { diff --git a/foundry/packages/frontend/src/components/dev-panel.tsx b/foundry/packages/frontend/src/components/dev-panel.tsx index 947331e..f0a176c 100644 --- a/foundry/packages/frontend/src/components/dev-panel.tsx +++ b/foundry/packages/frontend/src/components/dev-panel.tsx @@ -1,92 +1,44 @@ -import { memo, useEffect, useMemo, useState } from "react"; +import { memo, useCallback, useEffect, useMemo, useState } from "react"; import { useStyletron } from "baseui"; import { useFoundryTokens } from "../app/theme"; import { isMockFrontendClient } from "../lib/env"; -import { subscriptionManager } from "../lib/subscription"; -import type { - FoundryAppSnapshot, - FoundryOrganization, - TaskWorkspaceSnapshot, - WorkspaceSandboxSummary, - WorkspaceSessionSummary, - WorkspaceTaskStatus, 
-} from "@sandbox-agent/foundry-shared"; -import { useSubscription } from "@sandbox-agent/foundry-client"; -import type { DebugSubscriptionTopic } from "@sandbox-agent/foundry-client"; -import { describeTaskState } from "../features/tasks/status"; +import type { FoundryOrganization, TaskWorkbenchSnapshot, WorkbenchTask } from "@sandbox-agent/foundry-shared"; interface DevPanelProps { - organizationId: string; - snapshot: TaskWorkspaceSnapshot; + workspaceId: string; + snapshot: TaskWorkbenchSnapshot; organization?: FoundryOrganization | null; - focusedTask?: DevPanelFocusedTask | null; -} - -export interface DevPanelFocusedTask { - id: string; - repoId: string; - title: string | null; - status: WorkspaceTaskStatus; - branch?: string | null; - activeSandboxId?: string | null; - activeSessionId?: string | null; - sandboxes?: WorkspaceSandboxSummary[]; - sessions?: WorkspaceSessionSummary[]; } interface TopicInfo { label: string; key: string; - /** Parsed params portion of the cache key, or empty if none. */ - params: string; listenerCount: number; hasConnection: boolean; - status: "loading" | "connected" | "error"; lastRefresh: number | null; } -function topicLabel(topic: DebugSubscriptionTopic): string { - switch (topic.topicKey) { - case "app": - return "App"; - case "organization": - return "Organization"; - case "task": - return "Task"; - case "session": - return "Session"; - case "sandboxProcesses": - return "Sandbox"; - } -} - -/** Extract the params portion of a cache key (everything after the first `:`) */ -function topicParams(topic: DebugSubscriptionTopic): string { - const idx = topic.cacheKey.indexOf(":"); - return idx >= 0 ? 
topic.cacheKey.slice(idx + 1) : ""; -} - function timeAgo(ts: number | null): string { if (!ts) return "never"; const seconds = Math.floor((Date.now() - ts) / 1000); if (seconds < 5) return "now"; - if (seconds < 60) return `${seconds}s ago`; + if (seconds < 60) return `${seconds}s`; const minutes = Math.floor(seconds / 60); - if (minutes < 60) return `${minutes}m ago`; - return `${Math.floor(minutes / 60)}h ago`; + if (minutes < 60) return `${minutes}m`; + return `${Math.floor(minutes / 60)}h`; +} + +function taskStatusLabel(task: WorkbenchTask): string { + if (task.status === "archived") return "archived"; + const hasRunning = task.tabs?.some((tab) => tab.status === "running"); + if (hasRunning) return "running"; + return task.status ?? "idle"; } function statusColor(status: string, t: ReturnType): string { - if (status.startsWith("init_") || status.startsWith("archive_") || status.startsWith("kill_") || status.startsWith("pending_")) { - return t.statusWarning; - } switch (status) { - case "connected": case "running": - case "ready": return t.statusSuccess; - case "loading": - return t.statusWarning; case "archived": return t.textMuted; case "error": @@ -124,15 +76,7 @@ function installStatusColor(status: string, t: ReturnType { - return subscriptionManager.listDebugTopics().map((topic) => ({ - label: topicLabel(topic), - key: topic.cacheKey, - params: topicParams(topic), - listenerCount: topic.listenerCount, - hasConnection: topic.status === "connected", - status: topic.status, - lastRefresh: topic.lastRefreshAt, - })); - }, [now]); + const items: TopicInfo[] = []; - const appState = useSubscription(subscriptionManager, "app", {}); - const organizationState = useSubscription(subscriptionManager, "organization", { organizationId }); - const appSnapshot: FoundryAppSnapshot | null = appState.data ?? null; - const liveGithub = organizationState.data?.github ?? organization?.github ?? 
null; + // Workbench subscription topic + items.push({ + label: "Workbench", + key: `ws:${workspaceId}`, + listenerCount: 1, + hasConnection: true, + lastRefresh: now, + }); + + // Per-task tab subscriptions + for (const task of snapshot.tasks ?? []) { + if (task.status === "archived") continue; + for (const tab of task.tabs ?? []) { + items.push({ + label: `Tab/${task.title?.slice(0, 16) || task.id.slice(0, 8)}/${tab.sessionName.slice(0, 10)}`, + key: `${workspaceId}:${task.id}:${tab.id}`, + listenerCount: 1, + hasConnection: tab.status === "running", + lastRefresh: tab.status === "running" ? now : null, + }); + } + } + + return items; + }, [workspaceId, snapshot, now]); - const repos = snapshot.repos ?? []; const tasks = snapshot.tasks ?? []; - const prCount = tasks.filter((task) => task.pullRequest != null).length; - const focusedTaskStatus = focusedTask?.status ?? null; - const focusedTaskState = describeTaskState(focusedTaskStatus); - const lastWebhookAt = liveGithub?.lastWebhookAt ?? null; - const hasRecentWebhook = lastWebhookAt != null && now - lastWebhookAt < 5 * 60_000; - const totalOrgs = appSnapshot?.organizations.length ?? 0; - const authStatus = appSnapshot?.auth.status ?? "unknown"; + const repos = snapshot.repos ?? []; + const projects = snapshot.projects ?? []; const mono = css({ fontFamily: "ui-monospace, SFMono-Regular, 'SF Mono', Consolas, monospace", @@ -225,8 +175,8 @@ export const DevPanel = memo(function DevPanel({ organizationId, snapshot, organ {/* Body */}
- {/* Subscription Topics */} -
+ {/* Interest Topics */} +
{topics.map((topic) => (
{topic.label} - {topic.status} - {topic.params && ( - - {topic.params} - - )} + {topic.key.length > 24 ? `...${topic.key.slice(-20)}` : topic.key} {timeAgo(topic.lastRefresh)}
))} {topics.length === 0 && No active subscriptions}
- {/* App State */} -
-
-
- - Auth - {authStatus.replace(/_/g, " ")} -
-
- - -
-
app topic: {appState.status}
-
-
- {/* Snapshot Summary */} -
+
+ -
-
- {focusedTask ? ( -
-
- 0 && ( +
+ {tasks.slice(0, 10).map((task) => { + const status = taskStatusLabel(task); + return ( +
- - {focusedTask.title || focusedTask.id.slice(0, 12)} - - - {focusedTaskStatus ?? focusedTask.status} - -
-
{focusedTaskState.detail}
-
task: {focusedTask.id}
-
repo: {focusedTask.repoId}
-
branch: {focusedTask.branch ?? "-"}
-
- ) : ( - No task focused - )} -
- - {/* Session — only when a task is focused */} - {focusedTask && ( -
- {(focusedTask.sessions?.length ?? 0) > 0 ? ( - focusedTask.sessions!.map((session) => { - const isActive = session.id === focusedTask.activeSessionId; - const thinking = thinkingLabel(session.thinkingSinceMs, now); - return ( -
+ -
- - - {session.sessionName || session.id.slice(0, 12)} - {isActive ? " *" : ""} - - {session.status} -
-
- {session.agent} - {session.model} - {!session.created && not created} - {session.unread && unread} - {thinking && {thinking}} -
- {session.errorMessage && ( -
{session.errorMessage}
- )} - {session.sessionId &&
sid: {session.sessionId}
} -
- ); - }) - ) : ( - No sessions - )} -
- )} - - {/* Sandbox — only when a task is focused */} - {focusedTask && ( -
- {(focusedTask.sandboxes?.length ?? 0) > 0 ? ( - focusedTask.sandboxes!.map((sandbox) => { - const isActive = sandbox.sandboxId === focusedTask.activeSandboxId; - return ( -
-
- - - {sandbox.sandboxId.slice(0, 16)} - {isActive ? " *" : ""} - - {sandbox.sandboxProviderId} -
- {sandbox.cwd &&
cwd: {sandbox.cwd}
} -
- ); - }) - ) : ( - No sandboxes - )} + /> + + {task.title || task.id.slice(0, 12)} + + {status} + {task.tabs?.length ?? 0} tabs +
+ ); + })} )} {/* GitHub */} -
- {liveGithub ? ( + {organization && ( +
- App Install - - {liveGithub.installationStatus.replace(/_/g, " ")} + App + + {organization.github.installationStatus.replace(/_/g, " ")}
@@ -464,55 +276,29 @@ export const DevPanel = memo(function DevPanel({ organizationId, snapshot, organ width: "5px", height: "5px", borderRadius: "50%", - backgroundColor: syncStatusColor(liveGithub.syncStatus, t), + backgroundColor: syncStatusColor(organization.github.syncStatus, t), flexShrink: 0, })} /> Sync - {liveGithub.syncStatus} - {liveGithub.lastSyncAt != null && {timeAgo(liveGithub.lastSyncAt)}} -
-
- - Webhook - {lastWebhookAt != null ? ( - - {liveGithub.lastWebhookEvent} · {timeAgo(lastWebhookAt)} - - ) : ( - never received - )} + {organization.github.syncStatus}
- - - +
- {liveGithub.connectedAccount &&
@{liveGithub.connectedAccount}
} - {liveGithub.lastSyncLabel &&
last sync: {liveGithub.lastSyncLabel}
} - {liveGithub.syncPhase && ( -
- phase: {liveGithub.syncPhase.replace(/^syncing_/, "").replace(/_/g, " ")} ({liveGithub.processedRepositoryCount}/ - {liveGithub.totalRepositoryCount}) -
+ {organization.github.connectedAccount && ( +
@{organization.github.connectedAccount}
+ )} + {organization.github.lastSyncLabel && ( +
last sync: {organization.github.lastSyncLabel}
)}
- ) : ( - No organization data loaded - )} -
+
+ )} - {/* Organization */} -
-
{organizationId}
+ {/* Workspace */} +
+
{workspaceId}
{organization && (
org: {organization.settings.displayName} ({organization.kind}) diff --git a/foundry/packages/frontend/src/components/mock-layout.tsx b/foundry/packages/frontend/src/components/mock-layout.tsx index 4089e01..8bb3d5d 100644 --- a/foundry/packages/frontend/src/components/mock-layout.tsx +++ b/foundry/packages/frontend/src/components/mock-layout.tsx @@ -1,21 +1,16 @@ import { memo, useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState, type PointerEvent as ReactPointerEvent } from "react"; -import { useQuery } from "@tanstack/react-query"; import { useNavigate } from "@tanstack/react-router"; import { useStyletron } from "baseui"; import { - DEFAULT_WORKSPACE_MODEL_GROUPS, - DEFAULT_WORKSPACE_MODEL_ID, createErrorContext, - type FoundryOrganization, - type TaskWorkspaceSnapshot, - type WorkspaceModelGroup, - type WorkspaceSessionSummary, - type WorkspaceTaskDetail, - type WorkspaceTaskSummary, + type TaskWorkbenchSnapshot, + type WorkbenchSessionSummary, + type WorkbenchTaskDetail, + type WorkbenchTaskSummary, } from "@sandbox-agent/foundry-shared"; -import { useSubscription } from "@sandbox-agent/foundry-client"; +import { useInterest } from "@sandbox-agent/foundry-client"; -import { CircleAlert, PanelLeft, PanelRight } from "lucide-react"; +import { PanelLeft, PanelRight } from "lucide-react"; import { useFoundryTokens } from "../app/theme"; import { logger } from "../logging.js"; @@ -24,10 +19,10 @@ import { MessageList } from "./mock-layout/message-list"; import { PromptComposer } from "./mock-layout/prompt-composer"; import { RightSidebar } from "./mock-layout/right-sidebar"; import { Sidebar } from "./mock-layout/sidebar"; -import { SessionStrip } from "./mock-layout/session-strip"; +import { TabStrip } from "./mock-layout/tab-strip"; import { TerminalPane } from "./mock-layout/terminal-pane"; import { TranscriptHeader } from "./mock-layout/transcript-header"; -import { PROMPT_TEXTAREA_MAX_HEIGHT, PROMPT_TEXTAREA_MIN_HEIGHT, SPanel, ScrollBody, 
Shell, SpinnerDot } from "./mock-layout/ui"; +import { PROMPT_TEXTAREA_MAX_HEIGHT, PROMPT_TEXTAREA_MIN_HEIGHT, SPanel, ScrollBody, Shell } from "./mock-layout/ui"; import { DevPanel, useDevPanel } from "./dev-panel"; import { buildDisplayMessages, @@ -42,13 +37,12 @@ import { type Message, type ModelId, } from "./mock-layout/view-model"; -import { activeMockOrganization, activeMockUser, getMockOrganizationById, useMockAppClient, useMockAppSnapshot } from "../lib/mock-app"; +import { activeMockOrganization, useMockAppSnapshot } from "../lib/mock-app"; import { backendClient } from "../lib/backend"; -import { subscriptionManager } from "../lib/subscription"; -import { describeTaskState, isProvisioningTaskStatus } from "../features/tasks/status"; +import { interestManager } from "../lib/interest"; -function firstAgentSessionId(task: Task): string | null { - return task.sessions[0]?.id ?? null; +function firstAgentTabId(task: Task): string | null { + return task.tabs[0]?.id ?? null; } function sanitizeOpenDiffs(task: Task, paths: string[] | undefined): string[] { @@ -59,93 +53,31 @@ function sanitizeOpenDiffs(task: Task, paths: string[] | undefined): string[] { return paths.filter((path) => task.diffs[path] != null); } -function sanitizeLastAgentSessionId(task: Task, sessionId: string | null | undefined): string | null { - if (sessionId && task.sessions.some((tab) => tab.id === sessionId)) { - return sessionId; +function sanitizeLastAgentTabId(task: Task, tabId: string | null | undefined): string | null { + if (tabId && task.tabs.some((tab) => tab.id === tabId)) { + return tabId; } - return firstAgentSessionId(task); + return firstAgentTabId(task); } -function sanitizeActiveSessionId(task: Task, sessionId: string | null | undefined, openDiffs: string[], lastAgentSessionId: string | null): string | null { - if (sessionId) { - if (task.sessions.some((tab) => tab.id === sessionId)) { - return sessionId; +function sanitizeActiveTabId(task: Task, tabId: string | null | 
undefined, openDiffs: string[], lastAgentTabId: string | null): string | null { + if (tabId) { + if (task.tabs.some((tab) => tab.id === tabId)) { + return tabId; } - if (isDiffTab(sessionId) && openDiffs.includes(diffPath(sessionId))) { - return sessionId; + if (isDiffTab(tabId) && openDiffs.includes(diffPath(tabId))) { + return tabId; } } - return openDiffs.length > 0 ? diffTabId(openDiffs[openDiffs.length - 1]!) : lastAgentSessionId; + return openDiffs.length > 0 ? diffTabId(openDiffs[openDiffs.length - 1]!) : lastAgentTabId; } -type GithubStatusView = Pick< - FoundryOrganization["github"], - "connectedAccount" | "installationStatus" | "syncStatus" | "importedRepoCount" | "lastSyncLabel" -> & { - syncPhase?: string | null; - processedRepositoryCount?: number; - totalRepositoryCount?: number; -}; - -function githubInstallationWarningTitle(github: GithubStatusView): string { - return github.installationStatus === "install_required" ? "GitHub App not installed" : "GitHub App needs reconnection"; -} - -function githubInstallationWarningDetail(github: GithubStatusView): string { - const statusDetail = github.lastSyncLabel.trim(); - const requirementDetail = - github.installationStatus === "install_required" - ? "Webhooks are required for Foundry to function. Repo sync and PR updates will not work until the GitHub App is installed for this organization." - : "Webhook delivery is unavailable. Repo sync and PR updates will not work until the GitHub App is reconnected."; - return statusDetail ? `${requirementDetail} ${statusDetail}.` : requirementDetail; -} - -function GithubInstallationWarning({ - github, - css, - t, -}: { - github: GithubStatusView; - css: ReturnType[0]; - t: ReturnType; -}) { - if (github.installationStatus === "connected") { - return null; - } - - return ( -
- -
-
-          {githubInstallationWarningTitle(github)}
-          {githubInstallationWarningDetail(github)}
-      );
-}
-
-function toSessionModel(
-  summary: WorkspaceSessionSummary,
-  sessionDetail?: { draft: Task["sessions"][number]["draft"]; transcript: Task["sessions"][number]["transcript"] },
-): Task["sessions"][number] {
+function toLegacyTab(
+  summary: WorkbenchSessionSummary,
+  sessionDetail?: { draft: Task["tabs"][number]["draft"]; transcript: Task["tabs"][number]["transcript"] },
+): Task["tabs"][number] {
   return {
     id: summary.id,
     sessionId: summary.sessionId,
@@ -156,7 +88,6 @@ function toSessionModel(
     thinkingSinceMs: summary.thinkingSinceMs,
     unread: summary.unread,
     created: summary.created,
-    errorMessage: summary.errorMessage ?? null,
     draft: sessionDetail?.draft ?? {
       text: "",
       attachments: [],
@@ -166,10 +97,10 @@
   };
 }
 
-function toTaskModel(
-  summary: WorkspaceTaskSummary,
-  detail?: WorkspaceTaskDetail,
-  sessionCache?: Map,
+function toLegacyTask(
+  summary: WorkbenchTaskSummary,
+  detail?: WorkbenchTaskDetail,
+  sessionCache?: Map,
 ): Task {
   const sessions = detail?.sessionsSummary ?? summary.sessionsSummary;
   return {
@@ -181,107 +112,52 @@
     updatedAtMs: detail?.updatedAtMs ?? summary.updatedAtMs,
     branch: detail?.branch ?? summary.branch,
     pullRequest: detail?.pullRequest ?? summary.pullRequest,
-    activeSessionId: detail?.activeSessionId ?? summary.activeSessionId ?? null,
-    sessions: sessions.map((session) => toSessionModel(session, sessionCache?.get(session.id))),
+    tabs: sessions.map((session) => toLegacyTab(session, sessionCache?.get(session.id))),
     fileChanges: detail?.fileChanges ?? [],
     diffs: detail?.diffs ?? {},
     fileTree: detail?.fileTree ?? [],
     minutesUsed: detail?.minutesUsed ?? 0,
-    sandboxes: detail?.sandboxes ?? [],
-    activeSandboxId: detail?.activeSandboxId ?? null,
-    primaryUserLogin: detail?.primaryUserLogin ?? summary.primaryUserLogin ?? null,
-    primaryUserAvatarUrl: detail?.primaryUserAvatarUrl ?? null,
   };
 }
 
-function sessionStateMessage(tab: Task["sessions"][number] | null | undefined): string | null {
-  if (!tab) {
-    return null;
-  }
-  if (tab.status === "pending_provision") {
-    return "Provisioning sandbox...";
-  }
-  if (tab.status === "pending_session_create") {
-    return "Creating session...";
-  }
-  if (tab.status === "error") {
-    return tab.errorMessage ?? "Session failed to start.";
-  }
-  return null;
-}
-
-function groupRepositories(
-  repos: Array<{ id: string; label: string }>,
-  tasks: Task[],
-  openPullRequests?: Array<{
-    repoId: string;
-    repoFullName: string;
-    number: number;
-    title: string;
-    state: string;
-    url: string;
-    headRefName: string;
-    authorLogin: string | null;
-    isDraft: boolean;
-  }>,
-) {
+function groupProjects(repos: Array<{ id: string; label: string }>, tasks: Task[]) {
   return repos
     .map((repo) => ({
       id: repo.id,
       label: repo.label,
       updatedAtMs: tasks.filter((task) => task.repoId === repo.id).reduce((latest, task) => Math.max(latest, task.updatedAtMs), 0),
       tasks: tasks.filter((task) => task.repoId === repo.id).sort((left, right) => right.updatedAtMs - left.updatedAtMs),
-      pullRequests: (openPullRequests ?? []).filter((pr) => pr.repoId === repo.id),
     }))
-    .sort((a, b) => {
-      // Repos with tasks first, then repos with PRs, then alphabetical
-      const aHasActivity = a.tasks.length > 0 || a.pullRequests.length > 0;
-      const bHasActivity = b.tasks.length > 0 || b.pullRequests.length > 0;
-      if (aHasActivity && !bHasActivity) return -1;
-      if (!aHasActivity && bHasActivity) return 1;
-      if (a.updatedAtMs !== b.updatedAtMs) return b.updatedAtMs - a.updatedAtMs;
-      return a.label.localeCompare(b.label);
-    });
+    .filter((repo) => repo.tasks.length > 0);
 }
 
-interface WorkspaceActions {
-  createTask(input: {
-    repoId: string;
-    task: string;
-    title?: string;
-    branch?: string;
-    onBranch?: string;
-    model?: ModelId;
-  }): Promise<{ taskId: string; sessionId?: string }>;
-  markTaskUnread(input: { repoId: string; taskId: string }): Promise<void>;
-  renameTask(input: { repoId: string; taskId: string; value: string }): Promise<void>;
-  archiveTask(input: { repoId: string; taskId: string }): Promise<void>;
-  publishPr(input: { repoId: string; taskId: string }): Promise<void>;
-  revertFile(input: { repoId: string; taskId: string; path: string }): Promise<void>;
-  updateDraft(input: { repoId: string; taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise<void>;
-  sendMessage(input: { repoId: string; taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise<void>;
-  stopAgent(input: { repoId: string; taskId: string; sessionId: string }): Promise<void>;
-  selectSession(input: { repoId: string; taskId: string; sessionId: string }): Promise<void>;
-  setSessionUnread(input: { repoId: string; taskId: string; sessionId: string; unread: boolean }): Promise<void>;
-  renameSession(input: { repoId: string; taskId: string; sessionId: string; title: string }): Promise<void>;
-  closeSession(input: { repoId: string; taskId: string; sessionId: string }): Promise<void>;
-  addSession(input: { repoId: string; taskId: string; model?: string }): Promise<{ sessionId: string }>;
-  changeModel(input: { repoId: string; taskId: string; sessionId: string; model: ModelId }): Promise<void>;
-  changeOwner(input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string }): Promise<void>;
-  adminReloadGithubOrganization(): Promise<void>;
-  adminReloadGithubRepository(repoId: string): Promise<void>;
+interface WorkbenchActions {
+  createTask(input: { repoId: string; task: string; title?: string; branch?: string; model?: ModelId }): Promise<{ taskId: string; tabId?: string }>;
+  markTaskUnread(input: { taskId: string }): Promise<void>;
+  renameTask(input: { taskId: string; value: string }): Promise<void>;
+  renameBranch(input: { taskId: string; value: string }): Promise<void>;
+  archiveTask(input: { taskId: string }): Promise<void>;
+  publishPr(input: { taskId: string }): Promise<void>;
+  revertFile(input: { taskId: string; path: string }): Promise<void>;
+  updateDraft(input: { taskId: string; tabId: string; text: string; attachments: LineAttachment[] }): Promise<void>;
+  sendMessage(input: { taskId: string; tabId: string; text: string; attachments: LineAttachment[] }): Promise<void>;
+  stopAgent(input: { taskId: string; tabId: string }): Promise<void>;
+  setSessionUnread(input: { taskId: string; tabId: string; unread: boolean }): Promise<void>;
+  renameSession(input: { taskId: string; tabId: string; title: string }): Promise<void>;
+  closeTab(input: { taskId: string; tabId: string }): Promise<void>;
+  addTab(input: { taskId: string; model?: string }): Promise<{ tabId: string }>;
+  changeModel(input: { taskId: string; tabId: string; model: ModelId }): Promise<void>;
 }
 
 const TranscriptPanel = memo(function TranscriptPanel({
-  taskWorkspaceClient,
+  taskWorkbenchClient,
   task,
-  hasSandbox,
-  activeSessionId,
-  lastAgentSessionId,
+  activeTabId,
+  lastAgentTabId,
   openDiffs,
   onSyncRouteSession,
-  onSetActiveSessionId,
-  onSetLastAgentSessionId,
+  onSetActiveTabId,
+  onSetLastAgentTabId,
   onSetOpenDiffs,
   sidebarCollapsed,
   onToggleSidebar,
@@ -289,19 +165,16 @@ const TranscriptPanel = memo(function TranscriptPanel({
   onSidebarPeekEnd,
   rightSidebarCollapsed,
   onToggleRightSidebar,
-  selectedSessionHydrating = false,
-  modelGroups,
   onNavigateToUsage,
 }: {
-  taskWorkspaceClient: WorkspaceActions;
+  taskWorkbenchClient: WorkbenchActions;
   task: Task;
-  hasSandbox: boolean;
-  activeSessionId: string | null;
-  lastAgentSessionId: string | null;
+  activeTabId: string | null;
+  lastAgentTabId: string | null;
   openDiffs: string[];
   onSyncRouteSession: (taskId: string, sessionId: string | null, replace?: boolean) => void;
-  onSetActiveSessionId: (sessionId: string | null) => void;
-  onSetLastAgentSessionId: (sessionId: string | null) => void;
+  onSetActiveTabId: (tabId: string | null) => void;
+  onSetLastAgentTabId: (tabId: string | null) => void;
   onSetOpenDiffs: (paths: string[]) => void;
   sidebarCollapsed?: boolean;
   onToggleSidebar?: () => void;
@@ -309,82 +182,28 @@
   onSidebarPeekEnd?: () => void;
   rightSidebarCollapsed?: boolean;
   onToggleRightSidebar?: () => void;
-  selectedSessionHydrating?: boolean;
-  modelGroups: WorkspaceModelGroup[];
   onNavigateToUsage?: () => void;
 }) {
   const t = useFoundryTokens();
-  const appSnapshot = useMockAppSnapshot();
-  const appClient = useMockAppClient();
-  const currentUser = activeMockUser(appSnapshot);
-  const defaultModel = currentUser?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID;
-  const [editingField, setEditingField] = useState<"title" | null>(null);
+  const [defaultModel, setDefaultModel] = useState("claude-sonnet-4");
+  const [editingField, setEditingField] = useState<"title" | "branch" | null>(null);
   const [editValue, setEditValue] = useState("");
-  const [editingSessionId, setEditingSessionId] = useState(null);
+  const [editingSessionTabId, setEditingSessionTabId] = useState(null);
   const [editingSessionName, setEditingSessionName] = useState("");
-  const [pendingHistoryTarget, setPendingHistoryTarget] = useState<{ messageId: string; sessionId: string } | null>(null);
+  const [pendingHistoryTarget, setPendingHistoryTarget] = useState<{ messageId: string; tabId: string } | null>(null);
   const [copiedMessageId, setCopiedMessageId] = useState(null);
   const [timerNowMs, setTimerNowMs] = useState(() => Date.now());
-  const [localDraft, setLocalDraft] = useState("");
-  const [localAttachments, setLocalAttachments] = useState([]);
-  const [pendingMessage, setPendingMessage] = useState<{ text: string; sessionId: string; sentAt: number } | null>(null);
-  const lastEditTimeRef = useRef(0);
-  const throttleTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
-  const pendingDraftRef = useRef<{ text: string; attachments: LineAttachment[] } | null>(null);
   const scrollRef = useRef(null);
   const textareaRef = useRef(null);
   const messageRefs = useRef(new Map());
-  const activeDiff = activeSessionId && isDiffTab(activeSessionId) ? diffPath(activeSessionId) : null;
-  const activeAgentSession = activeDiff ? null : (task.sessions.find((candidate) => candidate.id === activeSessionId) ?? task.sessions[0] ?? null);
-  const promptSession = task.sessions.find((candidate) => candidate.id === lastAgentSessionId) ?? task.sessions[0] ?? null;
+  const activeDiff = activeTabId && isDiffTab(activeTabId) ? diffPath(activeTabId) : null;
+  const activeAgentTab = activeDiff ? null : (task.tabs.find((candidate) => candidate.id === activeTabId) ?? task.tabs[0] ?? null);
+  const promptTab = task.tabs.find((candidate) => candidate.id === lastAgentTabId) ?? task.tabs[0] ?? null;
   const isTerminal = task.status === "archived";
-  const historyEvents = useMemo(() => buildHistoryEvents(task.sessions), [task.sessions]);
-  const activeMessages = useMemo(() => buildDisplayMessages(activeAgentSession), [activeAgentSession]);
-  const taskState = describeTaskState(task.status);
-  const taskProvisioning = isProvisioningTaskStatus(task.status);
-  const taskProvisioningMessage = taskState.detail;
-  const activeSessionMessage = sessionStateMessage(activeAgentSession);
-  const showPendingSessionState =
-    !activeDiff &&
-    !!activeAgentSession &&
-    (activeAgentSession.status === "pending_provision" || activeAgentSession.status === "pending_session_create" || activeAgentSession.status === "error") &&
-    activeMessages.length === 0;
-  const serverDraft = promptSession?.draft.text ?? "";
-  const serverAttachments = promptSession?.draft.attachments;
-  const serverAttachmentsJson = JSON.stringify(serverAttachments ?? []);
-
-  // Sync server → local only when user hasn't typed recently (3s cooldown)
-  const DRAFT_SYNC_COOLDOWN_MS = 3_000;
-  useEffect(() => {
-    if (Date.now() - lastEditTimeRef.current > DRAFT_SYNC_COOLDOWN_MS) {
-      setLocalDraft(serverDraft);
-      setLocalAttachments(serverAttachments ?? []);
-    }
-  }, [serverDraft, serverAttachmentsJson]);
-
-  // Reset local draft immediately on session/task switch
-  useEffect(() => {
-    lastEditTimeRef.current = 0;
-    setLocalDraft(promptSession?.draft.text ?? "");
-    setLocalAttachments(promptSession?.draft.attachments ?? []);
-  }, [promptSession?.id, task.id]);
-
-  // Clear pending message once the real transcript contains a client message newer than when we sent
-  const pendingMessageClientCount = useRef(0);
-  useEffect(() => {
-    if (!pendingMessage) return;
-
-    const targetSession = task.sessions.find((s) => s.id === pendingMessage.sessionId);
-    if (!targetSession) return;
-
-    const clientEventCount = targetSession.transcript.filter((event) => event.sender === "client").length;
-    if (clientEventCount > pendingMessageClientCount.current) {
-      setPendingMessage(null);
-    }
-  }, [task.sessions, pendingMessage]);
-
-  const draft = localDraft;
-  const attachments = localAttachments;
+  const historyEvents = useMemo(() => buildHistoryEvents(task.tabs), [task.tabs]);
+  const activeMessages = useMemo(() => buildDisplayMessages(activeAgentTab), [activeAgentTab]);
+  const draft = promptTab?.draft.text ?? "";
+  const attachments = promptTab?.draft.attachments ?? [];
 
   useEffect(() => {
     if (scrollRef.current) {
@@ -394,10 +213,10 @@
   useEffect(() => {
     textareaRef.current?.focus();
-  }, [activeSessionId, task.id]);
+  }, [activeTabId, task.id]);
 
   useEffect(() => {
-    setEditingSessionId(null);
+    setEditingSessionTabId(null);
     setEditingSessionName("");
   }, [task.id]);
@@ -411,7 +230,21 @@
     const nextHeight = Math.min(textarea.scrollHeight, PROMPT_TEXTAREA_MAX_HEIGHT);
     textarea.style.height = `${Math.max(PROMPT_TEXTAREA_MIN_HEIGHT, nextHeight)}px`;
     textarea.style.overflowY = textarea.scrollHeight > PROMPT_TEXTAREA_MAX_HEIGHT ? "auto" : "hidden";
-  }, [draft, activeSessionId, task.id]);
+  }, [draft, activeTabId, task.id]);
+
+  useEffect(() => {
+    if (!pendingHistoryTarget || activeTabId !== pendingHistoryTarget.tabId) {
+      return;
+    }
+
+    const targetNode = messageRefs.current.get(pendingHistoryTarget.messageId);
+    if (!targetNode) {
+      return;
+    }
+
+    targetNode.scrollIntoView({ behavior: "smooth", block: "center" });
+    setPendingHistoryTarget(null);
+  }, [activeMessages.length, activeTabId, pendingHistoryTarget]);
 
   useEffect(() => {
     if (!copiedMessageId) {
@@ -426,7 +259,7 @@
   }, [copiedMessageId]);
 
   useEffect(() => {
-    if (!activeAgentSession || activeAgentSession.status !== "running" || activeAgentSession.thinkingSinceMs === null) {
+    if (!activeAgentTab || activeAgentTab.status !== "running" || activeAgentTab.thinkingSinceMs === null) {
       return;
     }
@@ -436,22 +269,21 @@
     }, 1_000);
 
     return () => window.clearInterval(timer);
-  }, [activeAgentSession?.id, activeAgentSession?.status, activeAgentSession?.thinkingSinceMs]);
+  }, [activeAgentTab?.id, activeAgentTab?.status, activeAgentTab?.thinkingSinceMs]);
 
   useEffect(() => {
-    if (!activeAgentSession?.unread) {
+    if (!activeAgentTab?.unread) {
       return;
     }
 
-    void taskWorkspaceClient.setSessionUnread({
-      repoId: task.repoId,
+    void taskWorkbenchClient.setSessionUnread({
       taskId: task.id,
-      sessionId: activeAgentSession.id,
+      tabId: activeAgentTab.id,
       unread: false,
     });
-  }, [activeAgentSession?.id, activeAgentSession?.unread, task.id]);
+  }, [activeAgentTab?.id, activeAgentTab?.unread, task.id]);
 
-  const startEditingField = useCallback((field: "title", value: string) => {
+  const startEditingField = useCallback((field: "title" | "branch", value: string) => {
     setEditingField(field);
     setEditValue(value);
   }, []);
@@ -461,236 +293,186 @@
   }, []);
 
   const commitEditingField = useCallback(
-    (field: "title") => {
+    (field: "title" | "branch") => {
       const value = editValue.trim();
       if (!value) {
         setEditingField(null);
         return;
       }
-      void taskWorkspaceClient.renameTask({ repoId: task.repoId, taskId: task.id, value });
+      if (field === "title") {
+        void taskWorkbenchClient.renameTask({ taskId: task.id, value });
+      } else {
+        void taskWorkbenchClient.renameBranch({ taskId: task.id, value });
+      }
       setEditingField(null);
     },
     [editValue, task.id],
   );
 
-  const DRAFT_THROTTLE_MS = 500;
-
-  const flushDraft = useCallback(
-    (text: string, nextAttachments: LineAttachment[], sessionId: string) => {
-      void taskWorkspaceClient.updateDraft({
-        repoId: task.repoId,
-        taskId: task.id,
-        sessionId,
-        text,
-        attachments: nextAttachments,
-      });
-    },
-    [task.id],
-  );
-
-  // Clean up throttle timer on unmount
-  useEffect(() => {
-    return () => {
-      if (throttleTimerRef.current) {
-        clearTimeout(throttleTimerRef.current);
-      }
-    };
-  }, []);
-
   const updateDraft = useCallback(
     (nextText: string, nextAttachments: LineAttachment[]) => {
-      if (!promptSession) {
+      if (!promptTab) {
         return;
       }
 
-      // Update local state immediately for responsive typing
-      lastEditTimeRef.current = Date.now();
-      setLocalDraft(nextText);
-      setLocalAttachments(nextAttachments);
-
-      // Throttle the network call
-      pendingDraftRef.current = { text: nextText, attachments: nextAttachments };
-      if (!throttleTimerRef.current) {
-        throttleTimerRef.current = setTimeout(() => {
-          throttleTimerRef.current = null;
-          if (pendingDraftRef.current) {
-            flushDraft(pendingDraftRef.current.text, pendingDraftRef.current.attachments, promptSession.id);
-            pendingDraftRef.current = null;
-          }
-        }, DRAFT_THROTTLE_MS);
-      }
+      void taskWorkbenchClient.updateDraft({
+        taskId: task.id,
+        tabId: promptTab.id,
+        text: nextText,
+        attachments: nextAttachments,
+      });
     },
-    [promptSession, flushDraft],
+    [task.id, promptTab],
   );
 
   const sendMessage = useCallback(() => {
     const text = draft.trim();
-    if (!text || !promptSession) {
+    if (!text || !promptTab) {
       return;
     }
 
-    // Clear draft and show optimistic message immediately (don't wait for server round-trip)
-    setLocalDraft("");
-    setLocalAttachments([]);
-    lastEditTimeRef.current = Date.now();
-    // Snapshot current client message count so we can detect when the server adds ours
-    pendingMessageClientCount.current = promptSession.transcript.filter((event) => event.sender === "client").length;
-    setPendingMessage({ text, sessionId: promptSession.id, sentAt: Date.now() });
-
-    onSetActiveSessionId(promptSession.id);
-    onSetLastAgentSessionId(promptSession.id);
-    void taskWorkspaceClient.sendMessage({
-      repoId: task.repoId,
+    onSetActiveTabId(promptTab.id);
+    onSetLastAgentTabId(promptTab.id);
+    void taskWorkbenchClient.sendMessage({
       taskId: task.id,
-      sessionId: promptSession.id,
+      tabId: promptTab.id,
       text,
       attachments,
     });
-  }, [attachments, draft, task.id, onSetActiveSessionId, onSetLastAgentSessionId, promptSession]);
+  }, [attachments, draft, task.id, onSetActiveTabId, onSetLastAgentTabId, promptTab]);
 
   const stopAgent = useCallback(() => {
-    if (!promptSession) {
+    if (!promptTab) {
       return;
     }
 
-    void taskWorkspaceClient.stopAgent({
-      repoId: task.repoId,
+    void taskWorkbenchClient.stopAgent({
       taskId: task.id,
-      sessionId: promptSession.id,
+      tabId: promptTab.id,
     });
-  }, [task.id, promptSession]);
+  }, [task.id, promptTab]);
 
-  const switchSession = useCallback(
-    (sessionId: string) => {
-      onSetActiveSessionId(sessionId);
+  const switchTab = useCallback(
+    (tabId: string) => {
+      onSetActiveTabId(tabId);
 
-      if (!isDiffTab(sessionId)) {
-        onSetLastAgentSessionId(sessionId);
-        void taskWorkspaceClient.selectSession({
-          repoId: task.repoId,
-          taskId: task.id,
-          sessionId,
-        });
-        const session = task.sessions.find((candidate) => candidate.id === sessionId);
-        if (session?.unread) {
-          void taskWorkspaceClient.setSessionUnread({
-            repoId: task.repoId,
+      if (!isDiffTab(tabId)) {
+        onSetLastAgentTabId(tabId);
+        const tab = task.tabs.find((candidate) => candidate.id === tabId);
+        if (tab?.unread) {
+          void taskWorkbenchClient.setSessionUnread({
             taskId: task.id,
-            sessionId,
+            tabId,
             unread: false,
           });
         }
-        onSyncRouteSession(task.id, sessionId);
+        onSyncRouteSession(task.id, tabId);
       }
     },
-    [task.id, task.repoId, task.sessions, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
+    [task.id, task.tabs, onSetActiveTabId, onSetLastAgentTabId, onSyncRouteSession],
   );
 
-  const setSessionUnread = useCallback(
-    (sessionId: string, unread: boolean) => {
-      void taskWorkspaceClient.setSessionUnread({ repoId: task.repoId, taskId: task.id, sessionId, unread });
+  const setTabUnread = useCallback(
+    (tabId: string, unread: boolean) => {
+      void taskWorkbenchClient.setSessionUnread({ taskId: task.id, tabId, unread });
     },
-    [task.id, task.repoId],
+    [task.id],
   );
 
-  const startRenamingSession = useCallback(
-    (sessionId: string) => {
-      const targetSession = task.sessions.find((candidate) => candidate.id === sessionId);
-      if (!targetSession) {
-        throw new Error(`Unable to rename missing session ${sessionId}`);
+  const startRenamingTab = useCallback(
+    (tabId: string) => {
+      const targetTab = task.tabs.find((candidate) => candidate.id === tabId);
+      if (!targetTab) {
+        throw new Error(`Unable to rename missing session tab ${tabId}`);
       }
 
-      setEditingSessionId(sessionId);
-      setEditingSessionName(targetSession.sessionName);
+      setEditingSessionTabId(tabId);
+      setEditingSessionName(targetTab.sessionName);
     },
-    [task.sessions],
+    [task.tabs],
   );
 
-  const cancelSessionRename = useCallback(() => {
-    setEditingSessionId(null);
+  const cancelTabRename = useCallback(() => {
    setEditingSessionTabId(null);
     setEditingSessionName("");
   }, []);
 
-  const commitSessionRename = useCallback(() => {
-    if (!editingSessionId) {
+  const commitTabRename = useCallback(() => {
+    if (!editingSessionTabId) {
       return;
     }
 
     const trimmedName = editingSessionName.trim();
     if (!trimmedName) {
-      cancelSessionRename();
+      cancelTabRename();
       return;
     }
 
-    void taskWorkspaceClient.renameSession({
-      repoId: task.repoId,
+    void taskWorkbenchClient.renameSession({
       taskId: task.id,
-      sessionId: editingSessionId,
+      tabId: editingSessionTabId,
       title: trimmedName,
     });
-    cancelSessionRename();
-  }, [cancelSessionRename, editingSessionName, editingSessionId, task.id]);
+    cancelTabRename();
+  }, [cancelTabRename, editingSessionName, editingSessionTabId, task.id]);
 
-  const closeSession = useCallback(
-    (sessionId: string) => {
-      const remainingSessions = task.sessions.filter((candidate) => candidate.id !== sessionId);
-      const nextSessionId = remainingSessions[0]?.id ?? null;
+  const closeTab = useCallback(
+    (tabId: string) => {
+      const remainingTabs = task.tabs.filter((candidate) => candidate.id !== tabId);
+      const nextTabId = remainingTabs[0]?.id ?? null;
 
-      if (activeSessionId === sessionId) {
-        onSetActiveSessionId(nextSessionId);
+      if (activeTabId === tabId) {
+        onSetActiveTabId(nextTabId);
       }
-      if (lastAgentSessionId === sessionId) {
-        onSetLastAgentSessionId(nextSessionId);
+      if (lastAgentTabId === tabId) {
+        onSetLastAgentTabId(nextTabId);
       }
-      onSyncRouteSession(task.id, nextSessionId);
-      void taskWorkspaceClient.closeSession({ repoId: task.repoId, taskId: task.id, sessionId });
+      onSyncRouteSession(task.id, nextTabId);
+      void taskWorkbenchClient.closeTab({ taskId: task.id, tabId });
     },
-    [activeSessionId, task.id, task.repoId, task.sessions, lastAgentSessionId, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
+    [activeTabId, task.id, task.tabs, lastAgentTabId, onSetActiveTabId, onSetLastAgentTabId, onSyncRouteSession],
   );
 
   const closeDiffTab = useCallback(
     (path: string) => {
       const nextOpenDiffs = openDiffs.filter((candidate) => candidate !== path);
       onSetOpenDiffs(nextOpenDiffs);
-      if (activeSessionId === diffTabId(path)) {
-        onSetActiveSessionId(
-          nextOpenDiffs.length > 0 ? diffTabId(nextOpenDiffs[nextOpenDiffs.length - 1]!) : (lastAgentSessionId ?? firstAgentSessionId(task)),
-        );
+      if (activeTabId === diffTabId(path)) {
+        onSetActiveTabId(nextOpenDiffs.length > 0 ? diffTabId(nextOpenDiffs[nextOpenDiffs.length - 1]!) : (lastAgentTabId ?? firstAgentTabId(task)));
       }
     },
-    [activeSessionId, task, lastAgentSessionId, onSetActiveSessionId, onSetOpenDiffs, openDiffs],
+    [activeTabId, task, lastAgentTabId, onSetActiveTabId, onSetOpenDiffs, openDiffs],
   );
 
-  const addSession = useCallback(() => {
+  const addTab = useCallback(() => {
     void (async () => {
-      const { sessionId } = await taskWorkspaceClient.addSession({ repoId: task.repoId, taskId: task.id });
-      onSetLastAgentSessionId(sessionId);
-      onSetActiveSessionId(sessionId);
-      onSyncRouteSession(task.id, sessionId);
+      const { tabId } = await taskWorkbenchClient.addTab({ taskId: task.id });
+      onSetLastAgentTabId(tabId);
+      onSetActiveTabId(tabId);
+      onSyncRouteSession(task.id, tabId);
     })();
-  }, [task.id, task.repoId, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession]);
+  }, [task.id, onSetActiveTabId, onSetLastAgentTabId, onSyncRouteSession]);
 
   const changeModel = useCallback(
     (model: ModelId) => {
-      if (!promptSession) {
-        throw new Error(`Unable to change model for task ${task.id} without an active prompt session`);
+      if (!promptTab) {
+        throw new Error(`Unable to change model for task ${task.id} without an active prompt tab`);
       }
 
-      void taskWorkspaceClient.changeModel({
-        repoId: task.repoId,
+      void taskWorkbenchClient.changeModel({
         taskId: task.id,
-        sessionId: promptSession.id,
+        tabId: promptTab.id,
         model,
       });
     },
-    [task.id, promptSession],
+    [task.id, promptTab],
   );
 
   const addAttachment = useCallback(
     (filePath: string, lineNumber: number, lineContent: string) => {
-      if (!promptSession) {
+      if (!promptTab) {
         return;
       }
@@ -701,7 +483,7 @@
 
       updateDraft(draft, [...attachments, nextAttachment]);
     },
-    [attachments, draft, promptSession, updateDraft],
+    [attachments, draft, promptTab, updateDraft],
   );
   const removeAttachment = useCallback(
@@ -716,13 +498,20 @@
   const jumpToHistoryEvent = useCallback(
     (event: HistoryEvent) => {
-      setPendingHistoryTarget({ messageId: event.messageId, sessionId: event.sessionId });
+      setPendingHistoryTarget({ messageId: event.messageId, tabId: event.tabId });
 
-      if (activeSessionId !== event.sessionId) {
-        switchSession(event.sessionId);
+      if (activeTabId !== event.tabId) {
+        switchTab(event.tabId);
+        return;
+      }
+
+      const targetNode = messageRefs.current.get(event.messageId);
+      if (targetNode) {
+        targetNode.scrollIntoView({ behavior: "smooth", block: "center" });
+        setPendingHistoryTarget(null);
       }
     },
-    [activeSessionId, switchSession],
+    [activeTabId, switchTab],
   );
 
   const copyMessage = useCallback(async (message: Message) => {
@@ -744,29 +533,25 @@
     }
   }, []);
 
-  const isOptimisticThinking = pendingMessage !== null && activeAgentSession?.id === pendingMessage.sessionId;
   const thinkingTimerLabel =
-    activeAgentSession?.status === "running" && activeAgentSession.thinkingSinceMs !== null
-      ? formatThinkingDuration(timerNowMs - activeAgentSession.thinkingSinceMs)
-      : isOptimisticThinking
-        ? formatThinkingDuration(timerNowMs - pendingMessage.sentAt)
-        : null;
+    activeAgentTab?.status === "running" && activeAgentTab.thinkingSinceMs !== null
+      ? formatThinkingDuration(timerNowMs - activeAgentTab.thinkingSinceMs)
+      : null;
 
   return (
-      {
-        if (activeAgentSession) {
-          setSessionUnread(activeAgentSession.id, unread);
+      onSetActiveTabUnread={(unread) => {
+        if (activeAgentTab) {
+          setTabUnread(activeAgentTab.id, unread);
         }
       }}
       sidebarCollapsed={sidebarCollapsed}
@@ -792,21 +577,21 @@
         border: `1px solid ${t.borderDefault}`,
       }}
     >
       {activeDiff ? (
@@ -816,7 +601,7 @@
           diff={task.diffs[activeDiff]}
           onAddAttachment={addAttachment}
         />
-      ) : task.sessions.length === 0 ? (
+      ) : task.tabs.length === 0 ? (
-          {taskProvisioning ? (
-            <>
-              {taskState.title}
-              {taskProvisioningMessage}
-            </>
-          ) : (
-            <>
-              Create the first session
-              Sessions are where you chat with the agent. Start one now to send the first prompt on this task.
-            </>
-          )}
-      ) : selectedSessionHydrating ? (
-          Loading session
-          Fetching the latest transcript for this session.
-      ) : showPendingSessionState ? (
-          {activeAgentSession?.status === "error" ? null : }
-          {activeAgentSession?.status === "pending_provision"
-            ? "Provisioning sandbox"
-            : activeAgentSession?.status === "pending_session_create"
-              ? "Creating session"
-              : "Session unavailable"}
-          {activeSessionMessage}
-          {activeAgentSession?.status === "error" ? (
-          ) : null}
+          Create the first session
+          Sessions are where you chat with the agent. Start one now to send the first prompt on this task.
       ) : (
           setPendingHistoryTarget(null)}
           copiedMessageId={copiedMessageId}
           onCopyMessage={(message) => {
             void copyMessage(message);
           }}
           thinkingTimerLabel={thinkingTimerLabel}
-          pendingMessage={
-            pendingMessage && activeAgentSession?.id === pendingMessage.sessionId ? { text: pendingMessage.text, sentAt: pendingMessage.sentAt } : null
-          }
         />
       )}
-      {!isTerminal && promptSession && (promptSession.status === "ready" || promptSession.status === "running" || promptSession.status === "idle") ? (
+      {!isTerminal && promptTab ? (
           updateDraft(value, attachments)}
           onSend={sendMessage}
           onStop={stopAgent}
           onRemoveAttachment={removeAttachment}
           onChangeModel={changeModel}
-          onSetDefaultModel={(model) => {
-            void appClient.setDefaultModel(model);
-          }}
+          onSetDefaultModel={setDefaultModel}
         />
       ) : null}
@@ -1066,26 +754,22 @@
 const DEFAULT_TERMINAL_HEIGHT = 320;
 const TERMINAL_HEIGHT_STORAGE_KEY = "foundry:terminal-height";
 
 const RightRail = memo(function RightRail({
-  organizationId,
+  workspaceId,
   task,
-  activeSessionId,
+  activeTabId,
   onOpenDiff,
   onArchive,
   onRevertFile,
   onPublishPr,
-  onChangeOwner,
-  members,
   onToggleSidebar,
 }: {
-  organizationId: string;
+  workspaceId: string;
   task: Task;
-  activeSessionId: string | null;
+  activeTabId: string | null;
   onOpenDiff: (path: string) => void;
   onArchive: () => void;
   onRevertFile: (path: string) => void;
   onPublishPr: () => void;
-  onChangeOwner: (member: { id: string; name: string; email: string }) => void;
-  members: Array<{ id: string; name: string; email: string }>;
   onToggleSidebar?: () => void;
 }) {
   const [css] = useStyletron();
@@ -1173,13 +857,11 @@
         >
@@ -1197,7 +879,7 @@
           })}
         >
           {
@@ -1219,12 +901,12 @@
 });
 
 interface MockLayoutProps {
-  organizationId: string;
+  workspaceId: string;
   selectedTaskId?: string | null;
   selectedSessionId?: string | null;
 }
 
-function MockOrganizationOrgBar() {
+function MockWorkspaceOrgBar() {
   const navigate = useNavigate();
   const snapshot = useMockAppSnapshot();
   const organization = activeMockOrganization(snapshot);
@@ -1300,104 +982,68 @@
   );
 }
 
-export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }: MockLayoutProps) {
+export function MockLayout({ workspaceId, selectedTaskId, selectedSessionId }: MockLayoutProps) {
   const [css] = useStyletron();
   const t = useFoundryTokens();
   const navigate = useNavigate();
 
-  const taskWorkspaceClient = useMemo(
+  const taskWorkbenchClient = useMemo(
     () => ({
-      createTask: (input) => backendClient.createWorkspaceTask(organizationId, input),
-      markTaskUnread: (input) => backendClient.markWorkspaceUnread(organizationId, input),
-      renameTask: (input) => backendClient.renameWorkspaceTask(organizationId, input),
-      archiveTask: async (input) => backendClient.runAction(organizationId, input.repoId, input.taskId, "archive"),
-      publishPr: (input) => backendClient.publishWorkspacePr(organizationId, input),
-      revertFile: (input) => backendClient.revertWorkspaceFile(organizationId, input),
-      updateDraft: (input) => backendClient.updateWorkspaceDraft(organizationId, input),
-      sendMessage: (input) => backendClient.sendWorkspaceMessage(organizationId, input),
-      stopAgent: (input) => backendClient.stopWorkspaceSession(organizationId, input),
-      selectSession: (input) => backendClient.selectWorkspaceSession(organizationId, input),
-      setSessionUnread: (input) => backendClient.setWorkspaceSessionUnread(organizationId, input),
-      renameSession: (input) => backendClient.renameWorkspaceSession(organizationId, input),
-      closeSession: (input) => backendClient.closeWorkspaceSession(organizationId, input),
-      addSession: (input) => backendClient.createWorkspaceSession(organizationId, input),
-      changeModel: (input) => backendClient.changeWorkspaceModel(organizationId, input),
-      changeOwner: (input) => backendClient.changeWorkspaceTaskOwner(organizationId, input),
-      adminReloadGithubOrganization: () => backendClient.adminReloadGithubOrganization(organizationId),
-      adminReloadGithubRepository: (repoId) => backendClient.adminReloadGithubRepository(organizationId, repoId),
+      createTask: (input) => backendClient.createWorkbenchTask(workspaceId, input),
+      markTaskUnread: (input) => backendClient.markWorkbenchUnread(workspaceId, input),
+      renameTask: (input) => backendClient.renameWorkbenchTask(workspaceId, input),
+      renameBranch: (input) => backendClient.renameWorkbenchBranch(workspaceId, input),
+      archiveTask: async (input) => backendClient.runAction(workspaceId, input.taskId, "archive"),
+      publishPr: (input) => backendClient.publishWorkbenchPr(workspaceId, input),
+      revertFile: (input) => backendClient.revertWorkbenchFile(workspaceId, input),
+      updateDraft: (input) => backendClient.updateWorkbenchDraft(workspaceId, input),
+      sendMessage: (input) => backendClient.sendWorkbenchMessage(workspaceId, input),
+      stopAgent: (input) => backendClient.stopWorkbenchSession(workspaceId, input),
+      setSessionUnread: (input) => backendClient.setWorkbenchSessionUnread(workspaceId, input),
+      renameSession: (input) => backendClient.renameWorkbenchSession(workspaceId, input),
+      closeTab: (input) => backendClient.closeWorkbenchSession(workspaceId, input),
+      addTab: (input) => backendClient.createWorkbenchSession(workspaceId, input),
+      changeModel: (input) => backendClient.changeWorkbenchModel(workspaceId, input),
     }),
-    [organizationId],
+    [workspaceId],
   );
 
-  const organizationState = useSubscription(subscriptionManager, "organization", { organizationId });
-  const organizationReposData = organizationState.data?.repos;
-  const taskSummariesData = organizationState.data?.taskSummaries;
-  const openPullRequestsData = organizationState.data?.openPullRequests;
-  const organizationRepos = organizationReposData ?? [];
-  const taskSummaries = taskSummariesData ?? [];
+  const workspaceState = useInterest(interestManager, "workspace", { workspaceId });
+  const workspaceRepos = workspaceState.data?.repos ?? [];
+  const taskSummaries = workspaceState.data?.taskSummaries ?? [];
 
   const selectedTaskSummary = useMemo(
     () => taskSummaries.find((task) => task.id === selectedTaskId) ?? taskSummaries[0] ?? null,
-    [selectedTaskId, taskSummariesData],
+    [selectedTaskId, taskSummaries],
   );
 
-  const taskState = useSubscription(
-    subscriptionManager,
+  const taskState = useInterest(
+    interestManager,
     "task",
     selectedTaskSummary
       ? {
-          organizationId,
+          workspaceId,
           repoId: selectedTaskSummary.repoId,
           taskId: selectedTaskSummary.id,
         }
      : null,
   );
-  const sessionState = useSubscription(
-    subscriptionManager,
+  const sessionState = useInterest(
+    interestManager,
     "session",
     selectedTaskSummary && selectedSessionId
       ? {
-          organizationId,
+          workspaceId,
           repoId: selectedTaskSummary.repoId,
           taskId: selectedTaskSummary.id,
           sessionId: selectedSessionId,
         }
      : null,
   );
-  const activeSandbox = useMemo(() => {
-    if (!taskState.data?.activeSandboxId) return null;
-    return taskState.data.sandboxes?.find((s) => s.sandboxId === taskState.data!.activeSandboxId) ?? null;
-  }, [taskState.data?.activeSandboxId, taskState.data?.sandboxes]);
-  const sandboxState = useSubscription(
-    subscriptionManager,
-    "sandboxProcesses",
-    activeSandbox
-      ? {
-          organizationId,
-          sandboxProviderId: activeSandbox.sandboxProviderId,
-          sandboxId: activeSandbox.sandboxId,
-        }
-      : null,
-  );
-  const hasSandbox = Boolean(activeSandbox) && sandboxState.status !== "error";
-  const modelGroupsQuery = useQuery({
-    queryKey: ["mock-layout", "workspace-model-groups", organizationId, activeSandbox?.sandboxProviderId ?? "", activeSandbox?.sandboxId ?? ""],
-    enabled: Boolean(activeSandbox?.sandboxId),
-    staleTime: 30_000,
-    refetchOnWindowFocus: false,
-    queryFn: async () => {
-      if (!activeSandbox) {
-        throw new Error("Cannot load workspace model groups without an active sandbox.");
-      }
-
-      return await backendClient.getSandboxWorkspaceModelGroups(organizationId, activeSandbox.sandboxProviderId, activeSandbox.sandboxId);
-    },
-  });
-  const modelGroups = modelGroupsQuery.data && modelGroupsQuery.data.length > 0 ? modelGroupsQuery.data : DEFAULT_WORKSPACE_MODEL_GROUPS;
 
   const tasks = useMemo(() => {
-    const sessionCache = new Map();
+    const sessionCache = new Map();
     if (selectedTaskSummary && taskState.data) {
       for (const session of taskState.data.sessionsSummary) {
         const cached =
           (selectedSessionId && session.id === selectedSessionId ? sessionState.data : undefined) ??
-          subscriptionManager.getSnapshot("session", {
-            organizationId,
+          interestManager.getSnapshot("session", {
+            workspaceId,
             repoId: selectedTaskSummary.repoId,
             taskId: selectedTaskSummary.id,
             sessionId: session.id,
@@ -1411,34 +1057,30 @@
       }
     }
 
-    const hydratedTasks = taskSummaries.map((summary) =>
-      summary.id === selectedTaskSummary?.id ? toTaskModel(summary, taskState.data, sessionCache) : toTaskModel(summary),
+    return taskSummaries.map((summary) =>
+      summary.id === selectedTaskSummary?.id ? toLegacyTask(summary, taskState.data, sessionCache) : toLegacyTask(summary),
     );
-    return hydratedTasks.sort((left, right) => right.updatedAtMs - left.updatedAtMs);
-  }, [selectedTaskSummary, selectedSessionId, sessionState.data, taskState.data, taskSummariesData, organizationId]);
-  const openPullRequests = openPullRequestsData ??
[]; - const rawRepositories = useMemo(() => groupRepositories(organizationRepos, tasks, openPullRequests), [tasks, organizationReposData, openPullRequestsData]); + }, [selectedTaskSummary, selectedSessionId, sessionState.data, taskState.data, taskSummaries, workspaceId]); + const rawProjects = useMemo(() => groupProjects(workspaceRepos, tasks), [tasks, workspaceRepos]); const appSnapshot = useMockAppSnapshot(); - const currentUser = activeMockUser(appSnapshot); const activeOrg = activeMockOrganization(appSnapshot); - const liveGithub = organizationState.data?.github ?? activeOrg?.github ?? null; const navigateToUsage = useCallback(() => { if (activeOrg) { void navigate({ to: "/organizations/$organizationId/billing" as never, params: { organizationId: activeOrg.id } as never }); } }, [activeOrg, navigate]); - const [repositoryOrder, setRepositoryOrder] = useState(null); - const repositories = useMemo(() => { - if (!repositoryOrder) return rawRepositories; - const byId = new Map(rawRepositories.map((p) => [p.id, p])); - const ordered = repositoryOrder.map((id) => byId.get(id)).filter(Boolean) as typeof rawRepositories; - for (const p of rawRepositories) { - if (!repositoryOrder.includes(p.id)) ordered.push(p); + const [projectOrder, setProjectOrder] = useState(null); + const projects = useMemo(() => { + if (!projectOrder) return rawProjects; + const byId = new Map(rawProjects.map((p) => [p.id, p])); + const ordered = projectOrder.map((id) => byId.get(id)).filter(Boolean) as typeof rawProjects; + for (const p of rawProjects) { + if (!projectOrder.includes(p.id)) ordered.push(p); } return ordered; - }, [rawRepositories, repositoryOrder]); - const [activeSessionIdByTask, setActiveSessionIdByTask] = useState>({}); - const [lastAgentSessionIdByTask, setLastAgentSessionIdByTask] = useState>({}); + }, [rawProjects, projectOrder]); + const [activeTabIdByTask, setActiveTabIdByTask] = useState>({}); + const [lastAgentTabIdByTask, setLastAgentTabIdByTask] = useState>({}); const 
[openDiffsByTask, setOpenDiffsByTask] = useState>({}); const [selectedNewTaskRepoId, setSelectedNewTaskRepoId] = useState(""); const [leftWidth, setLeftWidth] = useState(() => readStoredWidth(LEFT_WIDTH_STORAGE_KEY, LEFT_SIDEBAR_DEFAULT_WIDTH)); @@ -1461,28 +1103,28 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } peekTimeoutRef.current = setTimeout(() => setLeftSidebarPeeking(false), 200); }, []); - const reorderRepositories = useCallback( + const reorderProjects = useCallback( (fromIndex: number, toIndex: number) => { - const ids = repositories.map((p) => p.id); + const ids = projects.map((p) => p.id); const [moved] = ids.splice(fromIndex, 1); ids.splice(toIndex, 0, moved!); - setRepositoryOrder(ids); + setProjectOrder(ids); }, - [repositories], + [projects], ); - const [taskOrderByRepository, setTaskOrderByRepository] = useState>({}); + const [taskOrderByProject, setTaskOrderByProject] = useState>({}); const reorderTasks = useCallback( - (repositoryId: string, fromIndex: number, toIndex: number) => { - const repository = repositories.find((p) => p.id === repositoryId); - if (!repository) return; - const currentOrder = taskOrderByRepository[repositoryId] ?? repository.tasks.map((t) => t.id); + (projectId: string, fromIndex: number, toIndex: number) => { + const project = projects.find((p) => p.id === projectId); + if (!project) return; + const currentOrder = taskOrderByProject[projectId] ?? 
project.tasks.map((t) => t.id); const ids = [...currentOrder]; const [moved] = ids.splice(fromIndex, 1); ids.splice(toIndex, 0, moved!); - setTaskOrderByRepository((prev) => ({ ...prev, [repositoryId]: ids })); + setTaskOrderByProject((prev) => ({ ...prev, [projectId]: ids })); }, - [repositories, taskOrderByRepository], + [projects, taskOrderByProject], ); useEffect(() => { @@ -1514,12 +1156,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } startRightRef.current = rightWidthRef.current; }, []); - const activeTask = useMemo(() => { - if (selectedTaskId) { - return tasks.find((task) => task.id === selectedTaskId) ?? tasks[0] ?? null; - } - return tasks[0] ?? null; - }, [selectedTaskId, tasks]); + const activeTask = useMemo(() => tasks.find((task) => task.id === selectedTaskId) ?? tasks[0] ?? null, [tasks, selectedTaskId]); useEffect(() => { if (activeTask) { @@ -1534,38 +1171,33 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } const fallbackTask = tasks.find((task) => task.id === fallbackTaskId) ?? null; void navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId: fallbackTaskId, }, - search: { sessionId: fallbackTask?.sessions[0]?.id ?? undefined }, + search: { sessionId: fallbackTask?.tabs[0]?.id ?? undefined }, replace: true, }); - }, [activeTask, navigate, tasks, organizationId]); + }, [activeTask, tasks, navigate, workspaceId]); const openDiffs = activeTask ? sanitizeOpenDiffs(activeTask, openDiffsByTask[activeTask.id]) : []; - const lastAgentSessionId = activeTask ? sanitizeLastAgentSessionId(activeTask, lastAgentSessionIdByTask[activeTask.id]) : null; - const activeSessionId = activeTask - ? sanitizeActiveSessionId(activeTask, activeSessionIdByTask[activeTask.id] ?? activeTask.activeSessionId ?? 
null, openDiffs, lastAgentSessionId) - : null; - const selectedSessionHydrating = Boolean( - selectedSessionId && activeSessionId === selectedSessionId && sessionState.status === "loading" && !sessionState.data, - ); + const lastAgentTabId = activeTask ? sanitizeLastAgentTabId(activeTask, lastAgentTabIdByTask[activeTask.id]) : null; + const activeTabId = activeTask ? sanitizeActiveTabId(activeTask, activeTabIdByTask[activeTask.id], openDiffs, lastAgentTabId) : null; const syncRouteSession = useCallback( (taskId: string, sessionId: string | null, replace = false) => { void navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId, }, search: { sessionId: sessionId ?? undefined }, ...(replace ? { replace: true } : {}), }); }, - [navigate, organizationId], + [navigate, workspaceId], ); useEffect(() => { @@ -1573,7 +1205,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } return; } - const resolvedRouteSessionId = sanitizeLastAgentSessionId(activeTask, selectedSessionId); + const resolvedRouteSessionId = sanitizeLastAgentTabId(activeTask, selectedSessionId); if (!resolvedRouteSessionId) { return; } @@ -1583,15 +1215,15 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } return; } - if (lastAgentSessionIdByTask[activeTask.id] === resolvedRouteSessionId) { + if (lastAgentTabIdByTask[activeTask.id] === resolvedRouteSessionId) { return; } - setLastAgentSessionIdByTask((current) => ({ + setLastAgentTabIdByTask((current) => ({ ...current, [activeTask.id]: resolvedRouteSessionId, })); - setActiveSessionIdByTask((current) => { + setActiveTabIdByTask((current) => { const currentActive = current[activeTask.id]; if (currentActive && isDiffTab(currentActive)) { return current; @@ -1602,26 +1234,25 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } [activeTask.id]: 
resolvedRouteSessionId, }; }); - }, [activeTask, lastAgentSessionIdByTask, selectedSessionId, syncRouteSession]); + }, [activeTask, lastAgentTabIdByTask, selectedSessionId, syncRouteSession]); useEffect(() => { - const organizationRepos = organizationReposData ?? []; - if (selectedNewTaskRepoId && organizationRepos.some((repo) => repo.id === selectedNewTaskRepoId)) { + if (selectedNewTaskRepoId && workspaceRepos.some((repo) => repo.id === selectedNewTaskRepoId)) { return; } const fallbackRepoId = - activeTask?.repoId && organizationRepos.some((repo) => repo.id === activeTask.repoId) ? activeTask.repoId : (organizationRepos[0]?.id ?? ""); + activeTask?.repoId && workspaceRepos.some((repo) => repo.id === activeTask.repoId) ? activeTask.repoId : (workspaceRepos[0]?.id ?? ""); if (fallbackRepoId !== selectedNewTaskRepoId) { setSelectedNewTaskRepoId(fallbackRepoId); } - }, [activeTask?.repoId, selectedNewTaskRepoId, organizationReposData]); + }, [activeTask?.repoId, selectedNewTaskRepoId, workspaceRepos]); useEffect(() => { if (!activeTask) { return; } - if (activeTask.sessions.length > 0) { + if (activeTask.tabs.length > 0) { autoCreatingSessionForTaskRef.current.delete(activeTask.id); return; } @@ -1635,49 +1266,47 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } autoCreatingSessionForTaskRef.current.add(activeTask.id); void (async () => { try { - const { sessionId } = await taskWorkspaceClient.addSession({ repoId: activeTask.repoId, taskId: activeTask.id }); - syncRouteSession(activeTask.id, sessionId, true); + const { tabId } = await taskWorkbenchClient.addTab({ taskId: activeTask.id }); + syncRouteSession(activeTask.id, tabId, true); } catch (error) { logger.error( { taskId: activeTask.id, ...createErrorContext(error), }, - "failed_to_auto_create_workspace_session", + "failed_to_auto_create_workbench_session", ); // Keep the guard in the set on error to prevent retry storms. 
- // The guard is cleared when sessions appear (line above) or the task changes. + // The guard is cleared when tabs appear (line above) or the task changes. } })(); - }, [activeTask, selectedSessionId, syncRouteSession, taskWorkspaceClient]); + }, [activeTask, selectedSessionId, syncRouteSession, taskWorkbenchClient]); const createTask = useCallback( - (overrideRepoId?: string, options?: { title?: string; task?: string; branch?: string; onBranch?: string }) => { + (overrideRepoId?: string) => { void (async () => { const repoId = overrideRepoId || selectedNewTaskRepoId; if (!repoId) { throw new Error("Cannot create a task without an available repo"); } - const { taskId, sessionId } = await taskWorkspaceClient.createTask({ + const { taskId, tabId } = await taskWorkbenchClient.createTask({ repoId, - task: options?.task ?? "New task", - model: currentUser?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID, - title: options?.title ?? "New task", - ...(options?.branch ? { branch: options.branch } : {}), - ...(options?.onBranch ? { onBranch: options.onBranch } : {}), + task: "New task", + model: "gpt-4o", + title: "New task", }); await navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId, }, - search: { sessionId: sessionId ?? undefined }, + search: { sessionId: tabId ?? 
undefined }, }); })(); }, - [currentUser?.defaultModel, navigate, selectedNewTaskRepoId, taskWorkspaceClient, organizationId], + [navigate, selectedNewTaskRepoId, workspaceId], ); const openDiffTab = useCallback( @@ -1696,7 +1325,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } [activeTask.id]: [...existing, path], }; }); - setActiveSessionIdByTask((current) => ({ + setActiveTabIdByTask((current) => ({ ...current, [activeTask.id]: diffTabId(path), })); @@ -1708,27 +1337,20 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } (id: string) => { const task = tasks.find((candidate) => candidate.id === id) ?? null; void navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId: id, }, - search: { sessionId: task?.sessions[0]?.id ?? undefined }, + search: { sessionId: task?.tabs[0]?.id ?? undefined }, }); }, - [navigate, tasks, organizationId], + [tasks, navigate, workspaceId], ); - const markTaskUnread = useCallback( - (id: string) => { - const task = tasks.find((candidate) => candidate.id === id); - if (!task) { - return; - } - void taskWorkspaceClient.markTaskUnread({ repoId: task.repoId, taskId: id }); - }, - [tasks], - ); + const markTaskUnread = useCallback((id: string) => { + void taskWorkbenchClient.markTaskUnread({ taskId: id }); + }, []); const renameTask = useCallback( (id: string) => { @@ -1747,39 +1369,45 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } return; } - void taskWorkspaceClient.renameTask({ repoId: currentTask.repoId, taskId: id, value: trimmedTitle }); + void taskWorkbenchClient.renameTask({ taskId: id, value: trimmedTitle }); }, [tasks], ); - const changeOwner = useCallback( - (member: { id: string; name: string; email: string }) => { - if (!activeTask) { - throw new Error("Cannot change owner without an active task"); + const renameBranch 
= useCallback( + (id: string) => { + const currentTask = tasks.find((task) => task.id === id); + if (!currentTask) { + throw new Error(`Unable to rename missing task ${id}`); } - void taskWorkspaceClient.changeOwner({ - repoId: activeTask.repoId, - taskId: activeTask.id, - targetUserId: member.id, - targetUserName: member.name, - targetUserEmail: member.email, - }); + + const nextBranch = window.prompt("Rename branch", currentTask.branch ?? ""); + if (nextBranch === null) { + return; + } + + const trimmedBranch = nextBranch.trim(); + if (!trimmedBranch) { + return; + } + + void taskWorkbenchClient.renameBranch({ taskId: id, value: trimmedBranch }); }, - [activeTask], + [tasks], ); const archiveTask = useCallback(() => { if (!activeTask) { throw new Error("Cannot archive without an active task"); } - void taskWorkspaceClient.archiveTask({ repoId: activeTask.repoId, taskId: activeTask.id }); + void taskWorkbenchClient.archiveTask({ taskId: activeTask.id }); }, [activeTask]); const publishPr = useCallback(() => { if (!activeTask) { throw new Error("Cannot publish PR without an active task"); } - void taskWorkspaceClient.publishPr({ repoId: activeTask.repoId, taskId: activeTask.id }); + void taskWorkbenchClient.publishPr({ taskId: activeTask.id }); }, [activeTask]); const revertFile = useCallback( @@ -1791,21 +1419,20 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } ...current, [activeTask.id]: sanitizeOpenDiffs(activeTask, current[activeTask.id]).filter((candidate) => candidate !== path), })); - setActiveSessionIdByTask((current) => ({ + setActiveTabIdByTask((current) => ({ ...current, [activeTask.id]: current[activeTask.id] === diffTabId(path) - ? sanitizeLastAgentSessionId(activeTask, lastAgentSessionIdByTask[activeTask.id]) + ? sanitizeLastAgentTabId(activeTask, lastAgentTabIdByTask[activeTask.id]) : (current[activeTask.id] ?? 
null), })); - void taskWorkspaceClient.revertFile({ - repoId: activeTask.repoId, + void taskWorkbenchClient.revertFile({ taskId: activeTask.id, path, }); }, - [activeTask, lastAgentSessionIdByTask], + [activeTask, lastAgentTabIdByTask], ); const isDesktop = !!import.meta.env.VITE_DESKTOP; @@ -1894,20 +1521,19 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } >
void taskWorkspaceClient.adminReloadGithubOrganization()}
-          onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)}
           onToggleSidebar={() => setLeftSidebarOpen(false)}
         />
@@ -1949,7 +1575,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } gap: "12px", }} > - {liveGithub?.syncStatus === "syncing" || liveGithub?.syncStatus === "pending" ? ( + {activeOrg?.github.syncStatus === "syncing" || activeOrg?.github.syncStatus === "pending" ? ( <>

Syncing with GitHub

- {liveGithub.lastSyncLabel || `Importing repos from @${liveGithub.connectedAccount || "GitHub"}...`} - {(liveGithub.totalRepositoryCount ?? 0) > 0 && ( - <> - {" "} - {liveGithub.syncPhase === "syncing_repositories" - ? `${liveGithub.importedRepoCount} of ${liveGithub.totalRepositoryCount} repos imported so far.` - : `${liveGithub.processedRepositoryCount} of ${liveGithub.totalRepositoryCount} repos processed in ${liveGithub.syncPhase?.replace(/^syncing_/, "").replace(/_/g, " ") ?? "sync"}.`} - - )} + Importing repos from @{activeOrg.github.connectedAccount || "GitHub"}... + {activeOrg.github.importedRepoCount > 0 && <> {activeOrg.github.importedRepoCount} repos imported so far.}

- ) : liveGithub?.syncStatus === "error" ? ( + ) : activeOrg?.github.syncStatus === "error" ? ( <>

GitHub sync failed

There was a problem syncing repos from GitHub. Check the dev panel for details.

@@ -1990,22 +1609,22 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } <>

Create your first task

- {organizationRepos.length > 0 + {workspaceRepos.length > 0 ? "Start from the sidebar to create a task on the first available repo." - : "No repos are available in this organization yet."} + : "No repos are available in this workspace yet."}

- {liveGithub && } + {activeOrg && (activeOrg.github.installationStatus === "install_required" || activeOrg.github.installationStatus === "reconnect_required") && ( +
+ + + GitHub App {activeOrg.github.installationStatus === "install_required" ? "not installed" : "needs reconnection"} — repo sync is unavailable + +
+ )} {showDevPanel && ( )} @@ -2065,20 +1716,19 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } >
void taskWorkspaceClient.adminReloadGithubOrganization()}
-          onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)}
           onToggleSidebar={() => setLeftSidebarOpen(false)}
         />
@@ -2114,10 +1764,10 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
           onMouseLeave={endPeek}
         >
 {
             selectTask(id);
             setLeftSidebarPeeking(false);
@@ -2126,11 +1776,10 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
           onSelectNewTaskRepo={setSelectedNewTaskRepoId}
           onMarkUnread={markTaskUnread}
           onRenameTask={renameTask}
-          onReorderRepositories={reorderRepositories}
-          taskOrderByRepository={taskOrderByRepository}
+          onRenameBranch={renameBranch}
+          onReorderProjects={reorderProjects}
+          taskOrderByProject={taskOrderByProject}
           onReorderTasks={reorderTasks}
-          onReloadOrganization={() => void taskWorkspaceClient.adminReloadGithubOrganization()}
-          onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)}
           onToggleSidebar={() => {
             setLeftSidebarPeeking(false);
             setLeftSidebarOpen(true);
@@ -2143,19 +1792,17 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
         {leftSidebarOpen ?  : null}
 {
-            setActiveSessionIdByTask((current) => ({ ...current, [activeTask.id]: sessionId }));
+          onSetActiveTabId={(tabId) => {
+            setActiveTabIdByTask((current) => ({ ...current, [activeTask.id]: tabId }));
           }}
-          onSetLastAgentSessionId={(sessionId) => {
-            setLastAgentSessionIdByTask((current) => ({ ...current, [activeTask.id]: sessionId }));
+          onSetLastAgentTabId={(tabId) => {
+            setLastAgentTabIdByTask((current) => ({ ...current, [activeTask.id]: tabId }));
           }}
           onSetOpenDiffs={(paths) => {
             setOpenDiffsByTask((current) => ({ ...current, [activeTask.id]: paths }));
@@ -2169,7 +1816,6 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
           onSidebarPeekEnd={endPeek}
           rightSidebarCollapsed={!rightSidebarOpen}
           onToggleRightSidebar={() => setRightSidebarOpen(true)}
-          selectedSessionHydrating={selectedSessionHydrating}
           onNavigateToUsage={navigateToUsage}
         />
@@ -2187,48 +1833,57 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } >
setRightSidebarOpen(false)} />
- {liveGithub && } + {activeOrg && (activeOrg.github.installationStatus === "install_required" || activeOrg.github.installationStatus === "reconnect_required") && ( +
+ + + GitHub App {activeOrg.github.installationStatus === "install_required" ? "not installed" : "needs reconnection"} — repo sync is unavailable + +
+        )}
         {showDevPanel && (
 ({
-              id: tab.id,
-              sessionId: tab.sessionId ?? null,
-              sessionName: tab.sessionName ?? tab.id,
-              agent: tab.agent,
-              model: tab.model,
-              status: tab.status,
-              thinkingSinceMs: tab.thinkingSinceMs ?? null,
-              unread: tab.unread ?? false,
-              created: tab.created ?? false,
-            })) ?? [],
-          }}
           />
         )}
diff --git a/foundry/packages/frontend/src/components/mock-layout/history-minimap.tsx b/foundry/packages/frontend/src/components/mock-layout/history-minimap.tsx
index ae2443d..3c48a27 100644
--- a/foundry/packages/frontend/src/components/mock-layout/history-minimap.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/history-minimap.tsx
@@ -97,7 +97,7 @@ export const HistoryMinimap = memo(function HistoryMinimap({ events, onSelect }:
         className={css({
           appearance: "none",
           WebkitAppearance: "none",
-          backgroundColor: "transparent",
+          background: "none",
           border: "none",
           margin: "0",
           padding: "6px 8px",
@@ -110,7 +110,7 @@ export const HistoryMinimap = memo(function HistoryMinimap({ events, onSelect }:
           overflow: "hidden",
           textOverflow: "ellipsis",
           whiteSpace: "nowrap",
-          transition: "background-color 160ms ease, color 160ms ease",
+          transition: "background 160ms ease, color 160ms ease",
           ":hover": {
             backgroundColor: t.interactiveHover,
             color: t.textPrimary,
diff --git a/foundry/packages/frontend/src/components/mock-layout/message-list.tsx b/foundry/packages/frontend/src/components/mock-layout/message-list.tsx
index 499e6cd..7068268 100644
--- a/foundry/packages/frontend/src/components/mock-layout/message-list.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/message-list.tsx
@@ -1,19 +1,5 @@
-import { AgentTranscript as AgentTranscript_, type AgentTranscriptClassNames, type TranscriptEntry } from "@sandbox-agent/react";
-import { memo, useEffect, useMemo, type MutableRefObject, type RefObject } from "react";
-
-// Cast needed: tsup-generated .d.ts returns react_jsx_runtime.JSX.Element which
-// doesn't unify with the consumer's
JSX.Element under Bundler moduleResolution.
-// eslint-disable-next-line @typescript-eslint/no-explicit-any
-const AgentTranscript = AgentTranscript_ as any as React.FC<{
-  entries: TranscriptEntry[];
-  classNames?: Partial;
-  scrollRef?: RefObject;
-  scrollToEntryId?: string | null;
-  virtualize?: boolean;
-  isThinking?: boolean;
-  renderMessageText?: (entry: TranscriptEntry) => React.ReactNode;
-  renderThinkingState?: () => React.ReactNode;
-}>;
+import { AgentTranscript, type AgentTranscriptClassNames, type TranscriptEntry } from "@sandbox-agent/react";
+import { memo, useMemo, type MutableRefObject, type Ref } from "react";
 import { useStyletron } from "baseui";
 import { LabelSmall, LabelXSmall } from "baseui/typography";
 import { Copy } from "lucide-react";
@@ -21,22 +7,18 @@ import { Copy } from "lucide-react";
 import { useFoundryTokens } from "../../app/theme";
 import { HistoryMinimap } from "./history-minimap";
 import { SpinnerDot } from "./ui";
-import { buildDisplayMessages, formatMessageDuration, formatMessageTimestamp, type AgentSession, type HistoryEvent, type Message } from "./view-model";
+import { buildDisplayMessages, formatMessageDuration, formatMessageTimestamp, type AgentTab, type HistoryEvent, type Message } from "./view-model";
 
 const TranscriptMessageBody = memo(function TranscriptMessageBody({
   message,
   messageRefs,
   copiedMessageId,
   onCopyMessage,
-  isTarget,
-  onTargetRendered,
 }: {
   message: Message;
   messageRefs: MutableRefObject>;
   copiedMessageId: string | null;
   onCopyMessage: (message: Message) => void;
-  isTarget?: boolean;
-  onTargetRendered?: () => void;
 }) {
   const [css] = useStyletron();
   const t = useFoundryTokens();
@@ -45,20 +27,6 @@ const TranscriptMessageBody = memo(function TranscriptMessageBody({
   const messageTimestamp = formatMessageTimestamp(message.createdAtMs);
   const displayFooter = isUser ? messageTimestamp : message.durationMs ? `${messageTimestamp} • Took ${formatMessageDuration(message.durationMs)}` : null;
 
-  useEffect(() => {
-    if (!isTarget) {
-      return;
-    }
-
-    const targetNode = messageRefs.current.get(message.id);
-    if (!targetNode) {
-      return;
-    }
-
-    targetNode.scrollIntoView({ behavior: "smooth", block: "center" });
-    onTargetRendered?.();
-  }, [isTarget, message.id, messageRefs, onTargetRendered]);
-
   return (
 {
@@ -154,57 +122,40 @@ });
 
 export const MessageList = memo(function MessageList({
-  session,
+  tab,
   scrollRef,
   messageRefs,
   historyEvents,
   onSelectHistoryEvent,
-  targetMessageId,
-  onTargetMessageResolved,
   copiedMessageId,
   onCopyMessage,
   thinkingTimerLabel,
-  pendingMessage,
 }: {
-  session: AgentSession | null | undefined;
-  scrollRef: RefObject;
+  tab: AgentTab | null | undefined;
+  scrollRef: Ref;
   messageRefs: MutableRefObject>;
   historyEvents: HistoryEvent[];
   onSelectHistoryEvent: (event: HistoryEvent) => void;
-  targetMessageId?: string | null;
-  onTargetMessageResolved?: () => void;
   copiedMessageId: string | null;
   onCopyMessage: (message: Message) => void;
   thinkingTimerLabel: string | null;
-  pendingMessage: { text: string; sentAt: number } | null;
 }) {
   const [css] = useStyletron();
   const t = useFoundryTokens();
-  const PENDING_MESSAGE_ID = "__pending__";
-  const messages = useMemo(() => buildDisplayMessages(session), [session]);
+  const messages = useMemo(() => buildDisplayMessages(tab), [tab]);
   const messagesById = useMemo(() => new Map(messages.map((message) => [message.id, message])), [messages]);
-  const messageIndexById = useMemo(() => new Map(messages.map((message, index) => [message.id, index])), [messages]);
-  const transcriptEntries = useMemo(() => {
-    const entries: TranscriptEntry[] = messages.map((message) => ({
-      id: message.id,
-      eventId: message.id,
-      kind: "message",
-      time: new Date(message.createdAtMs).toISOString(),
-      role: message.sender === "client" ? "user" : "assistant",
-      text: message.text,
-    }));
-    if (pendingMessage) {
-      entries.push({
-        id: PENDING_MESSAGE_ID,
-        eventId: PENDING_MESSAGE_ID,
+  const transcriptEntries = useMemo(
+    () =>
+      messages.map((message) => ({
+        id: message.id,
+        eventId: message.id,
         kind: "message",
-        time: new Date(pendingMessage.sentAt).toISOString(),
-        role: "user",
-        text: pendingMessage.text,
-      });
-    }
-    return entries;
-  }, [messages, pendingMessage]);
+        time: new Date(message.createdAtMs).toISOString(),
+        role: message.sender === "client" ? "user" : "assistant",
+        text: message.text,
+      })),
+    [messages],
+  );
 
   const messageContentClass = css({
     maxWidth: "100%",
@@ -241,37 +192,6 @@
       letterSpacing: "0.01em",
     }),
   };
-  const scrollContainerClass = css({
-    padding: "16px 52px 16px 20px",
-    display: "flex",
-    flexDirection: "column",
-    flex: 1,
-    minHeight: 0,
-    overflowY: "auto",
-  });
-
-  useEffect(() => {
-    if (!targetMessageId) {
-      return;
-    }
-
-    const targetNode = messageRefs.current.get(targetMessageId);
-    if (targetNode) {
-      targetNode.scrollIntoView({ behavior: "smooth", block: "center" });
-      onTargetMessageResolved?.();
-      return;
-    }
-
-    const targetIndex = messageIndexById.get(targetMessageId);
-    if (targetIndex == null) {
-      return;
-    }
-
-    scrollRef.current?.scrollTo({
-      top: Math.max(0, targetIndex * 88),
-      behavior: "smooth",
-    });
-  }, [messageIndexById, messageRefs, onTargetMessageResolved, scrollRef, targetMessageId]);
 
   return (
     <>
@@ -281,8 +201,18 @@
       }
     `}
       {historyEvents.length > 0 ?  : null}
-
- {session && transcriptEntries.length === 0 ? ( +
+ {tab && transcriptEntries.length === 0 ? (
- {!session?.created ? "Choose an agent and model, then send your first message" : "No messages yet in this session"} + {!tab.created ? "Choose an agent and model, then send your first message" : "No messages yet in this session"}
) : ( { - if (entry.id === PENDING_MESSAGE_ID && pendingMessage) { - const pendingMsg: Message = { - id: PENDING_MESSAGE_ID, - sender: "client", - text: pendingMessage.text, - createdAtMs: pendingMessage.sentAt, - event: { - id: PENDING_MESSAGE_ID, - eventIndex: -1, - sessionId: "", - connectionId: "", - sender: "client", - createdAt: pendingMessage.sentAt, - payload: {}, - }, - }; - return ( -
- -
- ); - } + renderMessageText={(entry) => { const message = messagesById.get(entry.id); if (!message) { return null; } - return ( - - ); + return ; }} - isThinking={Boolean((session && session.status === "running" && transcriptEntries.length > 0) || pendingMessage)} + isThinking={Boolean(tab && tab.status === "running" && transcriptEntries.length > 0)} renderThinkingState={() => (
diff --git a/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx b/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx index 6ec6ea6..7327f79 100644 --- a/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx +++ b/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx @@ -2,21 +2,18 @@ import { memo, useState } from "react"; import { useStyletron } from "baseui"; import { StatefulPopover, PLACEMENT } from "baseui/popover"; import { ChevronUp, Star } from "lucide-react"; -import { workspaceModelLabel, type WorkspaceModelGroup } from "@sandbox-agent/foundry-shared"; import { useFoundryTokens } from "../../app/theme"; import { AgentIcon } from "./ui"; -import { type ModelId } from "./view-model"; +import { MODEL_GROUPS, modelLabel, providerAgent, type ModelId } from "./view-model"; const ModelPickerContent = memo(function ModelPickerContent({ - groups, value, defaultModel, onChange, onSetDefault, close, }: { - groups: WorkspaceModelGroup[]; value: ModelId; defaultModel: ModelId; onChange: (id: ModelId) => void; @@ -29,7 +26,7 @@ const ModelPickerContent = memo(function ModelPickerContent({ return (
- {groups.map((group) => ( + {MODEL_GROUPS.map((group) => (
void; @@ -142,9 +137,7 @@ export const ModelPicker = memo(function ModelPicker({ }, }, }} - content={({ close }) => ( - - )} + content={({ close }) => } >
diff --git a/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx b/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx index b7e27be..08d72ae 100644 --- a/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx +++ b/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx @@ -2,7 +2,6 @@ import { memo, type Ref } from "react"; import { useStyletron } from "baseui"; import { ChatComposer, type ChatComposerClassNames } from "@sandbox-agent/react"; import { FileCode, SendHorizonal, Square, X } from "lucide-react"; -import { type WorkspaceModelGroup } from "@sandbox-agent/foundry-shared"; import { useFoundryTokens } from "../../app/theme"; import { ModelPicker } from "./model-picker"; @@ -14,7 +13,6 @@ export const PromptComposer = memo(function PromptComposer({ textareaRef, placeholder, attachments, - modelGroups, defaultModel, model, isRunning, @@ -29,7 +27,6 @@ export const PromptComposer = memo(function PromptComposer({ textareaRef: Ref; placeholder: string; attachments: LineAttachment[]; - modelGroups: WorkspaceModelGroup[]; defaultModel: ModelId; model: ModelId; isRunning: boolean; @@ -175,7 +172,7 @@ export const PromptComposer = memo(function PromptComposer({ renderSubmitContent={() => (isRunning ? : )} renderFooter={() => (
- +
)} /> diff --git a/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx b/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx index 3565b44..4adddc4 100644 --- a/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx +++ b/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx @@ -1,21 +1,7 @@ -import { memo, useCallback, useMemo, useRef, useState, type MouseEvent } from "react"; +import { memo, useCallback, useMemo, useState, type MouseEvent } from "react"; import { useStyletron } from "baseui"; -import { LabelSmall, LabelXSmall } from "baseui/typography"; -import { - Archive, - ArrowUpFromLine, - ChevronDown, - ChevronRight, - FileCode, - FilePlus, - FileX, - FolderOpen, - ExternalLink, - GitBranch, - GitPullRequest, - PanelRight, - User, -} from "lucide-react"; +import { LabelSmall } from "baseui/typography"; +import { Archive, ArrowUpFromLine, ChevronRight, FileCode, FilePlus, FileX, FolderOpen, GitPullRequest, PanelRight } from "lucide-react"; import { useFoundryTokens } from "../../app/theme"; import { createErrorContext } from "@sandbox-agent/foundry-shared"; @@ -69,9 +55,7 @@ const FileTree = memo(function FileTree({ display: "flex", alignItems: "center", gap: "4px", - paddingTop: "3px", - paddingRight: "10px", - paddingBottom: "3px", + padding: "3px 10px", paddingLeft: `${10 + depth * 16}px`, cursor: "pointer", fontSize: "12px", @@ -108,28 +92,24 @@ const FileTree = memo(function FileTree({ export const RightSidebar = memo(function RightSidebar({ task, - activeSessionId, + activeTabId, onOpenDiff, onArchive, onRevertFile, onPublishPr, - onChangeOwner, - members, onToggleSidebar, }: { task: Task; - activeSessionId: string | null; + activeTabId: string | null; onOpenDiff: (path: string) => void; onArchive: () => void; onRevertFile: (path: string) => void; onPublishPr: () => void; - onChangeOwner: (member: { id: string; name: string; email: string }) => void; - members: Array<{ id: string; 
name: string; email: string }>; onToggleSidebar?: () => void; }) { const [css] = useStyletron(); const t = useFoundryTokens(); - const [rightTab, setRightTab] = useState<"overview" | "changes" | "files">("overview"); + const [rightTab, setRightTab] = useState<"changes" | "files">("changes"); const contextMenu = useContextMenu(); const changedPaths = useMemo(() => new Set(task.fileChanges.map((file) => file.path)), [task.fileChanges]); const isTerminal = task.status === "archived"; @@ -143,9 +123,7 @@ export const RightSidebar = memo(function RightSidebar({ }); observer.observe(node); }, []); - const [ownerDropdownOpen, setOwnerDropdownOpen] = useState(false); - const ownerDropdownRef = useRef(null); - const pullRequestUrl = task.pullRequest?.url ?? null; + const pullRequestUrl = task.pullRequest != null ? `https://github.com/${task.repoName}/pull/${task.pullRequest.number}` : null; const copyFilePath = useCallback(async (path: string) => { try { @@ -197,7 +175,7 @@ export const RightSidebar = memo(function RightSidebar({ className={css({ appearance: "none", WebkitAppearance: "none", - backgroundColor: "transparent", + background: "none", border: "none", margin: "0", boxSizing: "border-box", @@ -224,7 +202,7 @@ export const RightSidebar = memo(function RightSidebar({ className={css({ appearance: "none", WebkitAppearance: "none", - backgroundColor: "transparent", + background: "none", border: "none", margin: "0", boxSizing: "border-box", @@ -252,7 +230,7 @@ export const RightSidebar = memo(function RightSidebar({ className={css({ appearance: "none", WebkitAppearance: "none", - backgroundColor: "transparent", + background: "none", border: "none", margin: "0", boxSizing: "border-box", @@ -329,51 +307,22 @@ export const RightSidebar = memo(function RightSidebar({ borderTopRightRadius: "12px", })} > -
- {rightTab === "overview" ? ( -
-
- - Owner - -
-
setOwnerDropdownOpen((prev) => !prev)} - onKeyDown={(event) => { - if (event.key === "Enter" || event.key === " ") setOwnerDropdownOpen((prev) => !prev); - }} - className={css({ - display: "flex", - alignItems: "center", - gap: "10px", - paddingTop: "4px", - paddingRight: "8px", - paddingBottom: "4px", - paddingLeft: "4px", - borderRadius: "6px", - cursor: "pointer", - ":hover": { backgroundColor: t.interactiveHover }, - })} - > - {task.primaryUserLogin ? ( - <> - {task.primaryUserAvatarUrl ? ( - {task.primaryUserLogin} - ) : ( -
- -
- )} - - {task.primaryUserLogin} - - - ) : ( - <> -
- -
- - No owner assigned - - - )} - -
- {ownerDropdownOpen ? ( - <> -
setOwnerDropdownOpen(false)} - className={css({ position: "fixed", top: 0, left: 0, right: 0, bottom: 0, zIndex: 99 })} - /> -
- {members.map((member) => ( -
{ - onChangeOwner(member); - setOwnerDropdownOpen(false); - }} - onKeyDown={(event) => { - if (event.key === "Enter" || event.key === " ") { - onChangeOwner(member); - setOwnerDropdownOpen(false); - } - }} - className={css({ - display: "flex", - alignItems: "center", - gap: "8px", - paddingTop: "6px", - paddingRight: "12px", - paddingBottom: "6px", - paddingLeft: "12px", - cursor: "pointer", - fontSize: "12px", - color: t.textPrimary, - ":hover": { backgroundColor: t.interactiveHover }, - })} - > -
- -
- {member.name} -
- ))} - {members.length === 0 ? ( -
- No members -
- ) : null} -
- - ) : null} -
-
-
- - Branch - -
- - - {task.branch ?? "No branch"} - -
-
-
- - Repository - - {task.repoName} -
- {task.pullRequest ? ( -
- - Pull Request - -
- - - #{task.pullRequest.number} {task.pullRequest.title ?? ""} - -
-
- ) : null} - {task.sandboxes?.find((s) => s.sandboxId === task.activeSandboxId)?.url ? ( - - ) : null} -
- ) : rightTab === "changes" ? ( + {rightTab === "changes" ? (
{task.fileChanges.length === 0 ? (
@@ -688,7 +399,7 @@ export const RightSidebar = memo(function RightSidebar({
) : null} {task.fileChanges.map((file) => { - const isActive = activeSessionId === diffTabId(file.path); + const isActive = activeTabId === diffTabId(file.path); const TypeIcon = file.type === "A" ? FilePlus : file.type === "D" ? FileX : FileCode; const iconColor = file.type === "A" ? t.statusSuccess : file.type === "D" ? t.statusError : t.textTertiary; return ( diff --git a/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx b/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx index 6ebb026..0f8f688 100644 --- a/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx +++ b/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx @@ -1,7 +1,6 @@ import { memo, useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState } from "react"; import { createPortal } from "react-dom"; import { useNavigate } from "@tanstack/react-router"; -import { useVirtualizer } from "@tanstack/react-virtual"; import { useStyletron } from "baseui"; import { LabelSmall, LabelXSmall } from "baseui/typography"; import { Select, type Value } from "baseui/select"; @@ -14,20 +13,19 @@ import { GitPullRequestDraft, ListChecks, LogOut, - MoreHorizontal, PanelLeft, Plus, Settings, User, } from "lucide-react"; -import { formatRelativeAge, type Task, type RepositorySection } from "./view-model"; +import { formatRelativeAge, type Task, type ProjectSection } from "./view-model"; import { ContextMenuOverlay, TaskIndicator, PanelHeaderBar, SPanel, ScrollBody, useContextMenu } from "./ui"; import { activeMockOrganization, eligibleOrganizations, useMockAppClient, useMockAppSnapshot } from "../../lib/mock-app"; import { useFoundryTokens } from "../../app/theme"; import type { FoundryTokens } from "../../styles/tokens"; -const REPOSITORY_COLORS = ["#6366f1", "#f59e0b", "#10b981", "#ef4444", "#8b5cf6", "#ec4899", "#06b6d4", "#f97316"]; +const PROJECT_COLORS = ["#6366f1", "#f59e0b", "#10b981", "#ef4444", "#8b5cf6", "#ec4899", "#06b6d4", "#f97316"]; /** Strip 
the org prefix (e.g. "rivet-dev/") when all repos share the same org. */ function stripCommonOrgPrefix(label: string, repos: Array<{ label: string }>): string { @@ -40,22 +38,22 @@ function stripCommonOrgPrefix(label: string, repos: Array<{ label: string }>): s return label; } -function repositoryInitial(label: string): string { +function projectInitial(label: string): string { const parts = label.split("/"); const name = parts[parts.length - 1] ?? label; return name.charAt(0).toUpperCase(); } -function repositoryIconColor(label: string): string { +function projectIconColor(label: string): string { let hash = 0; for (let i = 0; i < label.length; i++) { hash = (hash * 31 + label.charCodeAt(i)) | 0; } - return REPOSITORY_COLORS[Math.abs(hash) % REPOSITORY_COLORS.length]!; + return PROJECT_COLORS[Math.abs(hash) % PROJECT_COLORS.length]!; } export const Sidebar = memo(function Sidebar({ - repositories, + projects, newTaskRepos, selectedNewTaskRepoId, activeId, @@ -64,14 +62,13 @@ export const Sidebar = memo(function Sidebar({ onSelectNewTaskRepo, onMarkUnread, onRenameTask, - onReorderRepositories, - taskOrderByRepository, + onRenameBranch, + onReorderProjects, + taskOrderByProject, onReorderTasks, - onReloadOrganization, - onReloadRepository, onToggleSidebar, }: { - repositories: RepositorySection[]; + projects: ProjectSection[]; newTaskRepos: Array<{ id: string; label: string }>; selectedNewTaskRepoId: string; activeId: string; @@ -80,26 +77,22 @@ export const Sidebar = memo(function Sidebar({ onSelectNewTaskRepo: (repoId: string) => void; onMarkUnread: (id: string) => void; onRenameTask: (id: string) => void; - onReorderRepositories: (fromIndex: number, toIndex: number) => void; - taskOrderByRepository: Record; - onReorderTasks: (repositoryId: string, fromIndex: number, toIndex: number) => void; - onReloadOrganization: () => void; - onReloadRepository: (repoId: string) => void; + onRenameBranch: (id: string) => void; + onReorderProjects: (fromIndex: number, toIndex: 
number) => void; + taskOrderByProject: Record; + onReorderTasks: (projectId: string, fromIndex: number, toIndex: number) => void; onToggleSidebar?: () => void; }) { const [css] = useStyletron(); const t = useFoundryTokens(); const contextMenu = useContextMenu(); - const [collapsedRepositories, setCollapsedRepositories] = useState>({}); - const [hoveredRepositoryId, setHoveredRepositoryId] = useState(null); - const [headerMenuOpen, setHeaderMenuOpen] = useState(false); - const headerMenuRef = useRef(null); - const scrollRef = useRef(null); + const [collapsedProjects, setCollapsedProjects] = useState>({}); + const [hoveredProjectId, setHoveredProjectId] = useState(null); // Mouse-based drag and drop state type DragState = - | { type: "repository"; fromIdx: number; overIdx: number | null } - | { type: "task"; repositoryId: string; fromIdx: number; overIdx: number | null } + | { type: "project"; fromIdx: number; overIdx: number | null } + | { type: "task"; projectId: string; fromIdx: number; overIdx: number | null } | null; const [drag, setDrag] = useState(null); const dragRef = useRef(null); @@ -113,19 +106,19 @@ export const Sidebar = memo(function Sidebar({ // Detect which element is under the cursor using data attributes const el = document.elementFromPoint(e.clientX, e.clientY); if (!el) return; - const repositoryEl = (el as HTMLElement).closest?.("[data-repository-idx]") as HTMLElement | null; + const projectEl = (el as HTMLElement).closest?.("[data-project-idx]") as HTMLElement | null; const taskEl = (el as HTMLElement).closest?.("[data-task-idx]") as HTMLElement | null; - if (drag.type === "repository" && repositoryEl) { - const overIdx = Number(repositoryEl.dataset.repositoryIdx); + if (drag.type === "project" && projectEl) { + const overIdx = Number(projectEl.dataset.projectIdx); if (overIdx !== drag.overIdx) { setDrag({ ...drag, overIdx }); dragRef.current = { ...drag, overIdx }; } } else if (drag.type === "task" && taskEl) { - const overRepositoryId = 
taskEl.dataset.taskRepositoryId ?? ""; + const overProjectId = taskEl.dataset.taskProjectId ?? ""; const overIdx = Number(taskEl.dataset.taskIdx); - if (overRepositoryId === drag.repositoryId && overIdx !== drag.overIdx) { + if (overProjectId === drag.projectId && overIdx !== drag.overIdx) { setDrag({ ...drag, overIdx }); dragRef.current = { ...drag, overIdx }; } @@ -138,10 +131,10 @@ export const Sidebar = memo(function Sidebar({ const onUp = () => { const d = dragRef.current; if (d && didDragRef.current && d.overIdx !== null && d.fromIdx !== d.overIdx) { - if (d.type === "repository") { - onReorderRepositories(d.fromIdx, d.overIdx); + if (d.type === "project") { + onReorderProjects(d.fromIdx, d.overIdx); } else { - onReorderTasks(d.repositoryId, d.fromIdx, d.overIdx); + onReorderTasks(d.projectId, d.fromIdx, d.overIdx); } } dragRef.current = null; @@ -154,70 +147,18 @@ export const Sidebar = memo(function Sidebar({ document.removeEventListener("mousemove", onMove); document.removeEventListener("mouseup", onUp); }; - }, [drag, onReorderRepositories, onReorderTasks]); - - useEffect(() => { - if (!headerMenuOpen) { - return; - } - const onMouseDown = (event: MouseEvent) => { - if (headerMenuRef.current?.contains(event.target as Node)) { - return; - } - setHeaderMenuOpen(false); - }; - document.addEventListener("mousedown", onMouseDown); - return () => document.removeEventListener("mousedown", onMouseDown); - }, [headerMenuOpen]); + }, [drag, onReorderProjects, onReorderTasks]); const [createSelectOpen, setCreateSelectOpen] = useState(false); const selectOptions = useMemo(() => newTaskRepos.map((repo) => ({ id: repo.id, label: stripCommonOrgPrefix(repo.label, newTaskRepos) })), [newTaskRepos]); - type FlatItem = - | { key: string; type: "repository-header"; repository: RepositorySection; repositoryIndex: number } - | { key: string; type: "task"; repository: RepositorySection; repositoryIndex: number; task: Task; taskIndex: number } - | { key: string; type: 
"task-drop-zone"; repository: RepositorySection; repositoryIndex: number; taskCount: number } - | { key: string; type: "repository-drop-zone"; repositoryCount: number }; - const flatItems = useMemo(() => { - const items: FlatItem[] = []; - repositories.forEach((repository, repositoryIndex) => { - items.push({ key: `repository:${repository.id}`, type: "repository-header", repository, repositoryIndex }); - if (!collapsedRepositories[repository.id]) { - const orderedTaskIds = taskOrderByRepository[repository.id]; - const orderedTasks = orderedTaskIds - ? (() => { - const byId = new Map(repository.tasks.map((t) => [t.id, t])); - const sorted = orderedTaskIds.map((id) => byId.get(id)).filter(Boolean) as typeof repository.tasks; - for (const t of repository.tasks) { - if (!orderedTaskIds.includes(t.id)) sorted.push(t); - } - return sorted; - })() - : repository.tasks; - orderedTasks.forEach((task, taskIndex) => { - items.push({ key: `task:${task.id}`, type: "task" as const, repository, repositoryIndex, task, taskIndex }); - }); - items.push({ key: `task-drop:${repository.id}`, type: "task-drop-zone", repository, repositoryIndex, taskCount: orderedTasks.length }); - } - }); - items.push({ key: "repository-drop-zone", type: "repository-drop-zone", repositoryCount: repositories.length }); - return items; - }, [collapsedRepositories, repositories, taskOrderByRepository]); - const virtualizer = useVirtualizer({ - count: flatItems.length, - getItemKey: (index) => flatItems[index]?.key ?? index, - getScrollElement: () => scrollRef.current, - estimateSize: () => 40, - overscan: 12, - measureElement: (element) => element.getBoundingClientRect().height, - }); return ( @@ -385,332 +326,255 @@ export const Sidebar = memo(function Sidebar({ />
) : ( -
- - {headerMenuOpen ? ( -
- -
- ) : null} -
{ - if (newTaskRepos.length === 0) return; +
{ + if (newTaskRepos.length === 0) return; + if (newTaskRepos.length === 1) { + onSelectNewTaskRepo(newTaskRepos[0]!.id); + onCreate(newTaskRepos[0]!.id); + } else { + setCreateSelectOpen(true); + } + }} + onKeyDown={(event) => { + if (newTaskRepos.length === 0) return; + if (event.key === "Enter" || event.key === " ") { if (newTaskRepos.length === 1) { onSelectNewTaskRepo(newTaskRepos[0]!.id); onCreate(newTaskRepos[0]!.id); } else { setCreateSelectOpen(true); } - }} - onKeyDown={(event) => { - if (newTaskRepos.length === 0) return; - if (event.key === "Enter" || event.key === " ") { - if (newTaskRepos.length === 1) { - onSelectNewTaskRepo(newTaskRepos[0]!.id); - onCreate(newTaskRepos[0]!.id); - } else { - setCreateSelectOpen(true); - } - } - }} - className={css({ - width: "26px", - height: "26px", - borderRadius: "8px", - backgroundColor: newTaskRepos.length > 0 ? t.borderMedium : t.interactiveHover, - color: t.textPrimary, - cursor: newTaskRepos.length > 0 ? "pointer" : "not-allowed", - display: "flex", - alignItems: "center", - justifyContent: "center", - transition: "background 200ms ease", - flexShrink: 0, - opacity: newTaskRepos.length > 0 ? 1 : 0.6, - ":hover": newTaskRepos.length > 0 ? { backgroundColor: "rgba(255, 255, 255, 0.20)" } : undefined, - })} - > - -
+ } + }} + className={css({ + width: "26px", + height: "26px", + borderRadius: "8px", + backgroundColor: newTaskRepos.length > 0 ? t.borderMedium : t.interactiveHover, + color: t.textPrimary, + cursor: newTaskRepos.length > 0 ? "pointer" : "not-allowed", + display: "flex", + alignItems: "center", + justifyContent: "center", + transition: "background 200ms ease", + flexShrink: 0, + opacity: newTaskRepos.length > 0 ? 1 : 0.6, + ":hover": newTaskRepos.length > 0 ? { backgroundColor: "rgba(255, 255, 255, 0.20)" } : undefined, + })} + > +
)} - -
-
- {virtualizer.getVirtualItems().map((virtualItem) => { - const item = flatItems[virtualItem.index]; - if (!item) { - return null; - } + +
+ {projects.map((project, projectIndex) => { + const isCollapsed = collapsedProjects[project.id] === true; + const isProjectDropTarget = drag?.type === "project" && drag.overIdx === projectIndex && drag.fromIdx !== projectIndex; + const isBeingDragged = drag?.type === "project" && drag.fromIdx === projectIndex && didDragRef.current; + const orderedTaskIds = taskOrderByProject[project.id]; + const orderedTasks = orderedTaskIds + ? (() => { + const byId = new Map(project.tasks.map((t) => [t.id, t])); + const sorted = orderedTaskIds.map((id) => byId.get(id)).filter(Boolean) as typeof project.tasks; + for (const t of project.tasks) { + if (!orderedTaskIds.includes(t.id)) sorted.push(t); + } + return sorted; + })() + : project.tasks; - if (item.type === "repository-header") { - const { repository, repositoryIndex } = item; - const isCollapsed = collapsedRepositories[repository.id] === true; - const isRepositoryDropTarget = drag?.type === "repository" && drag.overIdx === repositoryIndex && drag.fromIdx !== repositoryIndex; - const isBeingDragged = drag?.type === "repository" && drag.fromIdx === repositoryIndex && didDragRef.current; + return ( +
+
setHoveredProjectId(project.id)} + onMouseLeave={() => setHoveredProjectId((cur) => (cur === project.id ? null : cur))} + onMouseDown={(event) => { + if (event.button !== 0) return; + startYRef.current = event.clientY; + didDragRef.current = false; + setHoveredProjectId(null); + const state: DragState = { type: "project", fromIdx: projectIndex, overIdx: null }; + dragRef.current = state; + setDrag(state); + }} + onClick={() => { + if (!didDragRef.current) { + setCollapsedProjects((current) => ({ + ...current, + [project.id]: !current[project.id], + })); + } + }} + data-project-header + className={css({ + display: "flex", + alignItems: "center", + justifyContent: "space-between", + padding: "10px 8px 4px", + gap: "8px", + cursor: "grab", + userSelect: "none", + })} + > +
+
+ + {projectInitial(project.label)} + + + {isCollapsed ? : } + +
+ + {stripCommonOrgPrefix(project.label, projects)} + +
+
+ {isCollapsed ? {formatRelativeAge(project.updatedAtMs)} : null} + +
+
- return ( -
{ - if (node) { - virtualizer.measureElement(node); - } - }} - style={{ - left: 0, - position: "absolute", - top: 0, - transform: `translateY(${virtualItem.start}px)`, - width: "100%", - opacity: isBeingDragged ? 0.4 : 1, - transition: "opacity 150ms ease", - }} - > - {isRepositoryDropTarget ? ( -
- ) : null} -
+ {!isCollapsed && + orderedTasks.map((task, taskIndex) => { + const isActive = task.id === activeId; + const isDim = task.status === "archived"; + const isRunning = task.tabs.some((tab) => tab.status === "running"); + const hasUnread = task.tabs.some((tab) => tab.unread); + const isDraft = task.pullRequest == null || task.pullRequest.status === "draft"; + const totalAdded = task.fileChanges.reduce((sum, file) => sum + file.added, 0); + const totalRemoved = task.fileChanges.reduce((sum, file) => sum + file.removed, 0); + const hasDiffs = totalAdded > 0 || totalRemoved > 0; + const isTaskDropTarget = drag?.type === "task" && drag.projectId === project.id && drag.overIdx === taskIndex && drag.fromIdx !== taskIndex; + const isTaskBeingDragged = drag?.type === "task" && drag.projectId === project.id && drag.fromIdx === taskIndex && didDragRef.current; + + return (
setHoveredRepositoryId(repository.id)} - onMouseLeave={() => setHoveredRepositoryId((cur) => (cur === repository.id ? null : cur))} + key={task.id} + data-task-idx={taskIndex} + data-task-project-id={project.id} onMouseDown={(event) => { if (event.button !== 0) return; + // Only start task drag if not already in a project drag + if (dragRef.current) return; + event.stopPropagation(); startYRef.current = event.clientY; didDragRef.current = false; - setHoveredRepositoryId(null); - const state: DragState = { type: "repository", fromIdx: repositoryIndex, overIdx: null }; + const state: DragState = { type: "task", projectId: project.id, fromIdx: taskIndex, overIdx: null }; dragRef.current = state; setDrag(state); }} onClick={() => { if (!didDragRef.current) { - setCollapsedRepositories((current) => ({ - ...current, - [repository.id]: !current[repository.id], - })); + onSelect(task.id); } }} onContextMenu={(event) => contextMenu.open(event, [ - { label: "Reload repository", onClick: () => onReloadRepository(repository.id) }, - { label: "New task", onClick: () => onCreate(repository.id) }, + { label: "Rename task", onClick: () => onRenameTask(task.id) }, + { label: "Rename branch", onClick: () => onRenameBranch(task.id) }, + { label: "Mark as unread", onClick: () => onMarkUnread(task.id) }, ]) } - data-repository-header - className={css({ - display: "flex", - alignItems: "center", - justifyContent: "space-between", - padding: "10px 8px 4px", - gap: "8px", - cursor: "grab", - userSelect: "none", - })} - > -
-
- - {repositoryInitial(repository.label)} - - - {isCollapsed ? : } - -
- - {stripCommonOrgPrefix(repository.label, repositories)} - -
-
- {isCollapsed ? {formatRelativeAge(repository.updatedAtMs)} : null} - -
-
-
-
- ); - } - - if (item.type === "task") { - const { repository, task, taskIndex } = item; - const isActive = task.id === activeId; - const isRunning = task.sessions.some((s) => s.status === "running"); - const isProvisioning = - (String(task.status).startsWith("init_") && task.status !== "init_complete") || - task.sessions.some((s) => s.status === "pending_provision" || s.status === "pending_session_create"); - const hasUnread = task.sessions.some((s) => s.unread); - const isDraft = task.pullRequest?.isDraft ?? true; - const totalAdded = task.fileChanges.reduce((sum, file) => sum + file.added, 0); - const totalRemoved = task.fileChanges.reduce((sum, file) => sum + file.removed, 0); - const hasDiffs = totalAdded > 0 || totalRemoved > 0; - const isTaskDropTarget = - drag?.type === "task" && drag.repositoryId === repository.id && drag.overIdx === taskIndex && drag.fromIdx !== taskIndex; - const isTaskBeingDragged = drag?.type === "task" && drag.repositoryId === repository.id && drag.fromIdx === taskIndex && didDragRef.current; - - return ( -
{ - if (node) { - virtualizer.measureElement(node); - } - }} - style={{ - left: 0, - position: "absolute", - top: 0, - transform: `translateY(${virtualItem.start}px)`, - width: "100%", - opacity: isTaskBeingDragged ? 0.4 : 1, - transition: "opacity 150ms ease", - }} - onMouseDown={(event) => { - if (event.button !== 0) return; - if (dragRef.current) return; - event.stopPropagation(); - startYRef.current = event.clientY; - didDragRef.current = false; - const state: DragState = { type: "task", repositoryId: repository.id, fromIdx: taskIndex, overIdx: null }; - dragRef.current = state; - setDrag(state); - }} - > - {isTaskDropTarget ? ( -
- ) : null} -
-
onSelect(task.id)} - onContextMenu={(event) => { - const items = [ - { label: "Rename task", onClick: () => onRenameTask(task.id) }, - { label: "Mark as unread", onClick: () => onMarkUnread(task.id) }, - ]; - contextMenu.open(event, items); - }} className={css({ padding: "8px 12px", borderRadius: "8px", + position: "relative", backgroundColor: isActive ? t.interactiveHover : "transparent", + opacity: isTaskBeingDragged ? 0.4 : 1, cursor: "pointer", transition: "all 150ms ease", + "::before": { + content: '""', + position: "absolute", + top: "-2px", + left: 0, + right: 0, + height: "2px", + backgroundColor: isTaskDropTarget ? t.textPrimary : "transparent", + transition: "background-color 100ms ease", + }, ":hover": { backgroundColor: t.interactiveHover, }, @@ -728,46 +592,27 @@ export const Sidebar = memo(function Sidebar({ flexShrink: 0, })} > - +
-
- - {task.title} - -
- {task.primaryUserLogin ? ( - - {task.primaryUserLogin} - - ) : null} + + {task.title} + {task.pullRequest != null ? ( #{task.pullRequest.number} - {task.pullRequest.isDraft ? : null} + {task.pullRequest.status === "draft" ? : null} ) : ( @@ -783,32 +628,13 @@ export const Sidebar = memo(function Sidebar({
-
-
- ); - } - - if (item.type === "task-drop-zone") { - const { repository, taskCount } = item; - const isDropTarget = drag?.type === "task" && drag.repositoryId === repository.id && drag.overIdx === taskCount && drag.fromIdx !== taskCount; - return ( + ); + })} + {/* Bottom drop zone for dragging to end of task list */} + {!isCollapsed && (
{ - if (node) { - virtualizer.measureElement(node); - } - }} - style={{ - left: 0, - position: "absolute", - top: 0, - transform: `translateY(${virtualItem.start}px)`, - width: "100%", - }} + data-task-idx={orderedTasks.length} + data-task-project-id={project.id} className={css({ minHeight: "4px", position: "relative", @@ -819,54 +645,37 @@ export const Sidebar = memo(function Sidebar({ left: 0, right: 0, height: "2px", - backgroundColor: isDropTarget ? t.textPrimary : "transparent", + backgroundColor: + drag?.type === "task" && drag.projectId === project.id && drag.overIdx === orderedTasks.length && drag.fromIdx !== orderedTasks.length + ? t.textPrimary + : "transparent", transition: "background-color 100ms ease", }, })} /> - ); - } - - if (item.type === "repository-drop-zone") { - const isDropTarget = drag?.type === "repository" && drag.overIdx === item.repositoryCount && drag.fromIdx !== item.repositoryCount; - return ( -
{ - if (node) { - virtualizer.measureElement(node); - } - }} - style={{ - left: 0, - position: "absolute", - top: 0, - transform: `translateY(${virtualItem.start}px)`, - width: "100%", - }} - className={css({ - minHeight: "4px", - position: "relative", - "::before": { - content: '""', - position: "absolute", - top: 0, - left: 0, - right: 0, - height: "2px", - backgroundColor: isDropTarget ? t.textPrimary : "transparent", - transition: "background-color 100ms ease", - }, - })} - /> - ); - } - - return null; + )} +
+ ); + })} + {/* Bottom drop zone for dragging project to end of list */} +
+ />
@@ -901,19 +710,19 @@ function SidebarFooter() { const snapshot = useMockAppSnapshot(); const organization = activeMockOrganization(snapshot); const [open, setOpen] = useState(false); - const [organizationFlyoutOpen, setOrganizationFlyoutOpen] = useState(false); + const [workspaceFlyoutOpen, setWorkspaceFlyoutOpen] = useState(false); const containerRef = useRef(null); const flyoutTimerRef = useRef | null>(null); - const organizationTriggerRef = useRef(null); + const workspaceTriggerRef = useRef(null); const flyoutRef = useRef(null); const [flyoutPos, setFlyoutPos] = useState<{ top: number; left: number } | null>(null); useLayoutEffect(() => { - if (organizationFlyoutOpen && organizationTriggerRef.current) { - const rect = organizationTriggerRef.current.getBoundingClientRect(); + if (workspaceFlyoutOpen && workspaceTriggerRef.current) { + const rect = workspaceTriggerRef.current.getBoundingClientRect(); setFlyoutPos({ top: rect.top, left: rect.right + 4 }); } - }, [organizationFlyoutOpen]); + }, [workspaceFlyoutOpen]); useEffect(() => { if (!open) return; @@ -923,7 +732,7 @@ function SidebarFooter() { const inFlyout = flyoutRef.current?.contains(target); if (!inContainer && !inFlyout) { setOpen(false); - setOrganizationFlyoutOpen(false); + setWorkspaceFlyoutOpen(false); } } document.addEventListener("mousedown", handleClick); @@ -933,10 +742,10 @@ function SidebarFooter() { const switchToOrg = useCallback( (org: (typeof snapshot.organizations)[number]) => { setOpen(false); - setOrganizationFlyoutOpen(false); + setWorkspaceFlyoutOpen(false); void (async () => { await client.selectOrganization(org.id); - await navigate({ to: `/organizations/${org.organizationId}` as never }); + await navigate({ to: `/workspaces/${org.workspaceId}` as never }); })(); }, [client, navigate], @@ -944,11 +753,11 @@ function SidebarFooter() { const openFlyout = useCallback(() => { if (flyoutTimerRef.current) clearTimeout(flyoutTimerRef.current); - setOrganizationFlyoutOpen(true); + 
setWorkspaceFlyoutOpen(true); }, []); const closeFlyout = useCallback(() => { - flyoutTimerRef.current = setTimeout(() => setOrganizationFlyoutOpen(false), 150); + flyoutTimerRef.current = setTimeout(() => setWorkspaceFlyoutOpen(false), 150); }, []); const menuItems: Array<{ icon: React.ReactNode; label: string; danger?: boolean; onClick: () => void }> = []; @@ -1022,14 +831,14 @@ function SidebarFooter() { })} >
- {/* Organization flyout trigger */} + {/* Workspace flyout trigger */} {organization ? ( -
+
) : null} - {/* Organization flyout portal */} - {organizationFlyoutOpen && organization && flyoutPos + {/* Workspace flyout portal */} + {workspaceFlyoutOpen && organization && flyoutPos ? createPortal(
          setOpen((prev) => {
-           if (prev) setOrganizationFlyoutOpen(false);
+           if (prev) setWorkspaceFlyoutOpen(false);
            return !prev;
          });
        }}
diff --git a/foundry/packages/frontend/src/components/mock-layout/session-strip.tsx b/foundry/packages/frontend/src/components/mock-layout/tab-strip.tsx
similarity index 78%
rename from foundry/packages/frontend/src/components/mock-layout/session-strip.tsx
rename to foundry/packages/frontend/src/components/mock-layout/tab-strip.tsx
index 105fbef..e96989e 100644
--- a/foundry/packages/frontend/src/components/mock-layout/session-strip.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/tab-strip.tsx
@@ -4,40 +4,40 @@
 import { LabelXSmall } from "baseui/typography";
 import { FileCode, Plus, X } from "lucide-react";
 import { useFoundryTokens } from "../../app/theme";
-import { ContextMenuOverlay, SessionAvatar, useContextMenu } from "./ui";
+import { ContextMenuOverlay, TabAvatar, useContextMenu } from "./ui";
 import { diffTabId, fileName, type Task } from "./view-model";
 
-export const SessionStrip = memo(function SessionStrip({
+export const TabStrip = memo(function TabStrip({
   task,
-  activeSessionId,
+  activeTabId,
   openDiffs,
-  editingSessionId,
+  editingSessionTabId,
   editingSessionName,
   onEditingSessionNameChange,
-  onSwitchSession,
-  onStartRenamingSession,
+  onSwitchTab,
+  onStartRenamingTab,
   onCommitSessionRename,
   onCancelSessionRename,
-  onSetSessionUnread,
-  onCloseSession,
+  onSetTabUnread,
+  onCloseTab,
   onCloseDiffTab,
-  onAddSession,
+  onAddTab,
   sidebarCollapsed,
 }: {
   task: Task;
-  activeSessionId: string | null;
+  activeTabId: string | null;
   openDiffs: string[];
-  editingSessionId: string | null;
+  editingSessionTabId: string | null;
   editingSessionName: string;
   onEditingSessionNameChange: (value: string) => void;
-  onSwitchSession: (sessionId: string) => void;
-  onStartRenamingSession: (sessionId: string) => void;
+  onSwitchTab: (tabId: string) => void;
+  onStartRenamingTab: (tabId: string) => void;
   onCommitSessionRename: () => void;
   onCancelSessionRename: () => void;
-  onSetSessionUnread: (sessionId: string, unread: boolean) => void;
-  onCloseSession: (sessionId: string) => void;
+  onSetTabUnread: (tabId: string, unread: boolean) => void;
+  onCloseTab: (tabId: string) => void;
   onCloseDiffTab: (path: string) => void;
-  onAddSession: () => void;
+  onAddTab: () => void;
   sidebarCollapsed?: boolean;
 }) {
   const [css] = useStyletron();
@@ -48,8 +48,8 @@ export const SessionStrip = memo(function SessionStrip({
   return (
     <>
-      {task.sessions.map((tab) => {
-        const isActive = tab.id === activeSessionId;
+      {task.tabs.map((tab) => {
+        const isActive = tab.id === activeTabId;
         return (
-            onClick={() => onSwitchSession(tab.id)}
-            onDoubleClick={() => onStartRenamingSession(tab.id)}
+            onClick={() => onSwitchTab(tab.id)}
+            onDoubleClick={() => onStartRenamingTab(tab.id)}
             onMouseDown={(event) => {
-              if (event.button === 1 && task.sessions.length > 1) {
+              if (event.button === 1 && task.tabs.length > 1) {
                 event.preventDefault();
-                onCloseSession(tab.id);
+                onCloseTab(tab.id);
               }
             }}
             onContextMenu={(event) =>
               contextMenu.open(event, [
-                { label: "Rename session", onClick: () => onStartRenamingSession(tab.id) },
+                { label: "Rename session", onClick: () => onStartRenamingTab(tab.id) },
                 {
                   label: tab.unread ? "Mark as read" : "Mark as unread",
-                  onClick: () => onSetSessionUnread(tab.id, !tab.unread),
+                  onClick: () => onSetTabUnread(tab.id, !tab.unread),
                 },
-                ...(task.sessions.length > 1 ? [{ label: "Close session", onClick: () => onCloseSession(tab.id) }] : []),
+                ...(task.tabs.length > 1 ? [{ label: "Close tab", onClick: () => onCloseTab(tab.id) }] : []),
               ])
             }
-            data-session
+            data-tab
             className={css({
               display: "flex",
               alignItems: "center",
@@ -117,9 +117,9 @@ export const SessionStrip = memo(function SessionStrip({
               flexShrink: 0,
             })}
           >
-
+
- {editingSessionId === tab.id ? ( + {editingSessionTabId === tab.id ? ( )} - {task.sessions.length > 1 ? ( + {task.tabs.length > 1 ? ( { event.stopPropagation(); - onCloseSession(tab.id); + onCloseTab(tab.id); }} /> ) : null} @@ -171,19 +171,19 @@ export const SessionStrip = memo(function SessionStrip({ ); })} {openDiffs.map((path) => { - const sessionId = diffTabId(path); - const isActive = sessionId === activeSessionId; + const tabId = diffTabId(path); + const isActive = tabId === activeTabId; return (
onSwitchSession(sessionId)} + key={tabId} + onClick={() => onSwitchTab(tabId)} onMouseDown={(event) => { if (event.button === 1) { event.preventDefault(); onCloseDiffTab(path); } }} - data-session + data-tab className={css({ display: "flex", alignItems: "center", @@ -206,7 +206,7 @@ export const SessionStrip = memo(function SessionStrip({ { event.stopPropagation(); @@ -217,7 +217,7 @@ export const SessionStrip = memo(function SessionStrip({ ); })}
void; @@ -95,10 +95,10 @@ function HeaderIconButton({ ); } -export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onCollapse, onStartResize }: TerminalPaneProps) { +export function TerminalPane({ workspaceId, taskId, isExpanded, onExpand, onCollapse, onStartResize }: TerminalPaneProps) { const [css] = useStyletron(); const t = useFoundryTokens(); - const [activeSessionId, setActiveTabId] = useState(null); + const [activeTabId, setActiveTabId] = useState(null); const [processTabs, setProcessTabs] = useState([]); const [creatingProcess, setCreatingProcess] = useState(false); const [hoveredTabId, setHoveredTabId] = useState(null); @@ -184,17 +184,17 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC [listWidth], ); - const organizationState = useSubscription(subscriptionManager, "organization", { organizationId }); + const workspaceState = useInterest(interestManager, "workspace", { workspaceId }); const taskSummary = useMemo( - () => (taskId ? (organizationState.data?.taskSummaries.find((task) => task.id === taskId) ?? null) : null), - [taskId, organizationState.data?.taskSummaries], + () => (taskId ? (workspaceState.data?.taskSummaries.find((task) => task.id === taskId) ?? null) : null), + [taskId, workspaceState.data?.taskSummaries], ); - const taskState = useSubscription( - subscriptionManager, + const taskState = useInterest( + interestManager, "task", taskSummary ? { - organizationId, + workspaceId, repoId: taskSummary.repoId, taskId: taskSummary.id, } @@ -211,7 +211,7 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC }, [taskState.data]); const connectionQuery = useQuery({ - queryKey: ["mock-layout", "sandbox-agent-connection", organizationId, activeSandbox?.sandboxProviderId ?? "", activeSandbox?.sandboxId ?? ""], + queryKey: ["mock-layout", "sandbox-agent-connection", workspaceId, activeSandbox?.providerId ?? "", activeSandbox?.sandboxId ?? 
""], enabled: Boolean(activeSandbox?.sandboxId), staleTime: 30_000, refetchOnWindowFocus: false, @@ -220,17 +220,17 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC throw new Error("Cannot load a sandbox connection without an active sandbox."); } - return await backendClient.getSandboxAgentConnection(organizationId, activeSandbox.sandboxProviderId, activeSandbox.sandboxId); + return await backendClient.getSandboxAgentConnection(workspaceId, activeSandbox.providerId, activeSandbox.sandboxId); }, }); - const processesState = useSubscription( - subscriptionManager, + const processesState = useInterest( + interestManager, "sandboxProcesses", activeSandbox ? { - organizationId, - sandboxProviderId: activeSandbox.sandboxProviderId, + workspaceId, + providerId: activeSandbox.providerId, sandboxId: activeSandbox.sandboxId, } : null, @@ -305,8 +305,7 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC setProcessTabs([]); }, [taskId]); - const processesData = processesState.data; - const processes = processesData ?? []; + const processes = processesState.data ?? []; const openTerminalTab = useCallback((process: SandboxProcessRecord) => { setProcessTabs((current) => { @@ -326,11 +325,11 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC }); }, []); - const closeTerminalTab = useCallback((sessionId: string) => { + const closeTerminalTab = useCallback((tabId: string) => { setProcessTabs((current) => { - const next = current.filter((tab) => tab.id !== sessionId); + const next = current.filter((tab) => tab.id !== tabId); setActiveTabId((currentActive) => { - if (currentActive === sessionId) { + if (currentActive === tabId) { return next.length > 0 ? 
next[next.length - 1]!.id : null; } return currentActive; @@ -347,8 +346,8 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC setCreatingProcess(true); try { const created = await backendClient.createSandboxProcess({ - organizationId, - sandboxProviderId: activeSandbox.sandboxProviderId, + workspaceId, + providerId: activeSandbox.providerId, sandboxId: activeSandbox.sandboxId, request: defaultShellRequest(activeSandbox.cwd), }); @@ -356,13 +355,13 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC } finally { setCreatingProcess(false); } - }, [activeSandbox, openTerminalTab, organizationId]); + }, [activeSandbox, openTerminalTab, workspaceId]); const processTabsById = useMemo(() => new Map(processTabs.map((tab) => [tab.id, tab])), [processTabs]); - const activeProcessTab = activeSessionId ? (processTabsById.get(activeSessionId) ?? null) : null; + const activeProcessTab = activeTabId ? (processTabsById.get(activeTabId) ?? null) : null; const activeTerminalProcess = useMemo( () => (activeProcessTab ? (processes.find((process) => process.id === activeProcessTab.processId) ?? 
null) : null), - [activeProcessTab, processesData], + [activeProcessTab, processes], ); const emptyBodyClassName = css({ @@ -544,10 +543,7 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC gap: "6px", minHeight: "39px", maxHeight: "39px", - paddingTop: "0", - paddingRight: "14px", - paddingBottom: "0", - paddingLeft: "14px", + padding: "0 14px", borderTop: `1px solid ${t.borderDefault}`, backgroundColor: t.surfacePrimary, flexShrink: 0, @@ -572,9 +568,9 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC css={css} t={t} label="Kill terminal" - disabled={!activeSessionId} + disabled={!activeTabId} onClick={() => { - if (activeSessionId) closeTerminalTab(activeSessionId); + if (activeTabId) closeTerminalTab(activeTabId); }} > @@ -623,7 +619,7 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC })} > {processTabs.map((tab, tabIndex) => { - const isActive = activeSessionId === tab.id; + const isActive = activeTabId === tab.id; const isHovered = hoveredTabId === tab.id; const isDropTarget = tabDrag !== null && tabDrag.overIdx === tabIndex && tabDrag.fromIdx !== tabIndex; const isBeingDragged = tabDrag !== null && tabDrag.fromIdx === tabIndex && didTabDrag.current; diff --git a/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx b/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx index 16f87e6..00e8b0c 100644 --- a/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx +++ b/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx @@ -1,24 +1,22 @@ -import { memo, useMemo } from "react"; +import { memo } from "react"; import { useStyletron } from "baseui"; import { LabelSmall } from "baseui/typography"; import { Clock, PanelLeft, PanelRight } from "lucide-react"; import { useFoundryTokens } from "../../app/theme"; -import { deriveHeaderStatus } from "../../features/tasks/status"; -import { 
HeaderStatusPill, PanelHeaderBar } from "./ui"; -import { type AgentSession, type Task } from "./view-model"; +import { PanelHeaderBar } from "./ui"; +import { type AgentTab, type Task } from "./view-model"; export const TranscriptHeader = memo(function TranscriptHeader({ task, - hasSandbox, - activeSession, + activeTab, editingField, editValue, onEditValueChange, onStartEditingField, onCommitEditingField, onCancelEditingField, - onSetActiveSessionUnread, + onSetActiveTabUnread, sidebarCollapsed, onToggleSidebar, onSidebarPeekStart, @@ -28,15 +26,14 @@ export const TranscriptHeader = memo(function TranscriptHeader({ onNavigateToUsage, }: { task: Task; - hasSandbox: boolean; - activeSession: AgentSession | null | undefined; - editingField: "title" | null; + activeTab: AgentTab | null | undefined; + editingField: "title" | "branch" | null; editValue: string; onEditValueChange: (value: string) => void; - onStartEditingField: (field: "title", value: string) => void; - onCommitEditingField: (field: "title") => void; + onStartEditingField: (field: "title" | "branch", value: string) => void; + onCommitEditingField: (field: "title" | "branch") => void; onCancelEditingField: () => void; - onSetActiveSessionUnread: (unread: boolean) => void; + onSetActiveTabUnread: (unread: boolean) => void; sidebarCollapsed?: boolean; onToggleSidebar?: () => void; onSidebarPeekStart?: () => void; @@ -49,10 +46,6 @@ export const TranscriptHeader = memo(function TranscriptHeader({ const t = useFoundryTokens(); const isDesktop = !!import.meta.env.VITE_DESKTOP; const needsTrafficLightInset = isDesktop && sidebarCollapsed; - const headerStatus = useMemo( - () => deriveHeaderStatus(task.status, activeSession?.status ?? null, activeSession?.errorMessage ?? null, hasSandbox), - [task.status, activeSession?.status, activeSession?.errorMessage, hasSandbox], - ); return ( @@ -117,22 +110,57 @@ export const TranscriptHeader = memo(function TranscriptHeader({ )} {task.branch ? 
( - - {task.branch} - + editingField === "branch" ? ( + onEditValueChange(event.target.value)} + onBlur={() => onCommitEditingField("branch")} + onKeyDown={(event) => { + if (event.key === "Enter") { + onCommitEditingField("branch"); + } else if (event.key === "Escape") { + onCancelEditingField(); + } + }} + className={css({ + appearance: "none", + WebkitAppearance: "none", + background: "none", + margin: "0", + outline: "none", + padding: "2px 8px", + borderRadius: "999px", + border: `1px solid ${t.borderFocus}`, + backgroundColor: t.interactiveSubtle, + color: t.textPrimary, + fontSize: "11px", + whiteSpace: "nowrap", + fontFamily: '"IBM Plex Mono", monospace', + minWidth: "60px", + })} + /> + ) : ( + onStartEditingField("branch", task.branch ?? "")} + className={css({ + padding: "2px 8px", + borderRadius: "999px", + border: `1px solid ${t.borderMedium}`, + backgroundColor: t.interactiveSubtle, + color: t.textPrimary, + fontSize: "11px", + whiteSpace: "nowrap", + fontFamily: '"IBM Plex Mono", monospace', + cursor: "pointer", + ":hover": { borderColor: t.borderFocus }, + })} + > + {task.branch} + + ) ) : null} -
; - if (isProvisioning) return ; if (hasUnread) return ; if (isDraft) return ; return ; @@ -181,82 +170,13 @@ export const AgentIcon = memo(function AgentIcon({ agent, size = 14 }: { agent: return ; case "Cursor": return ; - default: - return ; } }); -export type HeaderStatusVariant = "error" | "warning" | "success" | "neutral"; - -export interface HeaderStatusInfo { - variant: HeaderStatusVariant; - label: string; - spinning: boolean; - tooltip?: string; -} - -export const HeaderStatusPill = memo(function HeaderStatusPill({ status }: { status: HeaderStatusInfo }) { - const [css] = useStyletron(); - const t = useFoundryTokens(); - - const colorMap: Record = { - error: { bg: `${t.statusError}18`, text: t.statusError, dot: t.statusError }, - warning: { bg: `${t.statusWarning}18`, text: t.statusWarning, dot: t.statusWarning }, - success: { bg: `${t.statusSuccess}18`, text: t.statusSuccess, dot: t.statusSuccess }, - neutral: { bg: t.interactiveSubtle, text: t.textTertiary, dot: t.textTertiary }, - }; - const colors = colorMap[status.variant]; - - return ( -
- {status.spinning ? ( -
- ) : ( -
- )} - {status.label} -
- ); -}); - -export const SessionAvatar = memo(function SessionAvatar({ session }: { session: AgentSession }) { - if (session.status === "running" || session.status === "pending_provision" || session.status === "pending_session_create") return ; - if (session.unread) return ; - return ; +export const TabAvatar = memo(function TabAvatar({ tab }: { tab: AgentTab }) { + if (tab.status === "running") return ; + if (tab.unread) return ; + return ; }); export const Shell = styled("div", ({ $theme }) => { @@ -301,10 +221,7 @@ export const PanelHeaderBar = styled("div", ({ $theme }) => { alignItems: "center", minHeight: HEADER_HEIGHT, maxHeight: HEADER_HEIGHT, - paddingTop: "0", - paddingRight: "14px", - paddingBottom: "0", - paddingLeft: "14px", + padding: "0 14px", borderBottom: `1px solid ${t.borderDefault}`, backgroundColor: t.surfaceTertiary, gap: "8px", diff --git a/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts b/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts index bc6ab87..f3362dc 100644 --- a/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts +++ b/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts @@ -1,14 +1,14 @@ import { describe, expect, it } from "vitest"; -import type { WorkspaceSession } from "@sandbox-agent/foundry-shared"; +import type { WorkbenchAgentTab } from "@sandbox-agent/foundry-shared"; import { buildDisplayMessages } from "./view-model"; -function makeSession(transcript: WorkspaceSession["transcript"]): WorkspaceSession { +function makeTab(transcript: WorkbenchAgentTab["transcript"]): WorkbenchAgentTab { return { - id: "session-1", + id: "tab-1", sessionId: "session-1", sessionName: "Session 1", agent: "Codex", - model: "gpt-5.3-codex", + model: "gpt-4o", status: "idle", thinkingSinceMs: null, unread: false, @@ -25,7 +25,7 @@ function makeSession(transcript: WorkspaceSession["transcript"]): WorkspaceSessi describe("buildDisplayMessages", () => { 
it("collapses chunked agent output into a single display message", () => { const messages = buildDisplayMessages( - makeSession([ + makeTab([ { id: "evt-setup", eventIndex: 0, @@ -139,7 +139,7 @@ describe("buildDisplayMessages", () => { it("hides non-message session update envelopes", () => { const messages = buildDisplayMessages( - makeSession([ + makeTab([ { id: "evt-client", eventIndex: 1, diff --git a/foundry/packages/frontend/src/components/mock-layout/view-model.ts b/foundry/packages/frontend/src/components/mock-layout/view-model.ts index 9232293..d22ea5c 100644 --- a/foundry/packages/frontend/src/components/mock-layout/view-model.ts +++ b/foundry/packages/frontend/src/components/mock-layout/view-model.ts @@ -1,28 +1,38 @@ -import { - DEFAULT_WORKSPACE_MODEL_GROUPS as SharedModelGroups, - workspaceModelLabel as sharedWorkspaceModelLabel, - workspaceProviderAgent as sharedWorkspaceProviderAgent, -} from "@sandbox-agent/foundry-shared"; import type { - WorkspaceAgentKind as AgentKind, - WorkspaceSession as AgentSession, - WorkspaceDiffLineKind as DiffLineKind, - WorkspaceFileChange as FileChange, - WorkspaceFileTreeNode as FileTreeNode, - WorkspaceTask as Task, - WorkspaceHistoryEvent as HistoryEvent, - WorkspaceLineAttachment as LineAttachment, - WorkspaceModelGroup as ModelGroup, - WorkspaceModelId as ModelId, - WorkspaceParsedDiffLine as ParsedDiffLine, - WorkspaceRepositorySection as RepositorySection, - WorkspaceTranscriptEvent as TranscriptEvent, + WorkbenchAgentKind as AgentKind, + WorkbenchAgentTab as AgentTab, + WorkbenchDiffLineKind as DiffLineKind, + WorkbenchFileChange as FileChange, + WorkbenchFileTreeNode as FileTreeNode, + WorkbenchTask as Task, + WorkbenchHistoryEvent as HistoryEvent, + WorkbenchLineAttachment as LineAttachment, + WorkbenchModelGroup as ModelGroup, + WorkbenchModelId as ModelId, + WorkbenchParsedDiffLine as ParsedDiffLine, + WorkbenchProjectSection as ProjectSection, + WorkbenchTranscriptEvent as TranscriptEvent, } from 
"@sandbox-agent/foundry-shared"; import { extractEventText } from "../../features/sessions/model"; -export type { RepositorySection }; +export type { ProjectSection }; -export const MODEL_GROUPS: ModelGroup[] = SharedModelGroups; +export const MODEL_GROUPS: ModelGroup[] = [ + { + provider: "Claude", + models: [ + { id: "claude-sonnet-4", label: "Sonnet 4" }, + { id: "claude-opus-4", label: "Opus 4" }, + ], + }, + { + provider: "OpenAI", + models: [ + { id: "gpt-4o", label: "GPT-4o" }, + { id: "o3", label: "o3" }, + ], + }, +]; export function formatRelativeAge(updatedAtMs: number, nowMs = Date.now()): string { const deltaSeconds = Math.max(0, Math.floor((nowMs - updatedAtMs) / 1000)); @@ -80,11 +90,15 @@ export function formatMessageDuration(durationMs: number): string { } export function modelLabel(id: ModelId): string { - return sharedWorkspaceModelLabel(id, MODEL_GROUPS); + const group = MODEL_GROUPS.find((candidate) => candidate.models.some((model) => model.id === id)); + const model = group?.models.find((candidate) => candidate.id === id); + return model && group ? 
`${group.provider} ${model.label}` : id; } export function providerAgent(provider: string): AgentKind { - return sharedWorkspaceProviderAgent(provider); + if (provider === "Claude") return "Claude"; + if (provider === "OpenAI") return "Codex"; + return "Cursor"; } const DIFF_PREFIX = "diff:"; @@ -120,17 +134,17 @@ function historyDetail(event: TranscriptEvent): string { return content || "Untitled event"; } -export function buildHistoryEvents(sessions: AgentSession[]): HistoryEvent[] { - return sessions - .flatMap((session) => - session.transcript +export function buildHistoryEvents(tabs: AgentTab[]): HistoryEvent[] { + return tabs + .flatMap((tab) => + tab.transcript .filter((event) => event.sender === "client") .map((event) => ({ - id: `history-${session.id}-${event.id}`, + id: `history-${tab.id}-${event.id}`, messageId: event.id, preview: historyPreview(event), - sessionName: session.sessionName, - sessionId: session.id, + sessionName: tab.sessionName, + tabId: tab.id, createdAtMs: event.createdAt, detail: historyDetail(event), })), @@ -237,8 +251,8 @@ function shouldDisplayEvent(event: TranscriptEvent): boolean { return Boolean(extractEventText(payload).trim()); } -export function buildDisplayMessages(session: AgentSession | null | undefined): Message[] { - if (!session) { +export function buildDisplayMessages(tab: AgentTab | null | undefined): Message[] { + if (!tab) { return []; } @@ -252,7 +266,7 @@ export function buildDisplayMessages(session: AgentSession | null | undefined): pendingAgentMessage = null; }; - for (const event of session.transcript) { + for (const event of tab.transcript) { const chunkText = isAgentChunkEvent(event); if (chunkText !== null) { if (!pendingAgentMessage) { @@ -311,7 +325,7 @@ export function parseDiffLines(diff: string): ParsedDiffLine[] { export type { AgentKind, - AgentSession, + AgentTab, DiffLineKind, FileChange, FileTreeNode, diff --git a/foundry/packages/frontend/src/components/mock-onboarding.tsx 
b/foundry/packages/frontend/src/components/mock-onboarding.tsx index 4528695..66bcfcc 100644 --- a/foundry/packages/frontend/src/components/mock-onboarding.tsx +++ b/foundry/packages/frontend/src/components/mock-onboarding.tsx @@ -103,8 +103,8 @@ function formatDate(value: string | null): string { return dateFormatter.format(new Date(value)); } -function organizationPath(organization: FoundryOrganization): string { - return `/organizations/${organization.organizationId}`; +function workspacePath(organization: FoundryOrganization): string { + return `/workspaces/${organization.workspaceId}`; } function settingsPath(organization: FoundryOrganization): string { @@ -121,7 +121,7 @@ function checkoutPath(organization: FoundryOrganization, planId: FoundryBillingP function statusBadge(t: FoundryTokens, organization: FoundryOrganization) { if (organization.kind === "personal") { - return Personal organization; + return Personal workspace; } return GitHub organization; } @@ -347,11 +347,11 @@ export function MockOrganizationSelectorPage() { /> -

-          Select a organization
+          Select a workspace
           Choose where you want to work.
 
-          {/* Organization list */}
+          {/* Workspace list */}
{ void (async () => { await client.selectOrganization(organization.id); - await navigate({ to: organizationPath(organization) }); + await navigate({ to: workspacePath(organization) }); })(); }} style={{ @@ -580,13 +580,13 @@ function SettingsLayout({ overflowY: "auto", }} > - {/* Back to organization */} + {/* Back to workspace */} {/* User header */} @@ -775,7 +775,7 @@ export function MockOrganizationSettingsPage({ organization }: { organization: F
{[ "Hand off tasks to teammates for review or continuation", - "Shared organization with unified billing across your org", + "Shared workspace with unified billing across your org", "200 task hours per seat, with bulk hour purchases available", "Collaborative task history and audit trail", ].map((feature) => ( @@ -1132,7 +1132,7 @@ export function MockAccountSettingsPage() { }} > - Back to organization + Back to workspace
diff --git a/foundry/packages/frontend/src/components/organization-dashboard.tsx b/foundry/packages/frontend/src/components/workspace-dashboard.tsx similarity index 72% rename from foundry/packages/frontend/src/components/organization-dashboard.tsx rename to foundry/packages/frontend/src/components/workspace-dashboard.tsx index 4f54ac3..fca4279 100644 --- a/foundry/packages/frontend/src/components/organization-dashboard.tsx +++ b/foundry/packages/frontend/src/components/workspace-dashboard.tsx @@ -1,6 +1,6 @@ import { useEffect, useMemo, useState, type ReactNode } from "react"; -import type { RepoBranchRecord, RepoOverview, TaskWorkspaceSnapshot, WorkspaceTaskStatus } from "@sandbox-agent/foundry-shared"; -import { currentFoundryOrganization, useSubscription } from "@sandbox-agent/foundry-client"; +import type { AgentType, RepoBranchRecord, RepoOverview, RepoStackAction, WorkbenchTaskStatus } from "@sandbox-agent/foundry-shared"; +import { useInterest } from "@sandbox-agent/foundry-client"; import { useMutation, useQuery } from "@tanstack/react-query"; import { Link, useNavigate } from "@tanstack/react-router"; import { Button } from "baseui/button"; @@ -13,16 +13,14 @@ import { Textarea } from "baseui/textarea"; import { StyledDivider } from "baseui/divider"; import { styled, useStyletron } from "baseui"; import { HeadingSmall, HeadingXSmall, LabelSmall, LabelXSmall, MonoLabelSmall, ParagraphSmall } from "baseui/typography"; -import { Bot, CircleAlert, FolderGit2, GitBranch, MessageSquareText, SendHorizontal } from "lucide-react"; -import { deriveHeaderStatus, describeTaskState } from "../features/tasks/status"; -import { HeaderStatusPill } from "./mock-layout/ui"; +import { Bot, CircleAlert, FolderGit2, GitBranch, MessageSquareText, SendHorizontal, Shuffle } from "lucide-react"; +import { formatDiffStat } from "../features/tasks/model"; import { buildTranscript, resolveSessionSelection } from "../features/sessions/model"; import { backendClient } from 
"../lib/backend"; -import { subscriptionManager } from "../lib/subscription"; -import { DevPanel, useDevPanel } from "./dev-panel"; +import { interestManager } from "../lib/interest"; -interface OrganizationDashboardProps { - organizationId: string; +interface WorkspaceDashboardProps { + workspaceId: string; selectedTaskId?: string; selectedRepoId?: string; } @@ -94,13 +92,24 @@ const FILTER_OPTIONS: SelectItem[] = [ { id: "all", label: "All Branches" }, ]; -function statusKind(status: WorkspaceTaskStatus): StatusTagKind { +const AGENT_OPTIONS: SelectItem[] = [ + { id: "codex", label: "codex" }, + { id: "claude", label: "claude" }, +]; + +function statusKind(status: WorkbenchTaskStatus): StatusTagKind { if (status === "running") return "positive"; - if (status === "error") return "negative"; - if (String(status).startsWith("init_")) return "warning"; + if (status === "new") return "warning"; return "neutral"; } +function normalizeAgent(agent: string | null): AgentType | undefined { + if (agent === "claude" || agent === "codex") { + return agent; + } + return undefined; +} + function formatTime(value: number): string { return new Date(value).toLocaleTimeString([], { hour: "2-digit", minute: "2-digit" }); } @@ -129,6 +138,8 @@ function repoSummary(overview: RepoOverview | undefined): { total: number; mapped: number; unmapped: number; + conflicts: number; + needsRestack: number; openPrs: number; } { if (!overview) { @@ -136,18 +147,28 @@ function repoSummary(overview: RepoOverview | undefined): { total: 0, mapped: 0, unmapped: 0, + conflicts: 0, + needsRestack: 0, openPrs: 0, }; } let mapped = 0; + let conflicts = 0; + let needsRestack = 0; let openPrs = 0; for (const row of overview.branches) { if (row.taskId) { mapped += 1; } - if (row.pullRequest && row.pullRequest.state !== "MERGED" && row.pullRequest.state !== "CLOSED") { + if (row.conflictsWithMain) { + conflicts += 1; + } + if (row.trackedInStack && row.parentBranch && row.hasUnpushed) { + needsRestack += 1; + 
} + if (row.prNumber && row.prState !== "MERGED" && row.prState !== "CLOSED") { openPrs += 1; } } @@ -156,30 +177,25 @@ function repoSummary(overview: RepoOverview | undefined): { total: overview.branches.length, mapped, unmapped: Math.max(0, overview.branches.length - mapped), + conflicts, + needsRestack, openPrs, }; } function branchKind(row: RepoBranchRecord): StatusTagKind { - if (row.pullRequest?.isDraft || row.pullRequest?.state === "OPEN") { + if (row.conflictsWithMain) { + return "negative"; + } + if (row.prState === "OPEN" || row.prState === "DRAFT") { return "warning"; } - if (row.pullRequest?.state === "MERGED") { + if (row.prState === "MERGED") { return "positive"; } return "neutral"; } -function branchPullRequestLabel(branch: RepoBranchRecord): string { - if (!branch.pullRequest) { - return "no pr"; - } - if (branch.pullRequest.isDraft) { - return "draft"; - } - return branch.pullRequest.state.toLowerCase(); -} - function matchesOverviewFilter(branch: RepoBranchRecord, filter: RepoOverviewFilter): boolean { if (filter === "archived") { return branch.taskStatus === "archived"; @@ -313,10 +329,9 @@ function MetaRow({ label, value, mono = false }: { label: string; value: string; ); } -export function OrganizationDashboard({ organizationId, selectedTaskId, selectedRepoId }: OrganizationDashboardProps) { +export function WorkspaceDashboard({ workspaceId, selectedTaskId, selectedRepoId }: WorkspaceDashboardProps) { const [css, theme] = useStyletron(); const navigate = useNavigate(); - const showDevPanel = useDevPanel(); const repoOverviewMode = typeof selectedRepoId === "string" && selectedRepoId.length > 0; const [draft, setDraft] = useState(""); @@ -326,26 +341,36 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected const [newTitle, setNewTitle] = useState(""); const [newBranchName, setNewBranchName] = useState(""); const [createOnBranch, setCreateOnBranch] = useState(null); + const [addRepoOpen, setAddRepoOpen] = 
useState(false); const [createTaskOpen, setCreateTaskOpen] = useState(false); + const [addRepoRemote, setAddRepoRemote] = useState(""); + const [addRepoError, setAddRepoError] = useState(null); + const [stackActionError, setStackActionError] = useState(null); + const [stackActionMessage, setStackActionMessage] = useState(null); const [selectedOverviewBranch, setSelectedOverviewBranch] = useState(null); const [overviewFilter, setOverviewFilter] = useState("active"); + const [reparentBranchName, setReparentBranchName] = useState(null); + const [reparentParentBranch, setReparentParentBranch] = useState(""); + const [newAgentType, setNewAgentType] = useState(() => { + try { + const raw = globalThis.localStorage?.getItem("hf.settings.agentType"); + return raw === "claude" || raw === "codex" ? raw : "codex"; + } catch { + return "codex"; + } + }); const [createError, setCreateError] = useState(null); - const appState = useSubscription(subscriptionManager, "app", {}); - const activeOrg = appState.data ? currentFoundryOrganization(appState.data) : null; - - const organizationState = useSubscription(subscriptionManager, "organization", { organizationId }); - const reposData = organizationState.data?.repos; - const rowsData = organizationState.data?.taskSummaries; - const repos = reposData ?? []; - const rows = rowsData ?? []; - const selectedSummary = useMemo(() => rows.find((row) => row.id === selectedTaskId) ?? rows[0] ?? null, [rowsData, selectedTaskId]); - const taskState = useSubscription( - subscriptionManager, + const workspaceState = useInterest(interestManager, "workspace", { workspaceId }); + const repos = workspaceState.data?.repos ?? []; + const rows = workspaceState.data?.taskSummaries ?? []; + const selectedSummary = useMemo(() => rows.find((row) => row.id === selectedTaskId) ?? rows[0] ?? null, [rows, selectedTaskId]); + const taskState = useInterest( + interestManager, "task", !repoOverviewMode && selectedSummary ? 
{ - organizationId, + workspaceId, repoId: selectedSummary.repoId, taskId: selectedSummary.id, } @@ -354,18 +379,17 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected const activeRepoId = selectedRepoId ?? createRepoId; const repoOverviewQuery = useQuery({ - queryKey: ["organization", organizationId, "repo-overview", activeRepoId], + queryKey: ["workspace", workspaceId, "repo-overview", activeRepoId], enabled: Boolean(repoOverviewMode && activeRepoId), queryFn: async () => { if (!activeRepoId) { throw new Error("No repo selected"); } - return backendClient.getRepoOverview(organizationId, activeRepoId); + return backendClient.getRepoOverview(workspaceId, activeRepoId); }, }); useEffect(() => { - const repos = reposData ?? []; if (repoOverviewMode && selectedRepoId) { setCreateRepoId(selectedRepoId); return; @@ -373,11 +397,17 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected if (!createRepoId && repos.length > 0) { setCreateRepoId(repos[0]!.id); } - }, [createRepoId, repoOverviewMode, reposData, selectedRepoId]); + }, [createRepoId, repoOverviewMode, repos, selectedRepoId]); + + useEffect(() => { + try { + globalThis.localStorage?.setItem("hf.settings.agentType", newAgentType); + } catch { + // ignore storage failures + } + }, [newAgentType]); const repoGroups = useMemo(() => { - const repos = reposData ?? []; - const rows = rowsData ?? []; const byRepo = new Map(); for (const row of rows) { const bucket = byRepo.get(row.repoId); @@ -405,7 +435,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected } return a.repoLabel.localeCompare(b.repoLabel); }); - }, [reposData, rowsData]); + }, [repos, rows]); const selectedForSession = repoOverviewMode ? null : (taskState.data ?? null); @@ -418,31 +448,25 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected }, [selectedForSession]); useEffect(() => { - const rows = rowsData ?? 
[]; if (!repoOverviewMode && !selectedTaskId && rows.length > 0) { void navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId: rows[0]!.id, }, search: { sessionId: undefined }, replace: true, }); } - }, [navigate, repoOverviewMode, rowsData, selectedTaskId, organizationId]); + }, [navigate, repoOverviewMode, rows, selectedTaskId, workspaceId]); useEffect(() => { setActiveSessionId(null); setDraft(""); }, [selectedForSession?.id]); - const sessionRowsData = selectedForSession?.sessionsSummary; - const sessionRows = sessionRowsData ?? []; - const taskStatus = selectedForSession?.status ?? null; - const taskStatusState = describeTaskState(taskStatus); - const taskStateSummary = `${taskStatusState.title}. ${taskStatusState.detail}`; - const shouldUseTaskStateEmptyState = Boolean(selectedForSession && taskStatus && taskStatus !== "running" && taskStatus !== "idle"); + const sessionRows = selectedForSession?.sessionsSummary ?? []; const sessionSelection = useMemo( () => resolveSessionSelection({ @@ -457,94 +481,35 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected status: session.status, })), }), - [activeSessionId, selectedForSession?.activeSessionId, sessionRowsData], + [activeSessionId, selectedForSession?.activeSessionId, sessionRows], ); const resolvedSessionId = sessionSelection.sessionId; const staleSessionId = sessionSelection.staleSessionId; - const sessionState = useSubscription( - subscriptionManager, + const sessionState = useInterest( + interestManager, "session", selectedForSession && resolvedSessionId ? { - organizationId, + workspaceId, repoId: selectedForSession.repoId, taskId: selectedForSession.id, sessionId: resolvedSessionId, } : null, ); - const selectedSessionSummary = useMemo(() => sessionRows.find((session) => session.id === resolvedSessionId) ?? 
null, [resolvedSessionId, sessionRowsData]); - const isPendingProvision = selectedSessionSummary?.status === "pending_provision"; - const isPendingSessionCreate = selectedSessionSummary?.status === "pending_session_create"; - const isSessionError = selectedSessionSummary?.status === "error"; const canStartSession = Boolean(selectedForSession && activeSandbox?.sandboxId); - const devPanelFocusedTask = useMemo(() => { - if (repoOverviewMode) { - return null; - } - - const task = selectedForSession ?? selectedSummary; - if (!task) { - return null; - } - - return { - id: task.id, - repoId: task.repoId, - title: task.title, - status: task.status, - branch: task.branch ?? null, - activeSandboxId: selectedForSession?.activeSandboxId ?? null, - activeSessionId: selectedForSession?.activeSessionId ?? null, - sandboxes: selectedForSession?.sandboxes ?? [], - sessions: selectedForSession?.sessionsSummary ?? [], - }; - }, [repoOverviewMode, selectedForSession, selectedSummary]); - const devPanelSnapshot = useMemo( - (): TaskWorkspaceSnapshot => ({ - organizationId, - repos: repos.map((repo) => ({ id: repo.id, label: repo.label })), - repositories: [], - tasks: rows.map((task) => ({ - id: task.id, - repoId: task.repoId, - title: task.title, - status: task.status, - repoName: task.repoName, - updatedAtMs: task.updatedAtMs, - branch: task.branch ?? null, - pullRequest: task.pullRequest, - sessions: task.sessionsSummary.map((session) => ({ - ...session, - draft: { - text: "", - attachments: [], - updatedAtMs: null, - }, - transcript: [], - })), - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 0, - activeSandboxId: selectedForSession?.id === task.id ? 
selectedForSession.activeSandboxId : null, - })), - }), - [reposData, rowsData, selectedForSession, organizationId], - ); const startSessionFromTask = async (): Promise<{ id: string; status: "running" | "idle" | "error" }> => { if (!selectedForSession || !activeSandbox?.sandboxId) { throw new Error("No sandbox is available for this task"); } - const preferredAgent = selectedSessionSummary?.agent === "Claude" ? "claude" : selectedSessionSummary?.agent === "Codex" ? "codex" : undefined; return backendClient.createSandboxSession({ - organizationId, - sandboxProviderId: activeSandbox.sandboxProviderId, + workspaceId, + providerId: activeSandbox.providerId, sandboxId: activeSandbox.sandboxId, prompt: selectedForSession.task, cwd: activeSandbox.cwd ?? undefined, - agent: preferredAgent, + agent: normalizeAgent(selectedForSession.agentType), }); }; @@ -571,8 +536,8 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected } const sessionId = await ensureSessionForPrompt(); await backendClient.sendSandboxPrompt({ - organizationId, - sandboxProviderId: activeSandbox.sandboxProviderId, + workspaceId, + providerId: activeSandbox.providerId, sandboxId: activeSandbox.sandboxId, sessionId, prompt, @@ -598,9 +563,10 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected const draftBranchName = newBranchName.trim(); return backendClient.createTask({ - organizationId, + workspaceId, repoId, task, + agentType: newAgentType, explicitTitle: draftTitle || undefined, explicitBranchName: createOnBranch ? undefined : draftBranchName || undefined, onBranch: createOnBranch ?? 
undefined, @@ -614,9 +580,9 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected setCreateOnBranch(null); setCreateTaskOpen(false); await navigate({ - to: "/organizations/$organizationId/tasks/$taskId", + to: "/workspaces/$workspaceId/tasks/$taskId", params: { - organizationId, + workspaceId, taskId: task.taskId, }, search: { sessionId: undefined }, @@ -627,6 +593,63 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected }, }); + const addRepo = useMutation({ + mutationFn: async (remoteUrl: string) => { + const trimmed = remoteUrl.trim(); + if (!trimmed) { + throw new Error("Remote URL is required"); + } + return backendClient.addRepo(workspaceId, trimmed); + }, + onSuccess: async (created) => { + setAddRepoError(null); + setAddRepoRemote(""); + setAddRepoOpen(false); + setCreateRepoId(created.repoId); + if (repoOverviewMode) { + await navigate({ + to: "/workspaces/$workspaceId/repos/$repoId", + params: { + workspaceId, + repoId: created.repoId, + }, + }); + } + }, + onError: (error) => { + setAddRepoError(error instanceof Error ? error.message : String(error)); + }, + }); + + const runStackAction = useMutation({ + mutationFn: async (input: { action: RepoStackAction; branchName?: string; parentBranch?: string }) => { + if (!activeRepoId) { + throw new Error("No repository selected"); + } + return backendClient.runRepoStackAction({ + workspaceId, + repoId: activeRepoId, + action: input.action, + branchName: input.branchName, + parentBranch: input.parentBranch, + }); + }, + onSuccess: async (result) => { + if (result.executed) { + setStackActionError(null); + setStackActionMessage(result.message); + } else { + setStackActionMessage(null); + setStackActionError(result.message); + } + await repoOverviewQuery.refetch(); + }, + onError: (error) => { + setStackActionMessage(null); + setStackActionError(error instanceof Error ? 
error.message : String(error)); + }, + }); + const openCreateFromBranch = (repoId: string, branchName: string): void => { setCreateRepoId(repoId); setCreateOnBranch(branchName); @@ -638,20 +661,22 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected setCreateTaskOpen(true); }; - const repoOptions = useMemo(() => repos.map((repo) => createOption({ id: repo.id, label: repo.label })), [reposData]); + const repoOptions = useMemo(() => repos.map((repo) => createOption({ id: repo.id, label: repo.label })), [repos]); const selectedRepoOption = repoOptions.find((option) => option.id === createRepoId) ?? null; + const selectedAgentOption = useMemo(() => createOption(AGENT_OPTIONS.find((option) => option.id === newAgentType) ?? AGENT_OPTIONS[0]!), [newAgentType]); const selectedFilterOption = useMemo( () => createOption(FILTER_OPTIONS.find((option) => option.id === overviewFilter) ?? FILTER_OPTIONS[0]!), [overviewFilter], ); const sessionOptions = useMemo( () => sessionRows.map((session) => createOption({ id: session.id, label: `${session.sessionName} (${session.status})` })), - [sessionRowsData], + [sessionRows], ); const selectedSessionOption = sessionOptions.find((option) => option.id === resolvedSessionId) ?? 
null; const overview = repoOverviewQuery.data; const overviewStats = repoSummary(overview); + const stackActionsEnabled = Boolean(overview?.stackAvailable) && !runStackAction.isPending; const filteredOverviewBranches = useMemo(() => { if (!overview?.branches?.length) { return []; @@ -678,6 +703,26 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected } }, [filteredOverviewBranches, selectedOverviewBranch]); + const handleReparentSubmit = (): void => { + if (!reparentBranchName || !reparentParentBranch.trim()) { + return; + } + setStackActionError(null); + void runStackAction + .mutateAsync({ + action: "reparent_branch", + branchName: reparentBranchName, + parentBranch: reparentParentBranch.trim(), + }) + .then(() => { + setReparentBranchName(null); + setReparentParentBranch(""); + }) + .catch(() => { + // mutation state is surfaced above + }); + }; + const modalOverrides = useMemo( () => ({ Dialog: { @@ -718,7 +763,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected gap: "2px", })} > - Organization + Workspace
- {organizationId} + {workspaceId}
@@ -737,14 +782,12 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected size="compact" kind="secondary" onClick={() => { - void navigate({ - to: "/organizations/$organizationId/settings", - params: { organizationId }, - }); + setAddRepoError(null); + setAddRepoOpen(true); }} - data-testid="organization-settings-open" + data-testid="repo-add-open" > - GitHub Settings + Add Repo
@@ -759,14 +802,14 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected - {organizationState.status === "loading" ? ( + {workspaceState.status === "loading" ? ( <> ) : null} - {organizationState.status !== "loading" && repoGroups.length === 0 ? ( - No repos or tasks yet. Create the repository in GitHub, then sync repos from organization settings. + {workspaceState.status !== "loading" && repoGroups.length === 0 ? ( + No repos or tasks yet. Add a repo to start a workspace. ) : null} {repoGroups.map((group) => ( @@ -780,8 +823,8 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected })} >
+ + + +
@@ -950,8 +1028,28 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected Branches {overviewStats.total} Mapped {overviewStats.mapped} Unmapped {overviewStats.unmapped} + Conflicts {overviewStats.conflicts} Open PRs {overviewStats.openPrs} + Needs restack {overviewStats.needsRestack}
+ + {overview && !overview.stackAvailable ? ( + + git-spice is unavailable for this repo. Stack actions are disabled. + + ) : null} + + {stackActionError ? ( + + {stackActionError} + + ) : null} + + {stackActionMessage ? ( + + {stackActionMessage} + + ) : null} @@ -970,10 +1068,10 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected className={css({ minWidth: "980px", display: "grid", - gridTemplateColumns: "2fr 1.3fr 1fr 1fr 0.9fr 1.2fr", + gridTemplateColumns: "2fr 1.3fr 0.8fr 1fr 1fr 1.4fr", })} > - {["Branch", "Task", "PR", "CI / Review", "Updated", "Actions"].map((label) => ( + {["Branch", "Parent", "Ahead", "PR", "CI/Review", "Actions"].map((label) => (
- {branch.taskId ? "task" : "unmapped"} - {branch.commitSha.slice(0, 10) || "-"} + {formatRelativeAge(branch.updatedAt)} + {branch.taskId ? "task" : "unmapped"} + {branch.trackedInStack ? stack : null}
-
{branch.taskTitle ?? branch.taskId ?? "-"}
+
{branch.parentBranch ?? "-"}
+
{branch.hasUnpushed ? "yes" : "-"}
- {branch.ciStatus ?? "-"} / {branch.pullRequest ? (branch.pullRequest.isDraft ? "draft" : "ready") : "-"} + {branch.ciStatus ?? "-"} / {branch.reviewStatus ?? "-"}
-
{formatRelativeAge(branch.updatedAt)}
+ + + + + + {!branch.taskId ? (
@@ -1117,16 +1265,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected {selectedForSession ? (selectedForSession.title ?? "Determining title...") : "No task selected"} - {selectedForSession ? ( - - ) : null} + {selectedForSession ? {selectedForSession.status} : null}
{selectedForSession && !resolvedSessionId ? ( @@ -1141,11 +1280,6 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected ) : null}
- {selectedForSession ? ( - - {taskStateSummary} - - ) : null}
{resolvedSessionId && sessionState.status === "loading" ? : null} - {selectedSessionSummary && (isPendingProvision || isPendingSessionCreate) ? ( -
- - {shouldUseTaskStateEmptyState ? taskStatusState.title : isPendingProvision ? "Provisioning sandbox..." : "Creating session..."} - - - - {shouldUseTaskStateEmptyState - ? taskStateSummary - : isPendingProvision - ? "The task is still provisioning." - : "The session is being created."} - -
- ) : null} - {transcript.length === 0 && !(resolvedSessionId && sessionState.status === "loading") ? ( - {shouldUseTaskStateEmptyState - ? taskStateSummary - : isPendingProvision - ? "Provisioning sandbox..." - : isPendingSessionCreate - ? "Creating session..." - : isSessionError - ? (selectedSessionSummary?.errorMessage ?? "Session failed to start.") - : !activeSandbox?.sandboxId - ? "This task is still provisioning its sandbox." - : staleSessionId - ? `Session ${staleSessionId} is unavailable. Start a new session to continue.` - : resolvedSessionId - ? "No transcript events yet. Send a prompt to start this session." - : "No active session for this task."} + {selectedForSession.runtimeStatus === "error" && selectedForSession.statusMessage + ? `Session failed: ${selectedForSession.statusMessage}` + : !activeSandbox?.sandboxId + ? selectedForSession.statusMessage + ? `Sandbox unavailable: ${selectedForSession.statusMessage}` + : "This task is still provisioning its sandbox." + : staleSessionId + ? `Session ${staleSessionId} is unavailable. Start a new session to continue.` + : resolvedSessionId + ? "No transcript events yet. Send a prompt to start this session." + : "No active session for this task."} ) : null} @@ -1338,7 +1442,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected onChange={(event) => setDraft(event.target.value)} placeholder="Send a follow-up prompt to this session" rows={5} - disabled={!activeSandbox?.sandboxId || isPendingProvision || isPendingSessionCreate || isSessionError} + disabled={!activeSandbox?.sandboxId} overrides={textareaTestIdOverrides("task-session-prompt")} />
+
@@ -1437,10 +1535,10 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected })} > + + - - )}
@@ -1464,8 +1562,6 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected gap: theme.sizing.scale300, })} > - - @@ -1485,8 +1581,9 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected })} > - - + + + @@ -1509,7 +1606,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected - {taskStatus === "error" ? ( + {selectedForSession.runtimeStatus === "error" ? (
- Task reported an error state + Session reported an error state
- {taskStatusState.detail} + {selectedForSession.statusMessage ? selectedForSession.statusMessage : "Open transcript in the center panel for details."} ) : null} @@ -1541,6 +1638,49 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected + setAddRepoOpen(false)} overrides={modalOverrides}> + Add Repo + +
+ + Add a git remote URL to this workspace. + + setAddRepoRemote(event.target.value)} + overrides={inputTestIdOverrides("repo-add-remote")} + /> + {addRepoError ? ( + + {addRepoError} + + ) : null} +
+
+ + + + +
+ { @@ -1581,12 +1721,56 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected overrides={selectTestIdOverrides("task-create-repo")} /> {repos.length === 0 ? ( - - No imported repos yet. Create the repository in GitHub first, then sync repos from organization settings. - +
+ + No repos yet. + + +
) : null} +
+ + Agent + + setReparentParentBranch(event.target.value)} + placeholder="Parent branch" + overrides={inputTestIdOverrides("repo-overview-reparent-input")} + /> +
+ + + +
- {showDevPanel ? ( - - ) : null} ); } diff --git a/foundry/packages/frontend/src/features/tasks/model.test.ts b/foundry/packages/frontend/src/features/tasks/model.test.ts index d977fc4..08f7b76 100644 --- a/foundry/packages/frontend/src/features/tasks/model.test.ts +++ b/foundry/packages/frontend/src/features/tasks/model.test.ts @@ -1,30 +1,42 @@ import { describe, expect, it } from "vitest"; import type { TaskRecord } from "@sandbox-agent/foundry-shared"; -import { groupTasksByRepo } from "./model"; +import { formatDiffStat, groupTasksByRepo } from "./model"; const base: TaskRecord = { - organizationId: "default", + workspaceId: "default", repoId: "repo-a", repoRemote: "https://example.com/repo-a.git", taskId: "task-1", branchName: "feature/one", title: "Feature one", task: "Ship one", - sandboxProviderId: "local", + providerId: "daytona", status: "running", + statusMessage: null, activeSandboxId: "sandbox-1", - pullRequest: null, + activeSessionId: "session-1", sandboxes: [ { sandboxId: "sandbox-1", - sandboxProviderId: "local", + providerId: "daytona", sandboxActorId: null, - switchTarget: "sandbox://local/sandbox-1", + switchTarget: "daytona://sandbox-1", cwd: null, createdAt: 10, updatedAt: 10, }, ], + agentType: null, + prSubmitted: false, + diffStat: null, + prUrl: null, + prAuthor: null, + ciStatus: null, + reviewStatus: null, + reviewer: null, + conflictsWithMain: null, + hasUnpushed: null, + parentBranch: null, createdAt: 10, updatedAt: 10, }; @@ -54,3 +66,19 @@ describe("groupTasksByRepo", () => { expect(groups[1]?.repoId).toBe("repo-a"); }); }); + +describe("formatDiffStat", () => { + it("returns No changes for zero-diff values", () => { + expect(formatDiffStat("+0/-0")).toBe("No changes"); + expect(formatDiffStat("+0 -0")).toBe("No changes"); + }); + + it("returns dash for empty values", () => { + expect(formatDiffStat(null)).toBe("-"); + expect(formatDiffStat("")).toBe("-"); + }); + + it("keeps non-empty non-zero diff stats", () => { + 
expect(formatDiffStat("+12/-4")).toBe("+12/-4"); + }); +}); diff --git a/foundry/packages/frontend/src/features/tasks/model.ts b/foundry/packages/frontend/src/features/tasks/model.ts index 008e0e5..539d3b9 100644 --- a/foundry/packages/frontend/src/features/tasks/model.ts +++ b/foundry/packages/frontend/src/features/tasks/model.ts @@ -37,3 +37,14 @@ export function groupTasksByRepo(tasks: TaskRecord[]): RepoGroup[] { return a.repoRemote.localeCompare(b.repoRemote); }); } + +export function formatDiffStat(diffStat: string | null | undefined): string { + const normalized = diffStat?.trim(); + if (!normalized) { + return "-"; + } + if (normalized === "+0/-0" || normalized === "+0 -0" || normalized === "0 files changed") { + return "No changes"; + } + return normalized; +} diff --git a/foundry/packages/frontend/src/features/tasks/status.test.ts b/foundry/packages/frontend/src/features/tasks/status.test.ts deleted file mode 100644 index 80da16f..0000000 --- a/foundry/packages/frontend/src/features/tasks/status.test.ts +++ /dev/null @@ -1,132 +0,0 @@ -import { describe, expect, it } from "vitest"; -import { TaskStatusSchema } from "@sandbox-agent/foundry-shared"; -import { defaultTaskStatusMessage, deriveHeaderStatus, describeTaskState, isProvisioningTaskStatus, resolveTaskStateDetail } from "./status"; - -describe("defaultTaskStatusMessage", () => { - it("covers every backend task status", () => { - for (const status of TaskStatusSchema.options) { - expect(defaultTaskStatusMessage(status)).toMatch(/\S/); - } - }); - - it("returns the expected copy for init_ensure_name", () => { - expect(defaultTaskStatusMessage("init_ensure_name")).toBe("Determining title and branch."); - }); -}); - -describe("resolveTaskStateDetail", () => { - it("returns the default copy for the current task status", () => { - expect(resolveTaskStateDetail("init_complete")).toBe("Finalizing task initialization."); - }); -}); - -describe("describeTaskState", () => { - it("includes the raw backend 
status code in the title", () => { - expect(describeTaskState("kill_destroy_sandbox")).toEqual({ - title: "Task state: kill_destroy_sandbox", - detail: "Destroying sandbox resources.", - }); - }); -}); - -describe("isProvisioningTaskStatus", () => { - it("treats in-progress init states as provisioning", () => { - expect(isProvisioningTaskStatus("init_bootstrap_db")).toBe(true); - expect(isProvisioningTaskStatus("init_ensure_name")).toBe(true); - }); - - it("does not treat init_complete as provisioning (task is ready)", () => { - expect(isProvisioningTaskStatus("init_complete")).toBe(false); - }); - - it("does not treat steady-state or terminal states as provisioning", () => { - expect(isProvisioningTaskStatus("running")).toBe(false); - expect(isProvisioningTaskStatus("archived")).toBe(false); - expect(isProvisioningTaskStatus("killed")).toBe(false); - }); -}); - -describe("deriveHeaderStatus", () => { - it("returns error variant when session has error", () => { - const result = deriveHeaderStatus("running", "error", "Sandbox crashed"); - expect(result.variant).toBe("error"); - expect(result.label).toBe("Session error"); - expect(result.tooltip).toBe("Sandbox crashed"); - expect(result.spinning).toBe(false); - }); - - it("returns error variant when task has error", () => { - const result = deriveHeaderStatus("error", null, null); - expect(result.variant).toBe("error"); - expect(result.label).toBe("Error"); - expect(result.spinning).toBe(false); - }); - - it("returns warning variant with spinner for provisioning task", () => { - const result = deriveHeaderStatus("init_enqueue_provision", null, null); - expect(result.variant).toBe("warning"); - expect(result.label).toBe("Provisioning"); - expect(result.spinning).toBe(true); - }); - - it("returns warning variant for pending_provision session", () => { - const result = deriveHeaderStatus("running", "pending_provision", null); - expect(result.variant).toBe("warning"); - expect(result.label).toBe("Provisioning"); - 
expect(result.spinning).toBe(true); - }); - - it("returns warning variant for pending_session_create session", () => { - const result = deriveHeaderStatus("running", "pending_session_create", null); - expect(result.variant).toBe("warning"); - expect(result.label).toBe("Creating session"); - expect(result.spinning).toBe(true); - }); - - it("returns success variant with spinner for running session", () => { - const result = deriveHeaderStatus("running", "running", null); - expect(result.variant).toBe("success"); - expect(result.label).toBe("Running"); - expect(result.spinning).toBe(true); - }); - - it("returns success variant for idle/ready state", () => { - const result = deriveHeaderStatus("idle", "idle", null); - expect(result.variant).toBe("success"); - expect(result.label).toBe("Ready"); - expect(result.spinning).toBe(false); - }); - - it("returns neutral variant for archived task", () => { - const result = deriveHeaderStatus("archived", null, null); - expect(result.variant).toBe("neutral"); - expect(result.label).toBe("Archived"); - }); - - it("session error takes priority over task error", () => { - const result = deriveHeaderStatus("error", "error", "Sandbox OOM"); - expect(result.variant).toBe("error"); - expect(result.label).toBe("Session error"); - expect(result.tooltip).toBe("Sandbox OOM"); - }); - - it("returns warning when no sandbox is available", () => { - const result = deriveHeaderStatus("idle", "idle", null, false); - expect(result.variant).toBe("warning"); - expect(result.label).toBe("No sandbox"); - expect(result.spinning).toBe(false); - }); - - it("still shows provisioning when no sandbox but task is provisioning", () => { - const result = deriveHeaderStatus("init_enqueue_provision", null, null, false); - expect(result.variant).toBe("warning"); - expect(result.label).toBe("Provisioning"); - expect(result.spinning).toBe(true); - }); - - it("shows error over no-sandbox when session has error", () => { - const result = deriveHeaderStatus("idle", 
"error", "Connection lost", false); - expect(result.variant).toBe("error"); - expect(result.label).toBe("Session error"); - }); -}); diff --git a/foundry/packages/frontend/src/features/tasks/status.ts b/foundry/packages/frontend/src/features/tasks/status.ts deleted file mode 100644 index 9376786..0000000 --- a/foundry/packages/frontend/src/features/tasks/status.ts +++ /dev/null @@ -1,168 +0,0 @@ -import type { TaskStatus, WorkspaceSessionStatus } from "@sandbox-agent/foundry-shared"; -import type { HeaderStatusInfo } from "../../components/mock-layout/ui"; - -export type TaskDisplayStatus = TaskStatus; - -export interface TaskStateDescriptor { - title: string; - detail: string; -} - -export function isProvisioningTaskStatus(status: TaskDisplayStatus | null | undefined): boolean { - return status === "init_bootstrap_db" || status === "init_enqueue_provision" || status === "init_ensure_name" || status === "init_assert_name"; -} - -export function defaultTaskStatusMessage(status: TaskDisplayStatus | null | undefined): string { - switch (status) { - case "init_bootstrap_db": - return "Creating task records."; - case "init_enqueue_provision": - return "Queueing sandbox provisioning."; - case "init_ensure_name": - return "Determining title and branch."; - case "init_assert_name": - return "Validating title and branch."; - case "init_complete": - return "Finalizing task initialization."; - case "running": - return "Agent session is actively running."; - case "idle": - return "Sandbox is ready for the next prompt."; - case "archive_stop_status_sync": - return "Stopping sandbox status sync before archiving."; - case "archive_release_sandbox": - return "Releasing sandbox resources."; - case "archive_finalize": - return "Finalizing archive."; - case "archived": - return "Task has been archived."; - case "kill_destroy_sandbox": - return "Destroying sandbox resources."; - case "kill_finalize": - return "Finalizing task termination."; - case "killed": - return "Task has been 
terminated."; - case "error": - return "Task entered an error state."; - case null: - case undefined: - return "Task state unavailable."; - } -} - -export function resolveTaskStateDetail(status: TaskDisplayStatus | null | undefined): string { - return defaultTaskStatusMessage(status); -} - -export function describeTaskState(status: TaskDisplayStatus | null | undefined): TaskStateDescriptor { - return { - title: status ? `Task state: ${status}` : "Task state unavailable", - detail: resolveTaskStateDetail(status), - }; -} - -/** - * Derives the header status pill state from the combined task + active session + sandbox state. - * Priority: session error > task error > no sandbox > provisioning > running > ready/idle > neutral. - */ -export function deriveHeaderStatus( - taskStatus: TaskDisplayStatus | null | undefined, - sessionStatus: WorkspaceSessionStatus | null | undefined, - sessionErrorMessage: string | null | undefined, - hasSandbox?: boolean, -): HeaderStatusInfo { - // Session error takes priority - if (sessionStatus === "error") { - return { - variant: "error", - label: "Session error", - spinning: false, - tooltip: sessionErrorMessage ?? 
"Session failed to start.", - }; - } - - // Task error - if (taskStatus === "error") { - return { - variant: "error", - label: "Error", - spinning: false, - tooltip: "Task entered an error state.", - }; - } - - // No sandbox available (not provisioning, not errored — just missing) - if (hasSandbox === false && !isProvisioningTaskStatus(taskStatus)) { - return { - variant: "warning", - label: "No sandbox", - spinning: false, - tooltip: "Sandbox is not available for this task.", - }; - } - - // Task provisioning (init_* states) - if (isProvisioningTaskStatus(taskStatus)) { - return { - variant: "warning", - label: "Provisioning", - spinning: true, - tooltip: resolveTaskStateDetail(taskStatus), - }; - } - - // Session pending states - if (sessionStatus === "pending_provision") { - return { - variant: "warning", - label: "Provisioning", - spinning: true, - tooltip: "Provisioning sandbox...", - }; - } - - if (sessionStatus === "pending_session_create") { - return { - variant: "warning", - label: "Creating session", - spinning: true, - tooltip: "Creating agent session...", - }; - } - - // Running - if (sessionStatus === "running") { - return { - variant: "success", - label: "Running", - spinning: true, - tooltip: "Agent is actively running.", - }; - } - - // Ready / idle - if (sessionStatus === "ready" || sessionStatus === "idle" || taskStatus === "idle" || taskStatus === "running") { - return { - variant: "success", - label: "Ready", - spinning: false, - tooltip: "Sandbox is ready.", - }; - } - - // Terminal states - if (taskStatus === "archived" || taskStatus === "killed") { - return { - variant: "neutral", - label: taskStatus === "archived" ? "Archived" : "Terminated", - spinning: false, - }; - } - - // Fallback - return { - variant: "neutral", - label: taskStatus ?? 
"Unknown", - spinning: false, - }; -} diff --git a/foundry/packages/frontend/src/lib/backend.ts b/foundry/packages/frontend/src/lib/backend.ts index be58cdd..158e701 100644 --- a/foundry/packages/frontend/src/lib/backend.ts +++ b/foundry/packages/frontend/src/lib/backend.ts @@ -1,9 +1,8 @@ import { createBackendClient } from "@sandbox-agent/foundry-client"; -import { backendEndpoint, defaultOrganizationId, frontendClientMode } from "./env"; +import { backendEndpoint, defaultWorkspaceId, frontendClientMode } from "./env"; export const backendClient = createBackendClient({ endpoint: backendEndpoint, - defaultOrganizationId, + defaultWorkspaceId, mode: frontendClientMode, - encoding: import.meta.env.DEV ? "json" : undefined, }); diff --git a/foundry/packages/frontend/src/lib/env.ts b/foundry/packages/frontend/src/lib/env.ts index 5476f83..ea53e85 100644 --- a/foundry/packages/frontend/src/lib/env.ts +++ b/foundry/packages/frontend/src/lib/env.ts @@ -1,6 +1,6 @@ type FoundryRuntimeConfig = { backendEndpoint?: string; - defaultOrganizationId?: string; + defaultWorkspaceId?: string; frontendClientMode?: string; }; @@ -26,7 +26,7 @@ const runtimeConfig = typeof window !== "undefined" ? 
window.__FOUNDRY_RUNTIME_C export const backendEndpoint = runtimeConfig?.backendEndpoint?.trim() || import.meta.env.VITE_HF_BACKEND_ENDPOINT?.trim() || resolveDefaultBackendEndpoint(); -export const defaultOrganizationId = runtimeConfig?.defaultOrganizationId?.trim() || import.meta.env.VITE_HF_WORKSPACE?.trim() || "default"; +export const defaultWorkspaceId = runtimeConfig?.defaultWorkspaceId?.trim() || import.meta.env.VITE_HF_WORKSPACE?.trim() || "default"; function resolveFrontendClientMode(): "mock" | "remote" { const raw = runtimeConfig?.frontendClientMode?.trim().toLowerCase() || frontendEnv.FOUNDRY_FRONTEND_CLIENT_MODE?.trim().toLowerCase(); diff --git a/foundry/packages/frontend/src/lib/interest.ts b/foundry/packages/frontend/src/lib/interest.ts new file mode 100644 index 0000000..a736e71 --- /dev/null +++ b/foundry/packages/frontend/src/lib/interest.ts @@ -0,0 +1,5 @@ +import { MockInterestManager, RemoteInterestManager } from "@sandbox-agent/foundry-client"; +import { backendClient } from "./backend"; +import { frontendClientMode } from "./env"; + +export const interestManager = frontendClientMode === "mock" ? 
new MockInterestManager() : new RemoteInterestManager(backendClient); diff --git a/foundry/packages/frontend/src/lib/mock-app.ts b/foundry/packages/frontend/src/lib/mock-app.ts index 09b23ad..c72a708 100644 --- a/foundry/packages/frontend/src/lib/mock-app.ts +++ b/foundry/packages/frontend/src/lib/mock-app.ts @@ -1,21 +1,15 @@ import { useSyncExternalStore } from "react"; import { createFoundryAppClient, - useSubscription, + useInterest, currentFoundryOrganization, currentFoundryUser, eligibleFoundryOrganizations, type FoundryAppClient, } from "@sandbox-agent/foundry-client"; -import type { - FoundryAppSnapshot, - FoundryBillingPlanId, - FoundryOrganization, - UpdateFoundryOrganizationProfileInput, - WorkspaceModelId, -} from "@sandbox-agent/foundry-shared"; +import type { FoundryAppSnapshot, FoundryBillingPlanId, FoundryOrganization, UpdateFoundryOrganizationProfileInput } from "@sandbox-agent/foundry-shared"; import { backendClient } from "./backend"; -import { subscriptionManager } from "./subscription"; +import { interestManager } from "./interest"; import { frontendClientMode } from "./env"; const REMOTE_APP_SESSION_STORAGE_KEY = "sandbox-agent-foundry:remote-app-session"; @@ -43,17 +37,16 @@ const legacyAppClient: FoundryAppClient = createFoundryAppClient({ const remoteAppClient: FoundryAppClient = { getSnapshot(): FoundryAppSnapshot { - return subscriptionManager.getSnapshot("app", {}) ?? EMPTY_APP_SNAPSHOT; + return interestManager.getSnapshot("app", {}) ?? 
EMPTY_APP_SNAPSHOT; }, subscribe(listener: () => void): () => void { - return subscriptionManager.subscribe("app", {}, listener); + return interestManager.subscribe("app", {}, listener); }, async signInWithGithub(userId?: string): Promise<void> { void userId; await backendClient.signInWithGithub(); }, async signOut(): Promise<void> { - window.localStorage.removeItem(REMOTE_APP_SESSION_STORAGE_KEY); await backendClient.signOutApp(); }, async skipStarterRepo(): Promise<void> { @@ -65,9 +58,6 @@ const remoteAppClient: FoundryAppClient = { async selectOrganization(organizationId: string): Promise<void> { await backendClient.selectAppOrganization(organizationId); }, - async setDefaultModel(defaultModel: WorkspaceModelId): Promise<void> { - await backendClient.setAppDefaultModel(defaultModel); - }, async updateOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise<void> { await backendClient.updateAppOrganizationProfile(input); }, @@ -89,8 +79,8 @@ const remoteAppClient: FoundryAppClient = { async reconnectGithub(organizationId: string): Promise<void> { await backendClient.reconnectAppGithub(organizationId); }, - async recordSeatUsage(organizationId: string): Promise<void> { - await backendClient.recordAppSeatUsage(organizationId); + async recordSeatUsage(workspaceId: string): Promise<void> { + await backendClient.recordAppSeatUsage(workspaceId); }, }; @@ -98,17 +88,9 @@ const appClient: FoundryAppClient = frontendClientMode === "remote" ? remoteAppC export function useMockAppSnapshot(): FoundryAppSnapshot { if (frontendClientMode === "remote") { - const app = useSubscription(subscriptionManager, "app", {}); + const app = useInterest(interestManager, "app", {}); if (app.status !== "loading") { firstSnapshotDelivered = true; - // Persist session sentinel so isAppSnapshotBootstrapping can show a loading - // screen instead of flashing /signin on the next page load / HMR reload. - const snapshot = app.data ??
EMPTY_APP_SNAPSHOT; - if (snapshot.auth.status === "signed_in") { - window.localStorage.setItem(REMOTE_APP_SESSION_STORAGE_KEY, "1"); - } else { - window.localStorage.removeItem(REMOTE_APP_SESSION_STORAGE_KEY); - } } return app.data ?? EMPTY_APP_SNAPSHOT; } diff --git a/foundry/packages/frontend/src/lib/subscription.ts b/foundry/packages/frontend/src/lib/subscription.ts deleted file mode 100644 index c1618fb..0000000 --- a/foundry/packages/frontend/src/lib/subscription.ts +++ /dev/null @@ -1,5 +0,0 @@ -import { MockSubscriptionManager, RemoteSubscriptionManager } from "@sandbox-agent/foundry-client"; -import { backendClient } from "./backend"; -import { frontendClientMode } from "./env"; - -export const subscriptionManager = frontendClientMode === "mock" ? new MockSubscriptionManager() : new RemoteSubscriptionManager(backendClient); diff --git a/foundry/packages/frontend/src/sandbox-agent-react.d.ts b/foundry/packages/frontend/src/sandbox-agent-react.d.ts index 60519b3..c587db7 100644 --- a/foundry/packages/frontend/src/sandbox-agent-react.d.ts +++ b/foundry/packages/frontend/src/sandbox-agent-react.d.ts @@ -43,27 +43,15 @@ declare module "@sandbox-agent/react" { className?: string; classNames?: Partial; endRef?: RefObject; - scrollRef?: RefObject; - scrollToEntryId?: string | null; sessionError?: string | null; eventError?: string | null; isThinking?: boolean; agentId?: string; - virtualize?: boolean; - onAtBottomChange?: (atBottom: boolean) => void; onEventClick?: (eventId: string) => void; onPermissionReply?: (permissionId: string, reply: PermissionReply) => void; - isDividerEntry?: (entry: TranscriptEntry) => boolean; - canOpenEvent?: (entry: TranscriptEntry) => boolean; - getToolGroupSummary?: (entries: TranscriptEntry[]) => string; renderMessageText?: (entry: TranscriptEntry) => ReactNode; renderInlinePendingIndicator?: () => ReactNode; renderThinkingState?: (context: { agentId?: string }) => ReactNode; - renderToolItemIcon?: (entry: TranscriptEntry) => 
ReactNode; - renderToolGroupIcon?: (entries: TranscriptEntry[], expanded: boolean) => ReactNode; - renderChevron?: (expanded: boolean) => ReactNode; - renderEventLinkContent?: (entry: TranscriptEntry) => ReactNode; - renderPermissionIcon?: (entry: TranscriptEntry) => ReactNode; renderPermissionOptionContent?: (context: PermissionOptionRenderContext) => ReactNode; } diff --git a/foundry/packages/shared/package.json b/foundry/packages/shared/package.json index 8d65361..04e3ff3 100644 --- a/foundry/packages/shared/package.json +++ b/foundry/packages/shared/package.json @@ -6,7 +6,7 @@ "main": "dist/index.js", "types": "dist/index.d.ts", "scripts": { - "build": "tsup src/index.ts --format esm --dts --tsconfig tsconfig.build.json", + "build": "tsup src/index.ts --format esm --dts", "typecheck": "tsc --noEmit", "test": "vitest run" }, diff --git a/foundry/packages/shared/src/app-shell.ts b/foundry/packages/shared/src/app-shell.ts index fa1e969..8e757c5 100644 --- a/foundry/packages/shared/src/app-shell.ts +++ b/foundry/packages/shared/src/app-shell.ts @@ -1,15 +1,7 @@ -import type { WorkspaceModelId } from "./workspace.js"; - export type FoundryBillingPlanId = "free" | "team"; export type FoundryBillingStatus = "active" | "trialing" | "past_due" | "scheduled_cancel"; export type FoundryGithubInstallationStatus = "connected" | "install_required" | "reconnect_required"; export type FoundryGithubSyncStatus = "pending" | "syncing" | "synced" | "error"; -export type FoundryGithubSyncPhase = - | "discovering_repositories" - | "syncing_repositories" - | "syncing_branches" - | "syncing_members" - | "syncing_pull_requests"; export type FoundryOrganizationKind = "personal" | "organization"; export type FoundryStarterRepoStatus = "pending" | "starred" | "skipped"; @@ -20,7 +12,6 @@ export interface FoundryUser { githubLogin: string; roleLabel: string; eligibleOrganizationIds: string[]; - defaultModel: WorkspaceModelId; } export interface FoundryOrganizationMember { @@ -57,12 +48,6 
@@ export interface FoundryGithubState { importedRepoCount: number; lastSyncLabel: string; lastSyncAt: number | null; - lastWebhookAt: number | null; - lastWebhookEvent: string; - syncGeneration?: number; - syncPhase?: FoundryGithubSyncPhase | null; - processedRepositoryCount?: number; - totalRepositoryCount?: number; } export interface FoundryOrganizationSettings { @@ -70,12 +55,13 @@ export interface FoundryOrganizationSettings { slug: string; primaryDomain: string; seatAccrualMode: "first_prompt"; + defaultModel: "claude-sonnet-4" | "claude-opus-4" | "gpt-4o" | "o3"; autoImportRepos: boolean; } export interface FoundryOrganization { id: string; - organizationId: string; + workspaceId: string; kind: FoundryOrganizationKind; settings: FoundryOrganizationSettings; github: FoundryGithubState; diff --git a/foundry/packages/shared/src/config.ts b/foundry/packages/shared/src/config.ts index 44ea722..8fd31df 100644 --- a/foundry/packages/shared/src/config.ts +++ b/foundry/packages/shared/src/config.ts @@ -15,7 +15,7 @@ export const ConfigSchema = z.object({ }) .optional(), notify: z.array(NotifyBackendSchema).default(["terminal"]), - organization: z + workspace: z .object({ default: z.string().min(1).default("default"), }) @@ -39,21 +39,23 @@ export const ConfigSchema = z.object({ backup_interval_secs: 3600, backup_retention_days: 7, }), - sandboxProviders: z + providers: z .object({ local: z .object({ - image: z.string().optional(), + rootDir: z.string().optional(), + sandboxAgentPort: z.number().int().min(1).max(65535).optional(), }) .default({}), - e2b: z + daytona: z .object({ + endpoint: z.string().optional(), apiKey: z.string().optional(), - template: z.string().optional(), + image: z.string().default("ubuntu:24.04"), }) - .default({}), + .default({ image: "ubuntu:24.04" }), }) - .default({ local: {}, e2b: {} }), + .default({ local: {}, daytona: { image: "ubuntu:24.04" } }), }); export type AppConfig = z.infer<typeof ConfigSchema>; diff --git a/foundry/packages/shared/src/contracts.ts
b/foundry/packages/shared/src/contracts.ts index 8d2b382..6c99d4e 100644 --- a/foundry/packages/shared/src/contracts.ts +++ b/foundry/packages/shared/src/contracts.ts @@ -1,14 +1,14 @@ import { z } from "zod"; -export const OrganizationIdSchema = z +export const WorkspaceIdSchema = z .string() .min(1) .max(64) .regex(/^[a-zA-Z0-9._-]+$/); -export type OrganizationId = z.infer<typeof OrganizationIdSchema>; +export type WorkspaceId = z.infer<typeof WorkspaceIdSchema>; -export const SandboxProviderIdSchema = z.enum(["e2b", "local"]); -export type SandboxProviderId = z.infer<typeof SandboxProviderIdSchema>; +export const ProviderIdSchema = z.enum(["daytona", "local"]); +export type ProviderId = z.infer<typeof ProviderIdSchema>; export const AgentTypeSchema = z.enum(["claude", "codex"]); export type AgentType = z.infer<typeof AgentTypeSchema>; @@ -24,6 +24,12 @@ export const TaskStatusSchema = z.enum([ "init_enqueue_provision", "init_ensure_name", "init_assert_name", + "init_create_sandbox", + "init_ensure_agent", + "init_start_sandbox_instance", + "init_create_session", + "init_write_db", + "init_start_status_sync", "init_complete", "running", "idle", @@ -39,7 +45,7 @@ export type TaskStatus = z.infer<typeof TaskStatusSchema>; export const RepoRecordSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: RepoIdSchema, remoteUrl: RepoRemoteSchema, createdAt: z.number().int(), @@ -47,47 +53,41 @@ }); export type RepoRecord = z.infer<typeof RepoRecordSchema>; +export const AddRepoInputSchema = z.object({ + workspaceId: WorkspaceIdSchema, + remoteUrl: RepoRemoteSchema, +}); +export type AddRepoInput = z.infer<typeof AddRepoInputSchema>; + export const CreateTaskInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: RepoIdSchema, task: z.string().min(1), explicitTitle: z.string().trim().min(1).optional(), explicitBranchName: z.string().trim().min(1).optional(), - sandboxProviderId: SandboxProviderIdSchema.optional(), + providerId: ProviderIdSchema.optional(), + agentType: AgentTypeSchema.optional(),
onBranch: z.string().trim().min(1).optional(), }); export type CreateTaskInput = z.infer<typeof CreateTaskInputSchema>; -export const WorkspacePullRequestSummarySchema = z.object({ - number: z.number().int(), - status: z.enum(["draft", "ready"]), - title: z.string().min(1), - state: z.string().min(1), - url: z.string().min(1), - headRefName: z.string().min(1), - baseRefName: z.string().min(1), - repoFullName: z.string().min(1), - authorLogin: z.string().nullable(), - isDraft: z.boolean(), - updatedAtMs: z.number().int(), -}); - export const TaskRecordSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: z.string().min(1), repoRemote: RepoRemoteSchema, taskId: z.string().min(1), branchName: z.string().min(1).nullable(), title: z.string().min(1).nullable(), task: z.string().min(1), - sandboxProviderId: SandboxProviderIdSchema, + providerId: ProviderIdSchema, status: TaskStatusSchema, + statusMessage: z.string().nullable(), activeSandboxId: z.string().nullable(), - pullRequest: WorkspacePullRequestSummarySchema.nullable(), + activeSessionId: z.string().nullable(), sandboxes: z.array( z.object({ sandboxId: z.string().min(1), - sandboxProviderId: SandboxProviderIdSchema, + providerId: ProviderIdSchema, sandboxActorId: z.string().nullable(), switchTarget: z.string().min(1), cwd: z.string().nullable(), @@ -95,73 +95,133 @@ updatedAt: z.number().int(), }), ), + agentType: z.string().nullable(), + prSubmitted: z.boolean(), + diffStat: z.string().nullable(), + prUrl: z.string().nullable(), + prAuthor: z.string().nullable(), + ciStatus: z.string().nullable(), + reviewStatus: z.string().nullable(), + reviewer: z.string().nullable(), + conflictsWithMain: z.string().nullable(), + hasUnpushed: z.string().nullable(), + parentBranch: z.string().nullable(), createdAt: z.number().int(), updatedAt: z.number().int(), }); export type TaskRecord = z.infer<typeof TaskRecordSchema>; export const TaskSummarySchema = z.object({ - organizationId:
OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: z.string().min(1), taskId: z.string().min(1), branchName: z.string().min(1).nullable(), title: z.string().min(1).nullable(), status: TaskStatusSchema, updatedAt: z.number().int(), - pullRequest: WorkspacePullRequestSummarySchema.nullable(), }); export type TaskSummary = z.infer<typeof TaskSummarySchema>; export const TaskActionInputSchema = z.object({ - organizationId: OrganizationIdSchema, - repoId: RepoIdSchema, + workspaceId: WorkspaceIdSchema, taskId: z.string().min(1), }); export type TaskActionInput = z.infer<typeof TaskActionInputSchema>; export const SwitchResultSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, taskId: z.string().min(1), - sandboxProviderId: SandboxProviderIdSchema, + providerId: ProviderIdSchema, switchTarget: z.string().min(1), }); export type SwitchResult = z.infer<typeof SwitchResultSchema>; export const ListTasksInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: RepoIdSchema.optional(), }); export type ListTasksInput = z.infer<typeof ListTasksInputSchema>; export const RepoBranchRecordSchema = z.object({ branchName: z.string().min(1), - commitSha: z.string(), + commitSha: z.string().min(1), + parentBranch: z.string().nullable(), + trackedInStack: z.boolean(), + diffStat: z.string().nullable(), + hasUnpushed: z.boolean(), + conflictsWithMain: z.boolean(), taskId: z.string().nullable(), taskTitle: z.string().nullable(), taskStatus: TaskStatusSchema.nullable(), - pullRequest: WorkspacePullRequestSummarySchema.nullable(), + prNumber: z.number().int().nullable(), + prState: z.string().nullable(), + prUrl: z.string().nullable(), ciStatus: z.string().nullable(), + reviewStatus: z.string().nullable(), + reviewer: z.string().nullable(), + firstSeenAt: z.number().int().nullable(), + lastSeenAt: z.number().int().nullable(), updatedAt: z.number().int(), }); export type RepoBranchRecord = z.infer<typeof RepoBranchRecordSchema>; export const RepoOverviewSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId:
WorkspaceIdSchema, repoId: RepoIdSchema, remoteUrl: RepoRemoteSchema, baseRef: z.string().nullable(), + stackAvailable: z.boolean(), fetchedAt: z.number().int(), + branchSyncAt: z.number().int().nullable(), + prSyncAt: z.number().int().nullable(), + branchSyncStatus: z.enum(["pending", "syncing", "synced", "error"]), + prSyncStatus: z.enum(["pending", "syncing", "synced", "error"]), + repoActionJobs: z.array( + z.object({ + jobId: z.string().min(1), + action: z.enum(["sync_repo", "restack_repo", "restack_subtree", "rebase_branch", "reparent_branch"]), + branchName: z.string().nullable(), + parentBranch: z.string().nullable(), + status: z.enum(["queued", "running", "completed", "error"]), + message: z.string().min(1), + createdAt: z.number().int(), + updatedAt: z.number().int(), + completedAt: z.number().int().nullable(), + }), + ), branches: z.array(RepoBranchRecordSchema), }); export type RepoOverview = z.infer<typeof RepoOverviewSchema>; -export const OrganizationUseInputSchema = z.object({ - organizationId: OrganizationIdSchema, +export const RepoStackActionSchema = z.enum(["sync_repo", "restack_repo", "restack_subtree", "rebase_branch", "reparent_branch"]); +export type RepoStackAction = z.infer<typeof RepoStackActionSchema>; + +export const RepoStackActionInputSchema = z.object({ + workspaceId: WorkspaceIdSchema, + repoId: RepoIdSchema, + action: RepoStackActionSchema, + branchName: z.string().trim().min(1).optional(), + parentBranch: z.string().trim().min(1).optional(), }); -export type OrganizationUseInput = z.infer<typeof OrganizationUseInputSchema>; +export type RepoStackActionInput = z.infer<typeof RepoStackActionInputSchema>; + +export const RepoStackActionResultSchema = z.object({ + jobId: z.string().min(1).nullable().optional(), + action: RepoStackActionSchema, + executed: z.boolean(), + status: z.enum(["queued", "running", "completed", "error"]).optional(), + message: z.string().min(1), + at: z.number().int(), +}); +export type RepoStackActionResult = z.infer<typeof RepoStackActionResultSchema>; + +export const WorkspaceUseInputSchema = z.object({ + workspaceId: WorkspaceIdSchema, +}); +export type
WorkspaceUseInput = z.infer<typeof WorkspaceUseInputSchema>; export const StarSandboxAgentRepoInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, }); export type StarSandboxAgentRepoInput = z.infer<typeof StarSandboxAgentRepoInputSchema>; @@ -172,17 +232,16 @@ export const StarSandboxAgentRepoResultSchema = z.object({ export type StarSandboxAgentRepoResult = z.infer<typeof StarSandboxAgentRepoResultSchema>; export const HistoryQueryInputSchema = z.object({ - organizationId: OrganizationIdSchema, - repoId: z.string().min(1).optional(), + workspaceId: WorkspaceIdSchema, limit: z.number().int().positive().max(500).optional(), branch: z.string().min(1).optional(), taskId: z.string().min(1).optional(), }); export type HistoryQueryInput = z.infer<typeof HistoryQueryInputSchema>; -export const AuditLogEventSchema = z.object({ +export const HistoryEventSchema = z.object({ id: z.number().int(), - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, repoId: z.string().nullable(), taskId: z.string().nullable(), branchName: z.string().nullable(), @@ -190,18 +249,17 @@ payloadJson: z.string().min(1), createdAt: z.number().int(), }); -export type AuditLogEvent = z.infer<typeof AuditLogEventSchema>; +export type HistoryEvent = z.infer<typeof HistoryEventSchema>; export const PruneInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, dryRun: z.boolean(), yes: z.boolean(), }); export type PruneInput = z.infer<typeof PruneInputSchema>; export const KillInputSchema = z.object({ - organizationId: OrganizationIdSchema, - repoId: RepoIdSchema, + workspaceId: WorkspaceIdSchema, taskId: z.string().min(1), deleteBranch: z.boolean(), abandon: z.boolean(), @@ -209,13 +267,13 @@ export type KillInput = z.infer<typeof KillInputSchema>; export const StatuslineInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId: WorkspaceIdSchema, format: z.enum(["table", "claude-code"]), }); export type StatuslineInput = z.infer<typeof StatuslineInputSchema>; export const ListInputSchema = z.object({ - organizationId: OrganizationIdSchema, + workspaceId:
WorkspaceIdSchema, format: z.enum(["table", "json"]), full: z.boolean(), }); diff --git a/foundry/packages/shared/src/index.ts b/foundry/packages/shared/src/index.ts index 14eee7b..be629a6 100644 --- a/foundry/packages/shared/src/index.ts +++ b/foundry/packages/shared/src/index.ts @@ -2,7 +2,6 @@ export * from "./app-shell.js"; export * from "./contracts.js"; export * from "./config.js"; export * from "./logging.js"; -export * from "./models.js"; export * from "./realtime-events.js"; +export * from "./workbench.js"; export * from "./workspace.js"; -export * from "./organization.js"; diff --git a/foundry/packages/shared/src/logging.ts b/foundry/packages/shared/src/logging.ts index 12e9abc..fa4b739 100644 --- a/foundry/packages/shared/src/logging.ts +++ b/foundry/packages/shared/src/logging.ts @@ -1,4 +1,4 @@ -import { pino, type LogFn, type Logger, type LoggerOptions } from "pino"; +import { pino, type Logger, type LoggerOptions } from "pino"; export interface FoundryLoggerOptions { service: string; @@ -160,7 +160,7 @@ export function createFoundryLogger(options: FoundryLoggerOptions): Logger { loggerOptions.timestamp = pino.stdTimeFunctions.isoTime; if (options.format === "logfmt") { loggerOptions.hooks = { - logMethod(this: Logger, args: Parameters<LogFn>, _method: LogFn, level: number) { + logMethod(this: Logger, args, _method, level) { const levelLabel = this.levels.labels[level] ??
"info"; const record = buildLogRecord(levelLabel, this.bindings(), args); writeLogfmtLine(formatLogfmtLine(record)); diff --git a/foundry/packages/shared/src/models.ts b/foundry/packages/shared/src/models.ts deleted file mode 100644 index a140a66..0000000 --- a/foundry/packages/shared/src/models.ts +++ /dev/null @@ -1,217 +0,0 @@ -import claudeConfig from "../../../../scripts/agent-configs/resources/claude.json" with { type: "json" }; -import codexConfig from "../../../../scripts/agent-configs/resources/codex.json" with { type: "json" }; - -export type WorkspaceAgentKind = string; -export type WorkspaceModelId = string; - -export interface WorkspaceModelOption { - id: WorkspaceModelId; - label: string; -} - -export interface WorkspaceModelGroup { - provider: string; - agentKind: WorkspaceAgentKind; - sandboxAgentId: string; - models: WorkspaceModelOption[]; -} - -interface AgentConfigResource { - defaultModel?: string; - models?: Array<{ id?: string; name?: string }>; -} - -interface SandboxAgentInfoLike { - id?: unknown; - configOptions?: unknown; -} - -function isRecord(value: unknown): value is Record<string, unknown> { - return typeof value === "object" && value !== null; -} - -function normalizeModelLabel(model: { id?: string; name?: string }): string { - const name = model.name?.trim(); - if (name && name.length > 0) { - return name; - } - return model.id?.trim() || "Unknown"; -} - -function buildGroup(provider: string, agentKind: WorkspaceAgentKind, sandboxAgentId: string, config: AgentConfigResource): WorkspaceModelGroup { - return { - provider, - agentKind, - sandboxAgentId, - models: (config.models ??
[]) - .map((model) => { - const id = model.id?.trim(); - if (!id) { - return null; - } - return { - id, - label: normalizeModelLabel(model), - }; - }) - .filter((model): model is WorkspaceModelOption => model != null), - }; -} - -function titleCaseIdentifier(value: string): string { - return value - .split(/[\s_-]+/) - .filter(Boolean) - .map((part) => part.slice(0, 1).toUpperCase() + part.slice(1)) - .join(" "); -} - -function workspaceAgentMetadata(agentId: string): { provider: string; agentKind: string } { - const normalized = agentId.trim().toLowerCase(); - switch (normalized) { - case "claude": - return { provider: "Claude", agentKind: "Claude" }; - case "codex": - return { provider: "Codex", agentKind: "Codex" }; - default: - return { - provider: titleCaseIdentifier(agentId), - agentKind: titleCaseIdentifier(agentId), - }; - } -} - -function normalizeOptionLabel(entry: Record<string, unknown>): string | null { - const name = typeof entry.name === "string" ? entry.name.trim() : ""; - if (name) { - return name; - } - - const label = typeof entry.label === "string" ? entry.label.trim() : ""; - if (label) { - return label; - } - - const value = typeof entry.value === "string" ? entry.value.trim() : ""; - return value || null; -} - -function appendSelectOptionModels(target: WorkspaceModelOption[], options: unknown): void { - if (!Array.isArray(options)) { - return; - } - - for (const entry of options) { - if (!isRecord(entry)) { - continue; - } - - const value = typeof entry.value === "string" ? entry.value.trim() : ""; - if (value) { - target.push({ - id: value, - label: normalizeOptionLabel(entry) ??
value, - }); - continue; - } - - appendSelectOptionModels(target, entry.options); - } -} - -function normalizeAgentModels(configOptions: unknown): WorkspaceModelOption[] { - if (!Array.isArray(configOptions)) { - return []; - } - - const options = configOptions.find((entry) => isRecord(entry) && entry.category === "model" && entry.type === "select"); - if (!isRecord(options)) { - return []; - } - - const models: WorkspaceModelOption[] = []; - appendSelectOptionModels(models, options.options); - - const seen = new Set<string>(); - return models.filter((model) => { - if (!model.id || seen.has(model.id)) { - return false; - } - seen.add(model.id); - return true; - }); -} - -export function workspaceModelGroupsFromSandboxAgents(agents: SandboxAgentInfoLike[]): WorkspaceModelGroup[] { - return agents - .map((agent) => { - const sandboxAgentId = typeof agent.id === "string" ? agent.id.trim() : ""; - if (!sandboxAgentId) { - return null; - } - - const models = normalizeAgentModels(agent.configOptions); - if (models.length === 0) { - return null; - } - - const metadata = workspaceAgentMetadata(sandboxAgentId); - return { - provider: metadata.provider, - agentKind: metadata.agentKind, - sandboxAgentId, - models, - } satisfies WorkspaceModelGroup; - }) - .filter((group): group is WorkspaceModelGroup => group != null); -} - -export const DEFAULT_WORKSPACE_MODEL_GROUPS: WorkspaceModelGroup[] = [ - buildGroup("Claude", "Claude", "claude", claudeConfig as AgentConfigResource), - buildGroup("Codex", "Codex", "codex", codexConfig as AgentConfigResource), -].filter((group) => group.models.length > 0); - -export const DEFAULT_WORKSPACE_MODEL_ID: WorkspaceModelId = - ((codexConfig as AgentConfigResource).defaultModel ?? DEFAULT_WORKSPACE_MODEL_GROUPS[0]?.models[0]?.id ??
"default").trim(); - -export function workspaceProviderAgent( - provider: string, - groups: WorkspaceModelGroup[] = DEFAULT_WORKSPACE_MODEL_GROUPS, -): WorkspaceAgentKind { - return groups.find((group) => group.provider === provider)?.agentKind ?? provider; -} - -export function workspaceModelGroupForId( - id: WorkspaceModelId, - groups: WorkspaceModelGroup[] = DEFAULT_WORKSPACE_MODEL_GROUPS, -): WorkspaceModelGroup | null { - return groups.find((group) => group.models.some((model) => model.id === id)) ?? null; -} - -export function workspaceModelLabel( - id: WorkspaceModelId, - groups: WorkspaceModelGroup[] = DEFAULT_WORKSPACE_MODEL_GROUPS, -): string { - const group = workspaceModelGroupForId(id, groups); - const model = group?.models.find((candidate) => candidate.id === id); - return model && group ? `${group.provider} ${model.label}` : id; -} - -export function workspaceAgentForModel( - id: WorkspaceModelId, - groups: WorkspaceModelGroup[] = DEFAULT_WORKSPACE_MODEL_GROUPS, -): WorkspaceAgentKind { - const group = workspaceModelGroupForId(id, groups); - if (group) { - return group.agentKind; - } - return groups[0]?.agentKind ?? "Claude"; -} - -export function workspaceSandboxAgentIdForModel( - id: WorkspaceModelId, - groups: WorkspaceModelGroup[] = DEFAULT_WORKSPACE_MODEL_GROUPS, -): string { - const group = workspaceModelGroupForId(id, groups); - return group?.sandboxAgentId ?? groups[0]?.sandboxAgentId ?? 
"claude"; -} diff --git a/foundry/packages/shared/src/organization.ts b/foundry/packages/shared/src/organization.ts deleted file mode 100644 index 73e1867..0000000 --- a/foundry/packages/shared/src/organization.ts +++ /dev/null @@ -1,13 +0,0 @@ -import type { AppConfig } from "./config.js"; - -export function resolveOrganizationId(flagOrganization: string | undefined, config: AppConfig): string { - if (flagOrganization && flagOrganization.trim().length > 0) { - return flagOrganization.trim(); - } - - if (config.organization.default.trim().length > 0) { - return config.organization.default.trim(); - } - - return "default"; -} diff --git a/foundry/packages/shared/src/realtime-events.ts b/foundry/packages/shared/src/realtime-events.ts index 2cb07e3..c539adc 100644 --- a/foundry/packages/shared/src/realtime-events.ts +++ b/foundry/packages/shared/src/realtime-events.ts @@ -1,5 +1,5 @@ import type { FoundryAppSnapshot } from "./app-shell.js"; -import type { OrganizationSummarySnapshot, WorkspaceSessionDetail, WorkspaceTaskDetail } from "./workspace.js"; +import type { WorkbenchRepoSummary, WorkbenchSessionDetail, WorkbenchTaskDetail, WorkbenchTaskSummary } from "./workbench.js"; export interface SandboxProcessSnapshot { id: string; @@ -15,16 +15,21 @@ export interface SandboxProcessSnapshot { tty: boolean; } -/** Organization-level events broadcast by the organization actor. */ -export type OrganizationEvent = { type: "organizationUpdated"; snapshot: OrganizationSummarySnapshot }; +/** Workspace-level events broadcast by the workspace actor. */ +export type WorkspaceEvent = + | { type: "taskSummaryUpdated"; taskSummary: WorkbenchTaskSummary } + | { type: "taskRemoved"; taskId: string } + | { type: "repoAdded"; repo: WorkbenchRepoSummary } + | { type: "repoUpdated"; repo: WorkbenchRepoSummary } + | { type: "repoRemoved"; repoId: string }; /** Task-level events broadcast by the task actor. 
*/ -export type TaskEvent = { type: "taskUpdated"; detail: WorkspaceTaskDetail }; +export type TaskEvent = { type: "taskDetailUpdated"; detail: WorkbenchTaskDetail }; /** Session-level events broadcast by the task actor and filtered by sessionId on the client. */ -export type SessionEvent = { type: "sessionUpdated"; session: WorkspaceSessionDetail }; +export type SessionEvent = { type: "sessionUpdated"; session: WorkbenchSessionDetail }; -/** App-level events broadcast by the app organization actor. */ +/** App-level events broadcast by the app workspace actor. */ export type AppEvent = { type: "appUpdated"; snapshot: FoundryAppSnapshot }; /** Sandbox process events broadcast by the sandbox instance actor. */ diff --git a/foundry/packages/shared/src/workbench.ts b/foundry/packages/shared/src/workbench.ts new file mode 100644 index 0000000..2aa6a6e --- /dev/null +++ b/foundry/packages/shared/src/workbench.ts @@ -0,0 +1,267 @@ +import type { AgentType, ProviderId, TaskStatus } from "./contracts.js"; + +export type WorkbenchTaskStatus = "running" | "idle" | "new" | "archived"; +export type WorkbenchAgentKind = "Claude" | "Codex" | "Cursor"; +export type WorkbenchModelId = "claude-sonnet-4" | "claude-opus-4" | "gpt-4o" | "o3"; + +export interface WorkbenchTranscriptEvent { + id: string; + eventIndex: number; + sessionId: string; + createdAt: number; + connectionId: string; + sender: "client" | "agent"; + payload: unknown; +} + +export interface WorkbenchComposerDraft { + text: string; + attachments: WorkbenchLineAttachment[]; + updatedAtMs: number | null; +} + +/** Session metadata without transcript content. */ +export interface WorkbenchSessionSummary { + id: string; + sessionId: string | null; + sessionName: string; + agent: WorkbenchAgentKind; + model: WorkbenchModelId; + status: "running" | "idle" | "error"; + thinkingSinceMs: number | null; + unread: boolean; + created: boolean; +} + +/** Full session content — only fetched when viewing a specific session tab. 
*/ +export interface WorkbenchSessionDetail { + /** Stable UI tab id used for the session topic key and routing. */ + sessionId: string; + tabId: string; + sandboxSessionId: string | null; + sessionName: string; + agent: WorkbenchAgentKind; + model: WorkbenchModelId; + status: "running" | "idle" | "error"; + thinkingSinceMs: number | null; + unread: boolean; + created: boolean; + draft: WorkbenchComposerDraft; + transcript: WorkbenchTranscriptEvent[]; +} + +export interface WorkbenchFileChange { + path: string; + added: number; + removed: number; + type: "M" | "A" | "D"; +} + +export interface WorkbenchFileTreeNode { + name: string; + path: string; + isDir: boolean; + children?: WorkbenchFileTreeNode[]; +} + +export interface WorkbenchLineAttachment { + id: string; + filePath: string; + lineNumber: number; + lineContent: string; +} + +export interface WorkbenchHistoryEvent { + id: string; + messageId: string; + preview: string; + sessionName: string; + tabId: string; + createdAtMs: number; + detail: string; +} + +export type WorkbenchDiffLineKind = "context" | "add" | "remove" | "hunk"; + +export interface WorkbenchParsedDiffLine { + kind: WorkbenchDiffLineKind; + lineNumber: number; + text: string; +} + +export interface WorkbenchPullRequestSummary { + number: number; + status: "draft" | "ready"; +} + +export interface WorkbenchSandboxSummary { + providerId: ProviderId; + sandboxId: string; + cwd: string | null; +} + +/** Sidebar-level task data. Materialized in the workspace actor's SQLite. */ +export interface WorkbenchTaskSummary { + id: string; + repoId: string; + title: string; + status: WorkbenchTaskStatus; + repoName: string; + updatedAtMs: number; + branch: string | null; + pullRequest: WorkbenchPullRequestSummary | null; + /** Summary of sessions — no transcript content. */ + sessionsSummary: WorkbenchSessionSummary[]; +} + +/** Full task detail — only fetched when viewing a specific task. 
*/ +export interface WorkbenchTaskDetail extends WorkbenchTaskSummary { + /** Original task prompt/instructions shown in the detail view. */ + task: string; + /** Agent choice used when creating new sandbox sessions for this task. */ + agentType: AgentType | null; + /** Underlying task runtime status preserved for detail views and error handling. */ + runtimeStatus: TaskStatus; + statusMessage: string | null; + activeSessionId: string | null; + diffStat: string | null; + prUrl: string | null; + reviewStatus: string | null; + fileChanges: WorkbenchFileChange[]; + diffs: Record; + fileTree: WorkbenchFileTreeNode[]; + minutesUsed: number; + /** Sandbox info for this task. */ + sandboxes: WorkbenchSandboxSummary[]; + activeSandboxId: string | null; +} + +/** Repo-level summary for workspace sidebar. */ +export interface WorkbenchRepoSummary { + id: string; + label: string; + /** Aggregated branch/task overview state (replaces getRepoOverview polling). */ + taskCount: number; + latestActivityMs: number; +} + +/** Workspace-level snapshot — initial fetch for the workspace topic. */ +export interface WorkspaceSummarySnapshot { + workspaceId: string; + repos: WorkbenchRepoSummary[]; + taskSummaries: WorkbenchTaskSummary[]; +} + +/** + * Deprecated compatibility aliases for older mock/view-model code. + * New code should use the summary/detail/topic-specific types above. 
+ */ +export interface WorkbenchAgentTab extends WorkbenchSessionSummary { + draft: WorkbenchComposerDraft; + transcript: WorkbenchTranscriptEvent[]; +} + +export interface WorkbenchTask { + id: string; + repoId: string; + title: string; + status: WorkbenchTaskStatus; + repoName: string; + updatedAtMs: number; + branch: string | null; + pullRequest: WorkbenchPullRequestSummary | null; + tabs: WorkbenchAgentTab[]; + fileChanges: WorkbenchFileChange[]; + diffs: Record<string, string>; + fileTree: WorkbenchFileTreeNode[]; + minutesUsed: number; +} + +export interface WorkbenchRepo { + id: string; + label: string; +} + +export interface WorkbenchProjectSection { + id: string; + label: string; + updatedAtMs: number; + tasks: WorkbenchTask[]; +} + +export interface TaskWorkbenchSnapshot { + workspaceId: string; + repos: WorkbenchRepo[]; + projects: WorkbenchProjectSection[]; + tasks: WorkbenchTask[]; +} + +export interface WorkbenchModelOption { + id: WorkbenchModelId; + label: string; +} + +export interface WorkbenchModelGroup { + provider: string; + models: WorkbenchModelOption[]; +} + +export interface TaskWorkbenchSelectInput { + taskId: string; +} + +export interface TaskWorkbenchCreateTaskInput { + repoId: string; + task: string; + title?: string; + branch?: string; + model?: WorkbenchModelId; +} + +export interface TaskWorkbenchRenameInput { + taskId: string; + value: string; +} + +export interface TaskWorkbenchSendMessageInput { + taskId: string; + tabId: string; + text: string; + attachments: WorkbenchLineAttachment[]; +} + +export interface TaskWorkbenchTabInput { + taskId: string; + tabId: string; +} + +export interface TaskWorkbenchRenameSessionInput extends TaskWorkbenchTabInput { + title: string; +} + +export interface TaskWorkbenchChangeModelInput extends TaskWorkbenchTabInput { + model: WorkbenchModelId; +} + +export interface TaskWorkbenchUpdateDraftInput extends TaskWorkbenchTabInput { + text: string; + attachments: WorkbenchLineAttachment[]; +} + +export interface
TaskWorkbenchSetSessionUnreadInput extends TaskWorkbenchTabInput { + unread: boolean; +} + +export interface TaskWorkbenchDiffInput { + taskId: string; + path: string; +} + +export interface TaskWorkbenchCreateTaskResponse { + taskId: string; + tabId?: string; +} + +export interface TaskWorkbenchAddTabResponse { + tabId: string; +} diff --git a/foundry/packages/shared/src/workspace.ts b/foundry/packages/shared/src/workspace.ts index e33f189..fb8e1b7 100644 --- a/foundry/packages/shared/src/workspace.ts +++ b/foundry/packages/shared/src/workspace.ts @@ -1,333 +1,13 @@ -import type { SandboxProviderId, TaskStatus } from "./contracts.js"; -import type { WorkspaceAgentKind, WorkspaceModelGroup, WorkspaceModelId, WorkspaceModelOption } from "./models.js"; +import type { AppConfig } from "./config.js"; -export type WorkspaceTaskStatus = TaskStatus; -export type WorkspaceSessionStatus = "pending_provision" | "pending_session_create" | "ready" | "running" | "idle" | "error"; +export function resolveWorkspaceId(flagWorkspace: string | undefined, config: AppConfig): string { + if (flagWorkspace && flagWorkspace.trim().length > 0) { + return flagWorkspace.trim(); + } -export type { WorkspaceAgentKind, WorkspaceModelGroup, WorkspaceModelId, WorkspaceModelOption } from "./models.js"; + if (config.workspace.default.trim().length > 0) { + return config.workspace.default.trim(); + } -export interface WorkspaceTranscriptEvent { - id: string; - eventIndex: number; - sessionId: string; - createdAt: number; - connectionId: string; - sender: "client" | "agent"; - payload: unknown; -} - -export interface WorkspaceComposerDraft { - text: string; - attachments: WorkspaceLineAttachment[]; - updatedAtMs: number | null; -} - -/** Session metadata without transcript content. */ -export interface WorkspaceSessionSummary { - id: string; - /** Stable UI session id used for routing and task-local identity. 
*/ - sessionId: string; - /** Underlying sandbox session id when provisioning has completed. */ - sandboxSessionId?: string | null; - sessionName: string; - agent: WorkspaceAgentKind; - model: WorkspaceModelId; - status: WorkspaceSessionStatus; - thinkingSinceMs: number | null; - unread: boolean; - created: boolean; - errorMessage?: string | null; -} - -/** Full session content — only fetched when viewing a specific session. */ -export interface WorkspaceSessionDetail { - /** Stable UI session id used for the session topic key and routing. */ - sessionId: string; - sandboxSessionId: string | null; - sessionName: string; - agent: WorkspaceAgentKind; - model: WorkspaceModelId; - status: WorkspaceSessionStatus; - thinkingSinceMs: number | null; - unread: boolean; - created: boolean; - errorMessage?: string | null; - draft: WorkspaceComposerDraft; - transcript: WorkspaceTranscriptEvent[]; -} - -export interface WorkspaceFileChange { - path: string; - added: number; - removed: number; - type: "M" | "A" | "D"; -} - -export interface WorkspaceFileTreeNode { - name: string; - path: string; - isDir: boolean; - children?: WorkspaceFileTreeNode[]; -} - -export interface WorkspaceLineAttachment { - id: string; - filePath: string; - lineNumber: number; - lineContent: string; -} - -export interface WorkspaceHistoryEvent { - id: string; - messageId: string; - preview: string; - sessionName: string; - sessionId: string; - createdAtMs: number; - detail: string; -} - -export type WorkspaceDiffLineKind = "context" | "add" | "remove" | "hunk"; - -export interface WorkspaceParsedDiffLine { - kind: WorkspaceDiffLineKind; - lineNumber: number; - text: string; -} - -export interface WorkspacePullRequestSummary { - number: number; - status: "draft" | "ready"; - title?: string; - state?: string; - url?: string; - headRefName?: string; - baseRefName?: string; - repoFullName?: string; - authorLogin?: string | null; - isDraft?: boolean; - updatedAtMs?: number; -} - -export interface 
WorkspaceSandboxSummary { - sandboxProviderId: SandboxProviderId; - sandboxId: string; - cwd: string | null; - url: string | null; -} - -/** Sidebar-level task data. Materialized in the organization actor's SQLite. */ -export interface WorkspaceTaskSummary { - id: string; - repoId: string; - title: string; - status: WorkspaceTaskStatus; - repoName: string; - updatedAtMs: number; - branch: string | null; - pullRequest: WorkspacePullRequestSummary | null; - activeSessionId: string | null; - /** Summary of sessions — no transcript content. */ - sessionsSummary: WorkspaceSessionSummary[]; - /** GitHub login of the current primary user (task owner). */ - primaryUserLogin: string | null; - /** Avatar URL of the current primary user. */ - primaryUserAvatarUrl: string | null; -} - -/** Full task detail — only fetched when viewing a specific task. */ -export interface WorkspaceTaskDetail extends WorkspaceTaskSummary { - /** Original task prompt/instructions shown in the detail view. */ - task: string; - fileChanges: WorkspaceFileChange[]; - diffs: Record<string, string>; - fileTree: WorkspaceFileTreeNode[]; - minutesUsed: number; - /** Sandbox info for this task. */ - sandboxes: WorkspaceSandboxSummary[]; - activeSandboxId: string | null; -} - -/** Repo-level summary for organization sidebar. */ -export interface WorkspaceRepositorySummary { - id: string; - label: string; - /** Aggregated branch/task overview state (replaces getRepoOverview polling).
*/ - taskCount: number; - latestActivityMs: number; -} - -export type OrganizationGithubSyncPhase = - | "discovering_repositories" - | "syncing_repositories" - | "syncing_branches" - | "syncing_members" - | "syncing_pull_requests"; - -export interface OrganizationGithubSummary { - connectedAccount: string; - installationStatus: "connected" | "install_required" | "reconnect_required"; - syncStatus: "pending" | "syncing" | "synced" | "error"; - importedRepoCount: number; - lastSyncLabel: string; - lastSyncAt: number | null; - lastWebhookAt: number | null; - lastWebhookEvent: string; - syncGeneration: number; - syncPhase: OrganizationGithubSyncPhase | null; - processedRepositoryCount: number; - totalRepositoryCount: number; -} - -export interface WorkspaceOpenPullRequest { - repoId: string; - repoFullName: string; - number: number; - title: string; - status: string; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin: string | null; - isDraft: boolean; -} - -/** Organization-level snapshot — initial fetch for the organization topic. 
*/ -export interface OrganizationSummarySnapshot { - organizationId: string; - github: OrganizationGithubSummary; - repos: WorkspaceRepositorySummary[]; - taskSummaries: WorkspaceTaskSummary[]; - openPullRequests?: WorkspaceOpenPullRequest[]; -} - -export interface WorkspaceSession extends WorkspaceSessionSummary { - draft: WorkspaceComposerDraft; - transcript: WorkspaceTranscriptEvent[]; -} - -export interface WorkspaceTask { - id: string; - repoId: string; - title: string; - status: WorkspaceTaskStatus; - repoName: string; - updatedAtMs: number; - branch: string | null; - pullRequest: WorkspacePullRequestSummary | null; - activeSessionId?: string | null; - sessions: WorkspaceSession[]; - fileChanges: WorkspaceFileChange[]; - diffs: Record<string, string>; - fileTree: WorkspaceFileTreeNode[]; - minutesUsed: number; - sandboxes?: WorkspaceSandboxSummary[]; - activeSandboxId?: string | null; - /** GitHub login of the current primary user (task owner). */ - primaryUserLogin?: string | null; - /** Avatar URL of the current primary user.
*/ - primaryUserAvatarUrl?: string | null; -} - -export interface WorkspaceRepo { - id: string; - label: string; -} - -export interface WorkspaceRepositorySection { - id: string; - label: string; - updatedAtMs: number; - tasks: WorkspaceTask[]; -} - -export interface TaskWorkspaceSnapshot { - organizationId: string; - repos: WorkspaceRepo[]; - repositories: WorkspaceRepositorySection[]; - tasks: WorkspaceTask[]; -} - -export interface TaskWorkspaceSelectInput { - repoId: string; - taskId: string; - authSessionId?: string; -} - -export interface TaskWorkspaceCreateTaskInput { - repoId: string; - task: string; - title?: string; - branch?: string; - onBranch?: string; - model?: WorkspaceModelId; - authSessionId?: string; -} - -export interface TaskWorkspaceRenameInput { - repoId: string; - taskId: string; - value: string; - authSessionId?: string; -} - -export interface TaskWorkspaceSendMessageInput { - repoId: string; - taskId: string; - sessionId: string; - text: string; - attachments: WorkspaceLineAttachment[]; - authSessionId?: string; -} - -export interface TaskWorkspaceSessionInput { - repoId: string; - taskId: string; - sessionId: string; - authSessionId?: string; -} - -export interface TaskWorkspaceRenameSessionInput extends TaskWorkspaceSessionInput { - title: string; -} - -export interface TaskWorkspaceChangeModelInput extends TaskWorkspaceSessionInput { - model: WorkspaceModelId; -} - -export interface TaskWorkspaceUpdateDraftInput extends TaskWorkspaceSessionInput { - text: string; - attachments: WorkspaceLineAttachment[]; -} - -export interface TaskWorkspaceSetSessionUnreadInput extends TaskWorkspaceSessionInput { - unread: boolean; -} - -export interface TaskWorkspaceChangeOwnerInput { - repoId: string; - taskId: string; - /** User ID of the target owner (from FoundryOrganizationMember.id). */ - targetUserId: string; - /** Display name to use as fallback if GitHub login cannot be resolved. 
*/ - targetUserName: string; - /** Email to use as fallback if GitHub email cannot be resolved. */ - targetUserEmail: string; - authSessionId?: string; -} - -export interface TaskWorkspaceDiffInput { - repoId: string; - taskId: string; - path: string; -} - -export interface TaskWorkspaceCreateTaskResponse { - taskId: string; - sessionId?: string; -} - -export interface TaskWorkspaceAddSessionResponse { - sessionId: string; + return "default"; } diff --git a/foundry/packages/shared/test/organization.test.ts b/foundry/packages/shared/test/workspace.test.ts similarity index 57% rename from foundry/packages/shared/test/organization.test.ts rename to foundry/packages/shared/test/workspace.test.ts index f1cd3f6..ab596ac 100644 --- a/foundry/packages/shared/test/organization.test.ts +++ b/foundry/packages/shared/test/workspace.test.ts @@ -1,10 +1,10 @@ import { describe, expect, it } from "vitest"; -import { ConfigSchema, resolveOrganizationId, type AppConfig } from "../src/index.js"; +import { ConfigSchema, resolveWorkspaceId, type AppConfig } from "../src/index.js"; const cfg: AppConfig = ConfigSchema.parse({ auto_submit: true, notify: ["terminal"], - organization: { default: "team-a" }, + workspace: { default: "team-a" }, backend: { host: "127.0.0.1", port: 7741, @@ -14,27 +14,26 @@ const cfg: AppConfig = ConfigSchema.parse({ backup_interval_secs: 3600, backup_retention_days: 7, }, - sandboxProviders: { - local: {}, - e2b: {}, + providers: { + daytona: { image: "ubuntu:24.04" }, }, }); -describe("resolveOrganizationId", () => { +describe("resolveWorkspaceId", () => { it("prefers explicit flag", () => { - expect(resolveOrganizationId("feature", cfg)).toBe("feature"); + expect(resolveWorkspaceId("feature", cfg)).toBe("feature"); }); it("falls back to config default", () => { - expect(resolveOrganizationId(undefined, cfg)).toBe("team-a"); + expect(resolveWorkspaceId(undefined, cfg)).toBe("team-a"); }); it("falls back to literal default when config value is empty", () => { 
const empty = { ...cfg, - organization: { default: "" }, + workspace: { default: "" }, } as AppConfig; - expect(resolveOrganizationId(undefined, empty)).toBe("default"); + expect(resolveWorkspaceId(undefined, empty)).toBe("default"); }); }); diff --git a/foundry/packages/shared/tsconfig.build.json b/foundry/packages/shared/tsconfig.build.json deleted file mode 100644 index 35bcdb2..0000000 --- a/foundry/packages/shared/tsconfig.build.json +++ /dev/null @@ -1,6 +0,0 @@ -{ - "extends": "./tsconfig.json", - "compilerOptions": { - "ignoreDeprecations": "6.0" - } -} diff --git a/foundry/research/friction/general.mdx b/foundry/research/friction/general.mdx index fce920b..b152287 100644 --- a/foundry/research/friction/general.mdx +++ b/foundry/research/friction/general.mdx @@ -15,8 +15,8 @@ The root cause of the duplicate HTTP request is unknown. It is not `appWorkspace ### Attempted Fix / Workaround 1. Made `completeAppGithubAuth` clear `oauthState`/`oauthStateExpiresAt` immediately after validation and before `exchangeCode`, so any duplicate request fails the state check instead of hitting GitHub with a consumed code. -2. Split `syncGithubSessionFromToken` into a fast path (`initGithubSession` — exchange code, get viewer, store token+identity) and a slow path (`syncGithubOrganizations` — list orgs, list installations, sync each organization). -3. `completeAppGithubAuth` now uses the fast path and enqueues the slow org sync to the organization workflow queue (`organization.command.syncGithubSession`, fire-and-forget). The HTTP callback returns a 302 redirect in ~2s instead of ~18s, eliminating the proxy timeout window. +2. Split `syncGithubSessionFromToken` into a fast path (`initGithubSession` — exchange code, get viewer, store token+identity) and a slow path (`syncGithubOrganizations` — list orgs, list installations, sync each workspace). +3. 
`completeAppGithubAuth` now uses the fast path and enqueues the slow org sync to the workspace workflow queue (`workspace.command.syncGithubSession`, fire-and-forget). The HTTP callback returns a 302 redirect in ~2s instead of ~18s, eliminating the proxy timeout window. 4. The frontend already polls `getAppSnapshot` every 500ms when any org has `syncStatus === "syncing"`, so the deferred sync is transparent to the user. 5. `bootstrapAppGithubSession` (dev-only) still calls the full synchronous `syncGithubSessionFromToken` since proxy timeouts are not a concern in dev and it needs the session fully populated before returning. @@ -38,14 +38,14 @@ Verifying the BaseUI frontend against the real `rivet-dev/sandbox-agent-testing` Three separate issues stacked together during live verification: -1. A half-created task actor remained in repository indexes after earlier runtime failures. The actor state existed, but its durable task row did not, so repo overview polling spammed `Task not found` and kept trying to load an orphaned task. +1. A half-created task actor remained in project indexes after earlier runtime failures. The actor state existed, but its durable task row did not, so repo overview polling spammed `Task not found` and kept trying to load an orphaned task. 2. Rebuilding the backend container outside `just dev` dropped injected GitHub auth, which made repo overview fall back to `Open PRs 0` until `GITHUB_TOKEN`/`GH_TOKEN` were passed back into `docker compose`. 3. In the create-task modal, the BaseUI-controlled form looked populated in the browser, but submit gating/click behavior was unreliable under browser automation, making it hard to distinguish frontend state bugs from backend failures. ### Attempted Fix / Workaround -1. Updated repository-actor stale task pruning to treat `Task not found:` the same as actor-not-found and rebuilt the backend image. -2. 
Recovered the orphaned task by forcing an initialize attempt, which surfaced a missing `body?.providerId` guard in the task init workflow and led to pruning the stale repository index row. +1. Updated project-actor stale task pruning to treat `Task not found:` the same as actor-not-found and rebuilt the backend image. +2. Recovered the orphaned task by forcing an initialize attempt, which surfaced a missing `body?.providerId` guard in the task init workflow and led to pruning the stale project index row. 3. Recreated the backend with `GITHUB_TOKEN="$(gh auth token)" GH_TOKEN="$(gh auth token)" docker compose ... up -d --build backend` so PR sync could see live GitHub data again. 4. Used `agent-browser` plus screenshots to separate working paths (repo overview + PR visibility) from the remaining broken path (modal submit / task creation UI). @@ -80,22 +80,22 @@ The Docker dev backend container was starting on Bun `1.2.23` and accepting TCP ### What I Was Working On -Implementing Daytona snapshot-based sandbox creation and running required organization validation. +Implementing Daytona snapshot-based sandbox creation and running required workspace validation. ### Friction / Issue -The organization `node_modules` tree is partially root-owned in this environment. `pnpm install`/cleanup failed with `EACCES` and left missing local tool entrypoints (for example `turbo`/`typescript`), which blocked `pnpm -w typecheck/build/test` from running end-to-end. +The workspace `node_modules` tree is partially root-owned in this environment. `pnpm install`/cleanup failed with `EACCES` and left missing local tool entrypoints (for example `turbo`/`typescript`), which blocked `pnpm -w typecheck/build/test` from running end-to-end. ### Attempted Fix / Workaround -1. Attempted organization reinstall (`pnpm install`, `CI=true pnpm install`) and package-level reinstall. +1. Attempted workspace reinstall (`pnpm install`, `CI=true pnpm install`) and package-level reinstall. 2. 
Attempted cleanup/recreate of `node_modules`, but root-owned files could not be removed. 3. Added temporary local shims for missing tool entrypoints to continue targeted validation. ### Outcome - Daytona-specific changes and backend tests were validated. -- Full organization validation remains blocked until `node_modules` ownership is repaired (or container is recreated). +- Full workspace validation remains blocked until `node_modules` ownership is repaired (or container is recreated). ## 2026-02-16 - uncommitted @@ -187,7 +187,7 @@ Vitest ESM module namespace exports are non-configurable, so `vi.spyOn(childProc ### Outcome - Backend manager tests are stable under ESM. -- Full organization tests pass with lifecycle coverage for outdated-backend restart behavior. +- Full workspace tests pass with lifecycle coverage for outdated-backend restart behavior. ## 2026-02-08 - uncommitted @@ -202,8 +202,8 @@ The environment did not provide `rg`, and docs/policy files still described Rust ### Attempted Fix / Workaround 1. Switched repository discovery to `find`/`grep`. -2. Rewrote repository guidance files (`CLAUDE.md`, `skills/SKILL.md`, docs, `SPEC.md`) to match the TypeScript architecture. -3. Added missing TUI test coverage so monorepo-wide test runs no longer fail on packages without tests. +2. Rewrote project guidance files (`CLAUDE.md`, `skills/SKILL.md`, docs, `SPEC.md`) to match the TypeScript architecture. +3. Added missing TUI test coverage so workspace-wide test runs no longer fail on packages without tests. ### Outcome @@ -214,7 +214,7 @@ The environment did not provide `rg`, and docs/policy files still described Rust ### What I Was Working On -Running full organization test validation (`pnpm -w test`) for the migrated monorepo. +Running full workspace test validation (`pnpm -w test`) for the migrated monorepo. 
### Friction / Issue @@ -228,7 +228,7 @@ Backend integration tests depend on native `better-sqlite3` bindings, which were ### Outcome -- Full organization test suite passes consistently. +- Full workspace test suite passes consistently. - Backend unit coverage always runs; DB integration tests run automatically on environments with native bindings. ## 2026-02-09 - aab1012 (working tree) @@ -309,13 +309,13 @@ Running backend tests with the integration flag enabled triggered unrelated acto ### Attempted Fix / Workaround 1. Switched to package-targeted test runs for deterministic coverage (`@sandbox-agent/foundry-backend` + `@sandbox-agent/foundry-frontend`). -2. Relied on required organization validation (`pnpm -w typecheck`, `pnpm -w build`, `pnpm -w test`) plus targeted stack test files. +2. Relied on required workspace validation (`pnpm -w typecheck`, `pnpm -w build`, `pnpm -w test`) plus targeted stack test files. 3. Stopped the runaway integration run and recorded this friction for follow-up. ### Outcome - New stack-focused tests pass in deterministic targeted runs. -- Full required organization checks pass. +- Full required workspace checks pass. - Integration-gated suite remains noisy and needs separate stabilization. ## 2026-03-05 - uncommitted @@ -326,7 +326,7 @@ Reviewing architecture for simplification opportunities. ### Friction / Issue -Considered merging `repositoryPrSync` (30s) and `repositoryBranchSync` (5s) into a single `repositorySync` actor that polls at the faster cadence and does PR fetches every Nth tick. This would reduce actor count by one per repo but violates the single-responsibility-per-actor pattern established in the codebase. Mixed cadences within one actor add conditional tick logic, make the polling intervals harder to reason about independently, and couple two unrelated data sources (git branches vs GitHub API) into one failure domain. 
+Considered merging `projectPrSync` (30s) and `projectBranchSync` (5s) into a single `projectSync` actor that polls at the faster cadence and does PR fetches every Nth tick. This would reduce actor count by one per repo but violates the single-responsibility-per-actor pattern established in the codebase. Mixed cadences within one actor add conditional tick logic, make the polling intervals harder to reason about independently, and couple two unrelated data sources (git branches vs GitHub API) into one failure domain. ### Attempted Fix / Workaround @@ -334,7 +334,7 @@ None — rejected the idea during review. ### Outcome -- Keep `repositoryPrSync` and `repositoryBranchSync` as separate actors. +- Keep `projectPrSync` and `projectBranchSync` as separate actors. - Single-responsibility-per-sync-actor is the right pattern for this codebase. ## 2026-03-06 - 77341ff @@ -345,13 +345,13 @@ Bringing up the Docker-based local dev stack with `just dev` after the BaseUI fr ### Friction / Issue -Docker Desktop recovered, but the frontend container failed immediately with `Cannot find module @rollup/rollup-linux-arm64-gnu`. The dev compose setup bind-mounted the host organization into `/app`, so the Linux container picked up macOS `node_modules` and missed Rollup's Linux optional package. +Docker Desktop recovered, but the frontend container failed immediately with `Cannot find module @rollup/rollup-linux-arm64-gnu`. The dev compose setup bind-mounted the host workspace into `/app`, so the Linux container picked up macOS `node_modules` and missed Rollup's Linux optional package. ### Attempted Fix / Workaround 1. Confirmed Docker itself was healthy again by checking the Unix socket, `docker version`, and the backend health endpoint. 2. Reproduced the frontend crash inside `docker compose`. -3. 
Changed the frontend dev service to use named volumes for organization `node_modules` and the pnpm store, and to run `pnpm install --frozen-lockfile` inside the container before starting Vite. +3. Changed the frontend dev service to use named volumes for workspace `node_modules` and the pnpm store, and to run `pnpm install --frozen-lockfile` inside the container before starting Vite. ### Outcome diff --git a/foundry/research/friction/rivet.mdx b/foundry/research/friction/rivet.mdx index a2e4649..c9cb8eb 100644 --- a/foundry/research/friction/rivet.mdx +++ b/foundry/research/friction/rivet.mdx @@ -12,7 +12,7 @@ Resolving GitHub OAuth callback failures caused by stale actor state after squas 2. **No programmatic way to list or destroy actors on Rivet Cloud without the service key.** The public runner token (`pk_*`) lacks permissions for actor management (list/destroy). The Cloud API token (`cloud_api_*`) in our `.env` was returning "token not found". The actual working token format is the service key (`sk_*`) from the namespace connection URL. This was not documented — the destroy docs reference "admin tokens" which are described as "currently not supported on Rivet Cloud" ([#3530](https://github.com/rivet-dev/rivet/issues/3530)), but the `sk_*` token works. The disconnect between the docs and reality cost significant debugging time. -3. **Actor errors during `getOrCreate` are opaque.** When the `organization.completeAppGithubAuth` action triggered `getOrCreate` for org organization actors, the migration failure inside the newly-woken actor was surfaced as `"Internal error"` with no indication that it was a migration/schema issue. The actual error (`table already exists`) was only visible in actor-level logs, not in the action response or the calling backend's logs. +3. 
**Actor errors during `getOrCreate` are opaque.** When the `workspace.completeAppGithubAuth` action triggered `getOrCreate` for org workspace actors, the migration failure inside the newly-woken actor was surfaced as `"Internal error"` with no indication that it was a migration/schema issue. The actual error (`table already exists`) was only visible in actor-level logs, not in the action response or the calling backend's logs. ### Attempted Fix / Workaround @@ -22,7 +22,7 @@ Resolving GitHub OAuth callback failures caused by stale actor state after squas ### Outcome -- All 4 stale organization actors destroyed (3 org organizations + 1 old v2-prefixed app organization). +- All 4 stale workspace actors destroyed (3 org workspaces + 1 old v2-prefixed app workspace). - Reverted `IF NOT EXISTS` migration changes so Drizzle migrations remain standard. - After redeploy, new actors will be created fresh with the correct squashed migration journal. - **RivetKit improvement opportunities:** @@ -112,17 +112,17 @@ Diagnosing stuck tasks (`init_create_sandbox`) after switching to a linked Rivet ### Friction / Issue 1. File-system driver actor-state writes still attempted to serialize legacy `kvStorage`, which can exceed Bare's buffer limit and trigger `Failed to save actor state: BareError: (byte:0) too large buffer`. -2. Repository snapshots swallowed missing task actors and only logged warnings, so stale `task_index` rows persisted and appeared as stuck/ghost tasks in the UI. +2. Project snapshots swallowed missing task actors and only logged warnings, so stale `task_index` rows persisted and appeared as stuck/ghost tasks in the UI. ### Attempted Fix / Workaround 1. In RivetKit file-system driver writes, force persisted `kvStorage` to `[]` (runtime KV is SQLite-only) so oversized legacy payloads are never re-serialized. -2. 
In backend repository actor flows (`hydrate`, `snapshot`, `repo overview`, branch registration, PR-close archive), detect `Actor not found` and prune stale `task_index` rows immediately. +2. In backend project actor flows (`hydrate`, `snapshot`, `repo overview`, branch registration, PR-close archive), detect `Actor not found` and prune stale `task_index` rows immediately. ### Outcome - Prevents repeated serialization crashes caused by legacy oversized state blobs. -- Missing task actors are now self-healed from repository indexes instead of repeatedly surfacing as silent warnings. +- Missing task actors are now self-healed from project indexes instead of repeatedly surfacing as silent warnings. ## 2026-02-12 - uncommitted @@ -193,7 +193,7 @@ Adopt these concrete repo conventions: - Schema rule (critical): - SQLite is **per actor instance**, not a shared DB across all instances. -- Do not “namespace” rows with `organizationId`/`repoId`/`taskId` columns when those identifiers already live in the actor key/state. +- Do not “namespace” rows with `workspaceId`/`repoId`/`taskId` columns when those identifiers already live in the actor key/state. - Prefer single-row tables for single-instance storage (e.g. `id=1`) when appropriate. - Migration generation flow (Bun + DrizzleKit): @@ -247,7 +247,7 @@ Verifying Daytona-backed task/session flows for the new frontend and sandbox-ins ### Friction / Issue -Task workflow steps intermittently entered failed state with `StepExhaustedError` and `unknown error` during initialization replay (`init-start-sandbox-instance`, then `init-write-db`), which caused `task.get` to time out and cascaded into `repository snapshot timed out` / `organization list_tasks timed out`. 
+Task workflow steps intermittently entered failed state with `StepExhaustedError` and `unknown error` during initialization replay (`init-start-sandbox-instance`, then `init-write-db`), which caused `task.get` to time out and cascaded into `project snapshot timed out` / `workspace list_tasks timed out`. ### Attempted Fix / Workaround @@ -305,7 +305,7 @@ if (msg.type === "TickProjectRefresh") { // Coalesce duplicate ticks for a short window. while (Date.now() < deadline) { - const next = await c.queue.next("repository", { timeout: deadline - Date.now() }); + const next = await c.queue.next("project", { timeout: deadline - Date.now() }); if (!next) break; // timeout if (next.type === "TickProjectRefresh") { @@ -348,7 +348,7 @@ Two mistakes in the prior proposal: 2. **Coalesce by message names, not `msg.type`.** - Keep one message name per command/tick channel. -- When a tick window opens, drain and coalesce multiple tick names (e.g. `tick.repository.refresh`, `tick.pr.refresh`, `tick.sandbox.health`) into one execution per name. +- When a tick window opens, drain and coalesce multiple tick names (e.g. `tick.project.refresh`, `tick.pr.refresh`, `tick.sandbox.health`) into one execution per name. 3. **Tick coalesce pattern with timeout (single loop):** @@ -375,7 +375,7 @@ while (true) { // Timeout reached => one or more ticks are due. const due = new Set(); const at = Date.now(); - if (at >= nextProjectRefreshAt) due.add("tick.repository.refresh"); + if (at >= nextProjectRefreshAt) due.add("tick.project.refresh"); if (at >= nextPrRefreshAt) due.add("tick.pr.refresh"); if (at >= nextSandboxHealthAt) due.add("tick.sandbox.health"); @@ -388,7 +388,7 @@ while (true) { } // Execute each due tick once, in deterministic order. 
- if (due.has("tick.repository.refresh")) { + if (due.has("tick.project.refresh")) { await refreshProjectSnapshot(); nextProjectRefreshAt = Date.now() + 5_000; } @@ -424,7 +424,7 @@ Even with queue-timeout ticks, packing multiple independent timer cadences into ### Final Pattern 1. **Parent actors are command-only loops with no timeout.** -- `OrganizationActor`, `RepositoryActor`, `TaskActor`, and `HistoryActor` wait on queue messages only. +- `WorkspaceActor`, `ProjectActor`, `TaskActor`, and `HistoryActor` wait on queue messages only. 2. **Periodic work moves to dedicated child sync actors.** - Each child actor has exactly one timeout cadence (e.g. PR sync, branch sync, task status sync). @@ -439,7 +439,7 @@ Even with queue-timeout ticks, packing multiple independent timer cadences into ### Example Structure -- `RepositoryActor` (no timeout): handles commands + applies `repository.pr_sync.result` / `repository.branch_sync.result` writes. +- `ProjectActor` (no timeout): handles commands + applies `project.pr_sync.result` / `project.branch_sync.result` writes. - `ProjectPrSyncActor` (timeout 30s): polls PR data, sends result message. - `ProjectBranchSyncActor` (timeout 5s): polls branch data, sends result message. - `TaskActor` (no timeout): handles lifecycle + applies `task.status_sync.result` writes. @@ -502,7 +502,7 @@ Removing custom backend REST endpoints and migrating CLI/TUI calls to direct `ri ### Friction / Issue -We had implemented a `/v1/*` HTTP shim (`/v1/tasks`, `/v1/organizations/use`, etc.) between clients and actors, which duplicated actor APIs and introduced an unnecessary transport layer. +We had implemented a `/v1/*` HTTP shim (`/v1/tasks`, `/v1/workspaces/use`, etc.) between clients and actors, which duplicated actor APIs and introduced an unnecessary transport layer. ### Attempted Fix / Workaround @@ -575,21 +575,21 @@ Removing `*Actor` suffix from all actor export names and registry keys. ### Friction / Issue -RivetKit's `setup({ use: { ... 
} })` uses property names as actor identifiers in `client.` calls. All 8 actors were exported as `organization`, `repository`, `taskActor`, etc., which meant client code used verbose `client.organization.getOrCreate(...)` instead of `client.organization.getOrCreate(...)`. +RivetKit's `setup({ use: { ... } })` uses property names as actor identifiers in `client.` calls. All 8 actors were exported as `workspaceActor`, `projectActor`, `taskActor`, etc., which meant client code used verbose `client.workspaceActor.getOrCreate(...)` instead of `client.workspace.getOrCreate(...)`. The `Actor` suffix is redundant — everything in the registry is an actor by definition. It also leaked into type names (`WorkspaceActorHandle`, `ProjectActorInput`, `HistoryActorInput`) and local function names (`workspaceActorKey`, `taskActorKey`). ### Attempted Fix / Workaround -1. Renamed all 8 actor exports: `organization` → `organization`, `repository` → `repository`, `taskActor` → `task`, `sandboxInstanceActor` → `sandboxInstance`, `historyActor` → `history`, `repositoryPrSync` → `repositoryPrSync`, `repositoryBranchSync` → `repositoryBranchSync`, `taskStatusSyncActor` → `taskStatusSync`. +1. Renamed all 8 actor exports: `workspaceActor` → `workspace`, `projectActor` → `project`, `taskActor` → `task`, `sandboxInstanceActor` → `sandboxInstance`, `historyActor` → `history`, `projectPrSyncActor` → `projectPrSync`, `projectBranchSyncActor` → `projectBranchSync`, `taskStatusSyncActor` → `taskStatusSync`. 2. Updated registry keys in `actors/index.ts`. 3. Renamed all `client.Actor` references across 14 files (actor definitions, backend entry, CLI client, tests). -4. Renamed associated types (`ProjectActorInput` → `RepositoryInput`, `HistoryActorInput` → `HistoryInput`, `WorkspaceActorHandle` → `OrganizationHandle`, `TaskActorHandle` → `TaskHandle`). +4. 
Renamed associated types (`ProjectActorInput` → `ProjectInput`, `HistoryActorInput` → `HistoryInput`, `WorkspaceActorHandle` → `WorkspaceHandle`, `TaskActorHandle` → `TaskHandle`). ### Outcome - Actor names are now concise and match their semantic role. -- Client code reads naturally: `client.organization.getOrCreate(...)`, `client.task.get(...)`. +- Client code reads naturally: `client.workspace.getOrCreate(...)`, `client.task.get(...)`. - No runtime behavior change — registry property names drive actor routing. ## 2026-02-09 - uncommitted @@ -609,8 +609,8 @@ Concrete examples from our codebase: | Actor | Pattern | Why | |-------|---------|-----| -| `organization` | Plain run | Every handler is a DB query or single actor delegation | -| `repository` | Plain run | Handlers are DB upserts or delegate to task actor | +| `workspace` | Plain run | Every handler is a DB query or single actor delegation | +| `project` | Plain run | Handlers are DB upserts or delegate to task actor | | `task` | **Needs workflow** | `initialize` is a 7-step pipeline (createSandbox → ensureAgent → createSession → DB writes → start child actors); post-idle is a 5-step pipeline (commit → push → PR → cache → notify) | | `history` | Plain run | Single DB insert per message | | `sandboxInstance` | Plain run | Single-table CRUD per message | @@ -647,7 +647,7 @@ This matters when reasoning about workflow `listen()` behavior: you might assume RivetKit docs should clarify: 1. Queue names are **per-actor-instance** — two different actor instances can use the same queue name without collision. -2. The dotted naming convention (e.g. `repository.command.ensure`) is a user convention for readability, not a routing hierarchy. +2. The dotted naming convention (e.g. `project.command.ensure`) is a user convention for readability, not a routing hierarchy. 3. `c.queue.next(["a", "b"])` listens on queues named `"a"` and `"b"` *within this actor*, not across actors. 
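The three clarifications above can be illustrated with a small stand-alone model — this is illustrative only, and `ActorInstanceQueues` is a hypothetical stand-in, not RivetKit's implementation:

```typescript
// Illustrative model of the documented semantics: queue names are scoped per
// actor instance, and dotted names are flat strings with no routing hierarchy.

type Message = { type: string; payload?: unknown };

class ActorInstanceQueues {
  // Flat map: queue name -> pending messages. The dots in
  // "project.command.ensure" are just characters in the key.
  private queues = new Map<string, Message[]>();

  send(queueName: string, msg: Message): void {
    const q = this.queues.get(queueName) ?? [];
    q.push(msg);
    this.queues.set(queueName, q);
  }

  // Like c.queue.next(["a", "b"]): returns the first pending message from any
  // of the named queues *on this instance only*.
  next(queueNames: string[]): Message | undefined {
    for (const name of queueNames) {
      const q = this.queues.get(name);
      if (q && q.length > 0) return q.shift();
    }
    return undefined;
  }
}

// Two different actor instances can use the same queue name without collision.
const projectA = new ActorInstanceQueues();
const projectB = new ActorInstanceQueues();

projectA.send("project.command.ensure", { type: "ensure", payload: "repo-a" });

console.log(projectA.next(["project.command.ensure"])?.payload); // "repo-a"
console.log(projectB.next(["project.command.ensure"])); // undefined — separate instance
```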
### Outcome @@ -662,7 +662,7 @@ Migrating task actor to durable workflows. AI-generated queue names used dotted ### Friction / Issue -When generating actor queue names, the AI (and our own codebase) defaulted to dotted names like `task.command.initialize`, `repository.pr_sync.result`, `task.status_sync.control.start`. These work fine in plain `run` loops, but create friction when interacting with the workflow system because `workflowQueueName()` prefixes them with `__workflow:`, producing names like `__workflow:task.command.initialize`. +When generating actor queue names, the AI (and our own codebase) defaulted to dotted names like `task.command.initialize`, `project.pr_sync.result`, `task.status_sync.control.start`. These work fine in plain `run` loops, but create friction when interacting with the workflow system because `workflowQueueName()` prefixes them with `__workflow:`, producing names like `__workflow:task.command.initialize`. Queue names should always be **camelCase** (e.g. `initializeTask`, `statusSyncResult`, `attachTask`). Dotted names are misleading — they imply hierarchy or routing semantics that don't exist (queues are flat, per-actor-instance strings). They also look like object property paths, which causes confusion when used as dynamic property keys on queue handles (`actor.queue["task.command.initialize"]`). @@ -754,4 +754,4 @@ Using `better-sqlite3` and `node:sqlite` in backend DB bootstrap caused Bun runt - Backend starts successfully under Bun. - Shared Drizzle/SQLite actor DB path still works. -- Organization build + tests pass. +- Workspace build + tests pass. 
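The log above doesn't show the final bootstrap code. One way to express the outcome is runtime driver selection — `pickSqliteDriver` is a hypothetical helper invented for this sketch; `bun:sqlite` is Bun's built-in SQLite module and `process.versions.bun` is how Bun identifies itself:

```typescript
// Hypothetical sketch — not the actual fix from the log. Picks the SQLite
// driver module by runtime instead of hard-coding better-sqlite3 / node:sqlite,
// both of which caused failures under Bun.

/** Returns the module specifier to load for the current runtime. */
function pickSqliteDriver(versions: Record<string, string | undefined>): string {
  // Bun exposes process.versions.bun; prefer its built-in driver there.
  return versions.bun !== undefined ? "bun:sqlite" : "better-sqlite3";
}

console.log(pickSqliteDriver({ bun: "1.1.0" })); // "bun:sqlite"
console.log(pickSqliteDriver({ node: "22.0.0" })); // "better-sqlite3"
```

In practice the chosen specifier would be passed to a dynamic `import()` so the non-matching driver is never loaded.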
diff --git a/foundry/research/friction/sandbox-agent.mdx b/foundry/research/friction/sandbox-agent.mdx index 5fb9ba8..aa4a29d 100644 --- a/foundry/research/friction/sandbox-agent.mdx +++ b/foundry/research/friction/sandbox-agent.mdx @@ -55,7 +55,7 @@ Upgrading backend integration from legacy sandbox-agent session endpoints to `sa ### Friction / Issue -`0.2.0` no longer exposes the legacy session REST endpoints used by the backend integration; direct session create/status polling via those paths returns `404`. +`0.2.0` no longer exposes `/v1/sessions` endpoints used by the backend integration; direct session create/status polling via legacy REST paths returns `404`. ### Attempted Fix / Workaround @@ -65,5 +65,5 @@ Upgrading backend integration from legacy sandbox-agent session endpoints to `sa ### Outcome -- Backend no longer depends on removed legacy session REST endpoints. +- Backend no longer depends on removed `/v1/sessions` endpoints. - Daytona flow is aligned with `sandbox-agent 0.2.0` runtime and SDK usage. diff --git a/foundry/research/friction/sandboxes.mdx b/foundry/research/friction/sandboxes.mdx index 38d4b3f..e30e85b 100644 --- a/foundry/research/friction/sandboxes.mdx +++ b/foundry/research/friction/sandboxes.mdx @@ -8,7 +8,7 @@ Implementing provider adapters (`worktree`, `daytona`) under the backend package ### Friction / Issue -Provider interface intentionally keeps `DestroySandboxRequest` minimal (`organizationId`, `sandboxId`), but local git worktree cleanup may need repo context. +Provider interface intentionally keeps `DestroySandboxRequest` minimal (`workspaceId`, `sandboxId`), but local git worktree cleanup may need repo context. ### Attempted Fix / Workaround @@ -54,8 +54,8 @@ The previous end-to-end flow implicitly depended on local filesystem paths (`rep ### Attempted Fix / Workaround -1. Introduced explicit imported repository records sourced from GitHub sync instead of local organization paths. -2. 
Made `RepositoryActor` assert a backend-owned local clone exists on wake and fetch remote branch state from that clone. +1. Introduced explicit repo remote records (`WorkspaceActor.addRepo`) and validated remotes with `git ls-remote`. +2. Made `ProjectActor` assert a backend-owned local clone exists on wake and fetch remote branch state from that clone. 3. Updated PR creation to avoid requiring a checked-out branch by using `gh pr create --head `. 4. Updated `DaytonaProvider.createSandbox` to clone the repo and checkout the branch into a deterministic workdir and return it as `cwd` for sandbox-agent sessions. diff --git a/foundry/research/realtime-interest-manager-spec.md b/foundry/research/realtime-interest-manager-spec.md index dff2aea..9c0fc93 100644 --- a/foundry/research/realtime-interest-manager-spec.md +++ b/foundry/research/realtime-interest-manager-spec.md @@ -4,7 +4,7 @@ Replace the current polling + empty-notification + full-refetch architecture with a push-based realtime system. The client subscribes to topics, receives the initial state, and then receives full replacement payloads for changed entities over WebSocket. No polling. No re-fetching. -This spec covers three layers: backend (materialized state + broadcast), client library (subscription manager), and frontend (hook consumption). Comment architecture-related code throughout so new contributors can understand the data flow from comments alone. +This spec covers three layers: backend (materialized state + broadcast), client library (interest manager), and frontend (hook consumption). Comment architecture-related code throughout so new contributors can understand the data flow from comments alone. --- @@ -17,7 +17,7 @@ This spec covers three layers: backend (materialized state + broadcast), client Currently `WorkbenchTask` is a single flat type carrying everything (sidebar fields + transcripts + diffs + file tree). Split it: ```typescript -/** Sidebar-level task data. 
Materialized in the organization actor's SQLite. */ +/** Sidebar-level task data. Materialized in the workspace actor's SQLite. */ export interface WorkbenchTaskSummary { id: string; repoId: string; @@ -44,7 +44,7 @@ export interface WorkbenchSessionSummary { created: boolean; } -/** Repo-level summary for organization sidebar. */ +/** Repo-level summary for workspace sidebar. */ export interface WorkbenchRepoSummary { id: string; label: string; @@ -93,9 +93,9 @@ export interface WorkbenchSessionDetail { transcript: WorkbenchTranscriptEvent[]; } -/** Organization-level snapshot — initial fetch for the organization topic. */ -export interface OrganizationSummarySnapshot { - organizationId: string; +/** Workspace-level snapshot — initial fetch for the workspace topic. */ +export interface WorkspaceSummarySnapshot { + workspaceId: string; repos: WorkbenchRepoSummary[]; taskSummaries: WorkbenchTaskSummary[]; } @@ -110,8 +110,8 @@ Remove the old `TaskWorkbenchSnapshot` type and `WorkbenchTask` type once migrat Each event carries the full new state of the changed entity — not a patch, not an empty notification. ```typescript -/** Organization-level events broadcast by the organization actor. */ -export type OrganizationEvent = +/** Workspace-level events broadcast by the workspace actor. */ +export type WorkspaceEvent = | { type: "taskSummaryUpdated"; taskSummary: WorkbenchTaskSummary } | { type: "taskRemoved"; taskId: string } | { type: "repoAdded"; repo: WorkbenchRepoSummary } @@ -126,7 +126,7 @@ export type TaskEvent = export type SessionEvent = | { type: "sessionUpdated"; session: WorkbenchSessionDetail }; -/** App-level events broadcast by the app organization actor. */ +/** App-level events broadcast by the app workspace actor. */ export type AppEvent = | { type: "appUpdated"; snapshot: FoundryAppSnapshot }; @@ -139,13 +139,13 @@ export type SandboxProcessesEvent = ## 2. 
Backend: Materialized State + Broadcasts -### 2.1 Organization actor — materialized sidebar state +### 2.1 Workspace actor — materialized sidebar state **Files:** -- `packages/backend/src/actors/organization/db/schema.ts` — add tables -- `packages/backend/src/actors/organization/actions.ts` — replace `buildWorkbenchSnapshot`, add delta handlers +- `packages/backend/src/actors/workspace/db/schema.ts` — add tables +- `packages/backend/src/actors/workspace/actions.ts` — replace `buildWorkbenchSnapshot`, add delta handlers -Add to organization actor SQLite schema: +Add to workspace actor SQLite schema: ```typescript export const taskSummaries = sqliteTable("task_summaries", { @@ -161,7 +161,7 @@ export const taskSummaries = sqliteTable("task_summaries", { }); ``` -New organization actions: +New workspace actions: ```typescript /** @@ -176,23 +176,23 @@ async applyTaskSummaryUpdate(c, input: { taskSummary: WorkbenchTaskSummary }) { await c.db.insert(taskSummaries).values(toRow(input.taskSummary)) .onConflictDoUpdate({ target: taskSummaries.taskId, set: toRow(input.taskSummary) }).run(); // Broadcast to connected clients - c.broadcast("organizationUpdated", { type: "taskSummaryUpdated", taskSummary: input.taskSummary }); + c.broadcast("workspaceUpdated", { type: "taskSummaryUpdated", taskSummary: input.taskSummary }); } async removeTaskSummary(c, input: { taskId: string }) { await c.db.delete(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).run(); - c.broadcast("organizationUpdated", { type: "taskRemoved", taskId: input.taskId }); + c.broadcast("workspaceUpdated", { type: "taskRemoved", taskId: input.taskId }); } /** - * Initial fetch for the organization topic. + * Initial fetch for the workspace topic. * Reads entirely from local SQLite — no fan-out to child actors. 
*/ -async getWorkspaceSummary(c, input: { organizationId: string }): Promise { +async getWorkspaceSummary(c, input: { workspaceId: string }): Promise { const repoRows = await c.db.select().from(repos).orderBy(desc(repos.updatedAt)).all(); const taskRows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); return { - organizationId: c.state.organizationId, + workspaceId: c.state.workspaceId, repos: repoRows.map(toRepoSummary), taskSummaries: taskRows.map(toTaskSummary), }; @@ -201,7 +201,7 @@ async getWorkspaceSummary(c, input: { organizationId: string }): Promise { ... } async getSessionDetail(c, input: { sessionId: string }): Promise { ... } ``` -### 2.4 App organization actor +### 2.4 App workspace actor -**File:** `packages/backend/src/actors/organization/app-shell.ts` +**File:** `packages/backend/src/actors/workspace/app-shell.ts` Change `c.broadcast("appUpdated", { at: Date.now(), sessionId })` to: ```typescript @@ -304,12 +304,12 @@ function broadcastProcessesUpdated(c: any): void { ```typescript /** - * Topic definitions for the subscription manager. + * Topic definitions for the interest manager. * * Each topic defines how to connect to an actor, fetch initial state, * which event to listen for, and how to apply incoming events to cached state. * - * The subscription manager uses these definitions to manage WebSocket connections, + * The interest manager uses these definitions to manage WebSocket connections, * cached state, and subscriptions for all realtime data flows. 
*/ @@ -331,10 +331,10 @@ export interface TopicDefinition { } export interface AppTopicParams {} -export interface OrganizationTopicParams { organizationId: string } -export interface TaskTopicParams { organizationId: string; repoId: string; taskId: string } -export interface SessionTopicParams { organizationId: string; repoId: string; taskId: string; sessionId: string } -export interface SandboxProcessesTopicParams { organizationId: string; providerId: string; sandboxId: string } +export interface WorkspaceTopicParams { workspaceId: string } +export interface TaskTopicParams { workspaceId: string; repoId: string; taskId: string } +export interface SessionTopicParams { workspaceId: string; repoId: string; taskId: string; sessionId: string } +export interface SandboxProcessesTopicParams { workspaceId: string; providerId: string; sandboxId: string } export const topicDefinitions = { app: { @@ -345,12 +345,12 @@ export const topicDefinitions = { applyEvent: (_current, event: AppEvent) => event.snapshot, } satisfies TopicDefinition, - organization: { - key: (p) => `organization:${p.organizationId}`, - event: "organizationUpdated", - connect: (b, p) => b.connectWorkspace(p.organizationId), - fetchInitial: (b, p) => b.getWorkspaceSummary(p.organizationId), - applyEvent: (current, event: OrganizationEvent) => { + workspace: { + key: (p) => `workspace:${p.workspaceId}`, + event: "workspaceUpdated", + connect: (b, p) => b.connectWorkspace(p.workspaceId), + fetchInitial: (b, p) => b.getWorkspaceSummary(p.workspaceId), + applyEvent: (current, event: WorkspaceEvent) => { switch (event.type) { case "taskSummaryUpdated": return { @@ -375,22 +375,22 @@ export const topicDefinitions = { }; } }, - } satisfies TopicDefinition, + } satisfies TopicDefinition, task: { - key: (p) => `task:${p.organizationId}:${p.taskId}`, + key: (p) => `task:${p.workspaceId}:${p.taskId}`, event: "taskUpdated", - connect: (b, p) => b.connectTask(p.organizationId, p.repoId, p.taskId), - fetchInitial: (b, 
p) => b.getTaskDetail(p.organizationId, p.repoId, p.taskId), + connect: (b, p) => b.connectTask(p.workspaceId, p.repoId, p.taskId), + fetchInitial: (b, p) => b.getTaskDetail(p.workspaceId, p.repoId, p.taskId), applyEvent: (_current, event: TaskEvent) => event.detail, } satisfies TopicDefinition, session: { - key: (p) => `session:${p.organizationId}:${p.taskId}:${p.sessionId}`, + key: (p) => `session:${p.workspaceId}:${p.taskId}:${p.sessionId}`, event: "sessionUpdated", // Reuses the task actor connection — same actor, different event. - connect: (b, p) => b.connectTask(p.organizationId, p.repoId, p.taskId), - fetchInitial: (b, p) => b.getSessionDetail(p.organizationId, p.repoId, p.taskId, p.sessionId), + connect: (b, p) => b.connectTask(p.workspaceId, p.repoId, p.taskId), + fetchInitial: (b, p) => b.getSessionDetail(p.workspaceId, p.repoId, p.taskId, p.sessionId), applyEvent: (current, event: SessionEvent) => { // Filter: only apply if this event is for our session if (event.session.sessionId !== current.sessionId) return current; @@ -399,10 +399,10 @@ export const topicDefinitions = { } satisfies TopicDefinition, sandboxProcesses: { - key: (p) => `sandbox:${p.organizationId}:${p.sandboxId}`, + key: (p) => `sandbox:${p.workspaceId}:${p.sandboxId}`, event: "processesUpdated", - connect: (b, p) => b.connectSandbox(p.organizationId, p.providerId, p.sandboxId), - fetchInitial: (b, p) => b.listSandboxProcesses(p.organizationId, p.providerId, p.sandboxId), + connect: (b, p) => b.connectSandbox(p.workspaceId, p.providerId, p.sandboxId), + fetchInitial: (b, p) => b.listSandboxProcesses(p.workspaceId, p.providerId, p.sandboxId), applyEvent: (_current, event: SandboxProcessesEvent) => event.processes, } satisfies TopicDefinition, } as const; @@ -413,16 +413,16 @@ export type TopicParams = Parameters<(typeof topicDefinition export type TopicData = Awaited>; ``` -### 3.2 Subscription manager interface +### 3.2 Interest manager interface **File:** 
`packages/client/src/interest/manager.ts` (new) ```typescript /** - * The SubscriptionManager owns all realtime actor connections and cached state. + * The InterestManager owns all realtime actor connections and cached state. * * Architecture: - * - Each topic (app, organization, task, session, sandboxProcesses) maps to an actor + event. + * - Each topic (app, workspace, task, session, sandboxProcesses) maps to an actor + event. * - On first subscription, the manager opens a WebSocket connection, fetches initial state, * and listens for events. Events carry full replacement payloads for the changed entity. * - Multiple subscribers to the same topic share one connection and one cached state. @@ -430,7 +430,7 @@ export type TopicData = Awaited { const GRACE_PERIOD_MS = 30_000; /** - * Remote implementation of SubscriptionManager. + * Remote implementation of InterestManager. * Manages WebSocket connections to RivetKit actors via BackendClient. */ -export class RemoteSubscriptionManager implements SubscriptionManager { +export class RemoteInterestManager implements InterestManager { private entries = new Map>(); constructor(private backend: BackendClient) {} @@ -634,7 +634,7 @@ class TopicEntry { **File:** `packages/client/src/interest/mock-manager.ts` (new) -Same `SubscriptionManager` interface. Uses in-memory state. Topic definitions provide mock data. Mutations call `applyEvent` directly on the entry to simulate broadcasts. No WebSocket connections. +Same `InterestManager` interface. Uses in-memory state. Topic definitions provide mock data. Mutations call `applyEvent` directly on the entry to simulate broadcasts. No WebSocket connections. ### 3.5 React hook @@ -651,17 +651,17 @@ import { useSyncExternalStore, useMemo } from "react"; * - Multiple components subscribing to the same topic share one connection. 
* * @example - * // Subscribe to organization sidebar data - * const organization = useSubscription("organization", { organizationId }); + * // Subscribe to workspace sidebar data + * const workspace = useInterest("workspace", { workspaceId }); * * // Subscribe to task detail (only when viewing a task) - * const task = useSubscription("task", selectedTaskId ? { organizationId, repoId, taskId } : null); + * const task = useInterest("task", selectedTaskId ? { workspaceId, repoId, taskId } : null); * * // Subscribe to active session content - * const session = useSubscription("session", activeSessionId ? { organizationId, repoId, taskId, sessionId } : null); + * const session = useInterest("session", activeSessionId ? { workspaceId, repoId, taskId, sessionId } : null); */ -export function useSubscription( - manager: SubscriptionManager, +export function useInterest( + manager: InterestManager, topicKey: K, params: TopicParams | null, ): TopicState { @@ -698,18 +698,18 @@ Add to the `BackendClient` interface: ```typescript // New connection methods (return WebSocket-based ActorConn) -connectWorkspace(organizationId: string): Promise; -connectTask(organizationId: string, repoId: string, taskId: string): Promise; -connectSandbox(organizationId: string, providerId: string, sandboxId: string): Promise; +connectWorkspace(workspaceId: string): Promise; +connectTask(workspaceId: string, repoId: string, taskId: string): Promise; +connectSandbox(workspaceId: string, providerId: string, sandboxId: string): Promise; // New fetch methods (read from materialized state) -getWorkspaceSummary(organizationId: string): Promise; -getTaskDetail(organizationId: string, repoId: string, taskId: string): Promise; -getSessionDetail(organizationId: string, repoId: string, taskId: string, sessionId: string): Promise; +getWorkspaceSummary(workspaceId: string): Promise; +getTaskDetail(workspaceId: string, repoId: string, taskId: string): Promise; +getSessionDetail(workspaceId: string, repoId: 
string, taskId: string, sessionId: string): Promise; ``` Remove: -- `subscribeWorkbench`, `subscribeApp`, `subscribeSandboxProcesses` (replaced by subscription manager) +- `subscribeWorkbench`, `subscribeApp`, `subscribeSandboxProcesses` (replaced by interest manager) - `getWorkbench` (replaced by `getWorkspaceSummary` + `getTaskDetail`) --- @@ -721,16 +721,16 @@ Remove: **File:** `packages/frontend/src/lib/interest.ts` (new) ```typescript -import { RemoteSubscriptionManager } from "@sandbox-agent/foundry-client"; +import { RemoteInterestManager } from "@sandbox-agent/foundry-client"; import { backendClient } from "./backend"; -export const subscriptionManager = new RemoteSubscriptionManager(backendClient); +export const interestManager = new RemoteInterestManager(backendClient); ``` Or for mock mode: ```typescript -import { MockSubscriptionManager } from "@sandbox-agent/foundry-client"; -export const subscriptionManager = new MockSubscriptionManager(); +import { MockInterestManager } from "@sandbox-agent/foundry-client"; +export const interestManager = new MockInterestManager(); ``` ### 4.2 Replace MockLayout workbench subscription @@ -739,7 +739,7 @@ export const subscriptionManager = new MockSubscriptionManager(); Before: ```typescript -const taskWorkbenchClient = useMemo(() => getTaskWorkbenchClient(organizationId), [organizationId]); +const taskWorkbenchClient = useMemo(() => getTaskWorkbenchClient(workspaceId), [workspaceId]); const viewModel = useSyncExternalStore( taskWorkbenchClient.subscribe.bind(taskWorkbenchClient), taskWorkbenchClient.getSnapshot.bind(taskWorkbenchClient), @@ -749,9 +749,9 @@ const tasks = viewModel.tasks ?? []; After: ```typescript -const organization = useSubscription(subscriptionManager, "organization", { organizationId }); -const taskSummaries = organization.data?.taskSummaries ?? []; -const repos = organization.data?.repos ?? 
[]; +const workspace = useInterest(interestManager, "workspace", { workspaceId }); +const taskSummaries = workspace.data?.taskSummaries ?? []; +const repos = workspace.data?.repos ?? []; ``` ### 4.3 Replace MockLayout task detail @@ -759,8 +759,8 @@ const repos = organization.data?.repos ?? []; When a task is selected, subscribe to its detail: ```typescript -const taskDetail = useSubscription(subscriptionManager, "task", - selectedTaskId ? { organizationId, repoId: activeRepoId, taskId: selectedTaskId } : null +const taskDetail = useInterest(interestManager, "task", + selectedTaskId ? { workspaceId, repoId: activeRepoId, taskId: selectedTaskId } : null ); ``` @@ -769,25 +769,25 @@ const taskDetail = useSubscription(subscriptionManager, "task", When a session tab is active: ```typescript -const sessionDetail = useSubscription(subscriptionManager, "session", - activeSessionId ? { organizationId, repoId, taskId, sessionId: activeSessionId } : null +const sessionDetail = useInterest(interestManager, "session", + activeSessionId ? 
{ workspaceId, repoId, taskId, sessionId: activeSessionId } : null ); ``` -### 4.5 Replace organization-dashboard.tsx polling +### 4.5 Replace workspace-dashboard.tsx polling Remove ALL `useQuery` with `refetchInterval` in this file: -- `tasksQuery` (2.5s polling) → `useSubscription("organization", ...)` -- `taskDetailQuery` (2.5s polling) → `useSubscription("task", ...)` -- `reposQuery` (10s polling) → `useSubscription("organization", ...)` -- `repoOverviewQuery` (5s polling) → `useSubscription("organization", ...)` -- `sessionsQuery` (3s polling) → `useSubscription("task", ...)` (sessionsSummary field) -- `eventsQuery` (2.5s polling) → `useSubscription("session", ...)` +- `tasksQuery` (2.5s polling) → `useInterest("workspace", ...)` +- `taskDetailQuery` (2.5s polling) → `useInterest("task", ...)` +- `reposQuery` (10s polling) → `useInterest("workspace", ...)` +- `repoOverviewQuery` (5s polling) → `useInterest("workspace", ...)` +- `sessionsQuery` (3s polling) → `useInterest("task", ...)` (sessionsSummary field) +- `eventsQuery` (2.5s polling) → `useInterest("session", ...)` ### 4.6 Replace terminal-pane.tsx polling -- `taskQuery` (2s polling) → `useSubscription("task", ...)` -- `processesQuery` (3s polling) → `useSubscription("sandboxProcesses", ...)` +- `taskQuery` (2s polling) → `useInterest("task", ...)` +- `processesQuery` (3s polling) → `useInterest("sandboxProcesses", ...)` - Remove `subscribeSandboxProcesses` useEffect ### 4.7 Replace app client subscription @@ -804,14 +804,14 @@ export function useMockAppSnapshot(): FoundryAppSnapshot { After: ```typescript export function useAppSnapshot(): FoundryAppSnapshot { - const app = useSubscription(subscriptionManager, "app", {}); + const app = useInterest(interestManager, "app", {}); return app.data ?? DEFAULT_APP_SNAPSHOT; } ``` ### 4.8 Mutations -Mutations (`createTask`, `renameTask`, `sendMessage`, etc.) no longer need manual `refetch()` or `refresh()` calls after completion. 
The backend mutation triggers a broadcast, which the subscription manager receives and applies automatically. +Mutations (`createTask`, `renameTask`, `sendMessage`, etc.) no longer need manual `refetch()` or `refresh()` calls after completion. The backend mutation triggers a broadcast, which the interest manager receives and applies automatically. Before: ```typescript @@ -841,24 +841,24 @@ const createSession = useMutation({ | File/Code | Reason | |---|---| -| `packages/client/src/remote/workbench-client.ts` | Replaced by subscription manager `organization` + `task` topics | -| `packages/client/src/remote/app-client.ts` | Replaced by subscription manager `app` topic | +| `packages/client/src/remote/workbench-client.ts` | Replaced by interest manager `workspace` + `task` topics | +| `packages/client/src/remote/app-client.ts` | Replaced by interest manager `app` topic | | `packages/client/src/workbench-client.ts` | Factory for above — no longer needed | | `packages/client/src/app-client.ts` | Factory for above — no longer needed | -| `packages/frontend/src/lib/workbench.ts` | Workbench client singleton — replaced by subscription manager | -| `subscribeWorkbench` in `backend-client.ts` | Replaced by `connectWorkspace` + subscription manager | -| `subscribeSandboxProcesses` in `backend-client.ts` | Replaced by `connectSandbox` + subscription manager | -| `subscribeApp` in `backend-client.ts` | Replaced by `connectWorkspace("app")` + subscription manager | -| `buildWorkbenchSnapshot` in `organization/actions.ts` | Replaced by `getWorkspaceSummary` (local reads). Keep as `reconcileWorkbenchState` for recovery only. 
| -| `notifyWorkbenchUpdated` in `organization/actions.ts` | Replaced by `applyTaskSummaryUpdate` + `c.broadcast` with payload | +| `packages/frontend/src/lib/workbench.ts` | Workbench client singleton — replaced by interest manager | +| `subscribeWorkbench` in `backend-client.ts` | Replaced by `connectWorkspace` + interest manager | +| `subscribeSandboxProcesses` in `backend-client.ts` | Replaced by `connectSandbox` + interest manager | +| `subscribeApp` in `backend-client.ts` | Replaced by `connectWorkspace("app")` + interest manager | +| `buildWorkbenchSnapshot` in `workspace/actions.ts` | Replaced by `getWorkspaceSummary` (local reads). Keep as `reconcileWorkbenchState` for recovery only. | +| `notifyWorkbenchUpdated` in `workspace/actions.ts` | Replaced by `applyTaskSummaryUpdate` + `c.broadcast` with payload | | `notifyWorkbenchUpdated` in `task/workbench.ts` | Replaced by `broadcastTaskUpdate` helper | -| `TaskWorkbenchSnapshot` in `shared/workbench.ts` | Replaced by `OrganizationSummarySnapshot` + `WorkbenchTaskDetail` | +| `TaskWorkbenchSnapshot` in `shared/workbench.ts` | Replaced by `WorkspaceSummarySnapshot` + `WorkbenchTaskDetail` | | `WorkbenchTask` in `shared/workbench.ts` | Split into `WorkbenchTaskSummary` + `WorkbenchTaskDetail` | -| `getWorkbench` action on organization actor | Replaced by `getWorkspaceSummary` | -| `TaskWorkbenchClient` interface | Replaced by `SubscriptionManager` + `useSubscription` hook | -| All `useQuery` with `refetchInterval` in `organization-dashboard.tsx` | Replaced by `useSubscription` | -| All `useQuery` with `refetchInterval` in `terminal-pane.tsx` | Replaced by `useSubscription` | -| Mock workbench client (`packages/client/src/mock/workbench-client.ts`) | Replaced by `MockSubscriptionManager` | +| `getWorkbench` action on workspace actor | Replaced by `getWorkspaceSummary` | +| `TaskWorkbenchClient` interface | Replaced by `InterestManager` + `useInterest` hook | +| All `useQuery` with `refetchInterval` in 
`workspace-dashboard.tsx` | Replaced by `useInterest` | +| All `useQuery` with `refetchInterval` in `terminal-pane.tsx` | Replaced by `useInterest` | +| Mock workbench client (`packages/client/src/mock/workbench-client.ts`) | Replaced by `MockInterestManager` | --- @@ -867,27 +867,27 @@ const createSession = useMutation({ Implement in this order to keep the system working at each step: ### Phase 1: Types and backend materialization -1. Add new types to `packages/shared` (`WorkbenchTaskSummary`, `WorkbenchTaskDetail`, `WorkbenchSessionSummary`, `WorkbenchSessionDetail`, `OrganizationSummarySnapshot`, event types). -2. Add `taskSummaries` table to organization actor schema. -3. Add `applyTaskSummaryUpdate`, `removeTaskSummary`, `getWorkspaceSummary` actions to organization actor. +1. Add new types to `packages/shared` (`WorkbenchTaskSummary`, `WorkbenchTaskDetail`, `WorkbenchSessionSummary`, `WorkbenchSessionDetail`, `WorkspaceSummarySnapshot`, event types). +2. Add `taskSummaries` table to workspace actor schema. +3. Add `applyTaskSummaryUpdate`, `removeTaskSummary`, `getWorkspaceSummary` actions to workspace actor. 4. Add `getTaskDetail`, `getSessionDetail` actions to task actor. 5. Replace all `notifyWorkbenchUpdated` call sites with `broadcastTaskUpdate` that pushes summary + broadcasts detail with payload. 6. Change app actor broadcast to include snapshot payload. 7. Change sandbox actor broadcast to include process list payload. 8. Add one-time reconciliation action to populate `taskSummaries` table from existing task actors (run on startup or on-demand). -### Phase 2: Client subscription manager -9. Add `SubscriptionManager` interface, `RemoteSubscriptionManager`, `MockSubscriptionManager` to `packages/client`. +### Phase 2: Client interest manager +9. Add `InterestManager` interface, `RemoteInterestManager`, `MockInterestManager` to `packages/client`. 10. Add topic definitions registry. -11. Add `useSubscription` hook. +11. Add `useInterest` hook. 12. 
Add `connectWorkspace`, `connectTask`, `connectSandbox`, `getWorkspaceSummary`, `getTaskDetail`, `getSessionDetail` to `BackendClient`. ### Phase 3: Frontend migration -13. Replace `useMockAppSnapshot` with `useSubscription("app", ...)`. -14. Replace `MockLayout` workbench subscription with `useSubscription("organization", ...)`. -15. Replace task detail view with `useSubscription("task", ...)` + `useSubscription("session", ...)`. -16. Replace `organization-dashboard.tsx` polling queries with `useSubscription`. -17. Replace `terminal-pane.tsx` polling queries with `useSubscription`. +13. Replace `useMockAppSnapshot` with `useInterest("app", ...)`. +14. Replace `MockLayout` workbench subscription with `useInterest("workspace", ...)`. +15. Replace task detail view with `useInterest("task", ...)` + `useInterest("session", ...)`. +16. Replace `workspace-dashboard.tsx` polling queries with `useInterest`. +17. Replace `terminal-pane.tsx` polling queries with `useInterest`. 18. Remove manual `refetch()` calls from mutations. ### Phase 4: Cleanup @@ -902,10 +902,10 @@ Implement in this order to keep the system working at each step: Add doc comments at these locations: - **Topic definitions** — explain the materialized state pattern, why events carry full entity state instead of patches, and the relationship between topics. -- **`broadcastTaskUpdate` helper** — explain the dual-broadcast pattern (push summary to organization + broadcast detail to direct subscribers). -- **`SubscriptionManager` interface** — explain the grace period, deduplication, and why mock/remote share the same interface. -- **`useSubscription` hook** — explain `useSyncExternalStore` integration, null params for conditional interest, and how params key stabilization works. -- **Organization actor `taskSummaries` table** — explain this is a materialized read projection maintained by task actor pushes, not a source of truth. 
+- **`broadcastTaskUpdate` helper** — explain the dual-broadcast pattern (push summary to workspace + broadcast detail to direct subscribers). +- **`InterestManager` interface** — explain the grace period, deduplication, and why mock/remote share the same interface. +- **`useInterest` hook** — explain `useSyncExternalStore` integration, null params for conditional interest, and how params key stabilization works. +- **Workspace actor `taskSummaries` table** — explain this is a materialized read projection maintained by task actor pushes, not a source of truth. - **`applyTaskSummaryUpdate` action** — explain this is the write path for the materialized projection, called by task actors, not by clients. - **`getWorkspaceSummary` action** — explain this reads from local SQLite only, no fan-out, and why that's the correct pattern. @@ -913,7 +913,7 @@ Add doc comments at these locations: ## 8. Testing -- Subscription manager unit tests: subscribe/unsubscribe lifecycle, grace period, deduplication, event application. -- Mock implementation tests: verify same behavior as remote through shared test suite against the `SubscriptionManager` interface. +- Interest manager unit tests: subscribe/unsubscribe lifecycle, grace period, deduplication, event application. +- Mock implementation tests: verify same behavior as remote through shared test suite against the `InterestManager` interface. - Backend integration: verify `applyTaskSummaryUpdate` correctly materializes and broadcasts. - E2E: verify that a task mutation (e.g. rename) updates the sidebar in realtime without polling. 
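The grace-period and deduplication behavior described for the `InterestManager` interface can be sketched in isolation. This is a minimal illustration, not the real interface: `InterestTracker`, `open`/`close`, and the topic-key strings are all hypothetical names, and the real manager also applies events and shares a contract with the mock implementation.

```typescript
// Illustrative sketch: refcounted interest with a close grace period.
// Names (InterestTracker, open, close) are hypothetical, not the real API.
type TopicKey = string;

class InterestTracker {
  private refs = new Map<TopicKey, number>();
  private pendingClose = new Map<TopicKey, ReturnType<typeof setTimeout>>();

  constructor(
    private open: (key: TopicKey) => void,
    private close: (key: TopicKey) => void,
    private graceMs = 1000,
  ) {}

  /** Register interest; opens the underlying connection only for the first subscriber. */
  subscribe(key: TopicKey): () => void {
    const pending = this.pendingClose.get(key);
    if (pending !== undefined) {
      // Re-subscribed within the grace period: keep the live connection.
      clearTimeout(pending);
      this.pendingClose.delete(key);
    }
    const count = this.refs.get(key) ?? 0;
    if (count === 0 && pending === undefined) this.open(key);
    this.refs.set(key, count + 1);
    let released = false;
    return () => {
      if (released) return; // idempotent unsubscribe
      released = true;
      this.release(key);
    };
  }

  private release(key: TopicKey): void {
    const count = (this.refs.get(key) ?? 1) - 1;
    this.refs.set(key, count);
    if (count > 0) return;
    // Last subscriber gone: close after a grace period rather than immediately,
    // so quick unmount/remount cycles do not thrash connections.
    const timer = setTimeout(() => {
      this.pendingClose.delete(key);
      this.close(key);
    }, this.graceMs);
    this.pendingClose.set(key, timer);
  }
}
```

Two subscribers to the same topic share one connection, and a resubscribe inside the grace window reuses the existing one instead of reopening.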
diff --git a/foundry/research/specs/async-action-fixes/00-end-to-end-async-realtime-plan.md b/foundry/research/specs/async-action-fixes/00-end-to-end-async-realtime-plan.md index 1cb4d37..cd9dcbf 100644 --- a/foundry/research/specs/async-action-fixes/00-end-to-end-async-realtime-plan.md +++ b/foundry/research/specs/async-action-fixes/00-end-to-end-async-realtime-plan.md @@ -28,7 +28,7 @@ The goal is not just to make individual endpoints faster. The goal is to move Fo ### Workbench -- `getWorkbench` still represents a monolithic organization read that aggregates repo, repository, and task state. +- `getWorkbench` still represents a monolithic workspace read that aggregates repo, project, and task state. - The remote workbench store still responds to every event by pulling a full fresh snapshot. - Some task/workbench detail is still too expensive to compute inline and too broad to refresh after every mutation. @@ -57,7 +57,7 @@ Requests should not block on provider calls, repo sync, sandbox provisioning, tr ### View-model rule - App shell view connects to app/session state and only the org actors visible on screen. -- Organization/task-list view connects to a organization-owned summary projection. +- Workspace/task-list view connects to a workspace-owned summary projection. - Task detail view connects directly to the selected task actor. - Sandbox/session detail connects only when the user opens that detail. @@ -99,7 +99,7 @@ The app shell should stop using `/app/snapshot` as the steady-state read model. #### Changes -1. Introduce a small app-shell projection owned by the app organization actor: +1. Introduce a small app-shell projection owned by the app workspace actor: - auth status - current user summary - active org id @@ -121,7 +121,7 @@ The app shell should stop using `/app/snapshot` as the steady-state read model. 
#### Likely files -- `foundry/packages/backend/src/actors/organization/app-shell.ts` +- `foundry/packages/backend/src/actors/workspace/app-shell.ts` - `foundry/packages/client/src/backend-client.ts` - `foundry/packages/client/src/remote/app-client.ts` - `foundry/packages/shared/src/app-shell.ts` @@ -133,42 +133,42 @@ The app shell should stop using `/app/snapshot` as the steady-state read model. - Selecting an org returns quickly and the UI updates from actor events. - App shell refresh cost is bounded by visible state, not every eligible organization on every poll. -### 3. Organization summary becomes a projection, not a full snapshot +### 3. Workspace summary becomes a projection, not a full snapshot -The task list should read a organization-owned summary projection instead of calling into every task actor on each refresh. +The task list should read a workspace-owned summary projection instead of calling into every task actor on each refresh. #### Changes -1. Define a durable organization summary model with only list-screen fields: +1. Define a durable workspace summary model with only list-screen fields: - repo summary - - repository summary + - project summary - task summary - selected/open task ids - unread/session status summary - coarse git/PR state summary -2. Update organization actor workflows so task/repository changes incrementally update this projection. +2. Update workspace actor workflows so task/project changes incrementally update this projection. 3. Change `getWorkbench` to return the projection only. 4. Change `workbenchUpdated` from "invalidate and refetch everything" to "here is the updated projection version or changed entity ids". 5. Remove task-actor fan-out from the default list read path. 
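The shift in change 4 — from "invalidate and refetch everything" to events that carry changed entity state — can be sketched as a client-side merge. The shapes below are illustrative only; the real summary fields live in `packages/shared`, and the version-gap handling is one possible resync trigger, not a confirmed design.

```typescript
// Hypothetical event/store shapes showing incremental apply instead of refetch.
interface WorkbenchTaskSummary {
  id: string;
  title: string;
  status: string;
}

interface WorkbenchUpdatedEvent {
  projectionVersion: number;
  updatedTasks: WorkbenchTaskSummary[]; // full entity state, so no follow-up read
  removedTaskIds: string[];
}

class WorkspaceSummaryStore {
  private version = 0;
  private tasks = new Map<string, WorkbenchTaskSummary>();

  /** Apply one incremental event; returns false when a version gap requires a resync. */
  apply(event: WorkbenchUpdatedEvent): boolean {
    if (event.projectionVersion !== this.version + 1) return false;
    for (const task of event.updatedTasks) this.tasks.set(task.id, task);
    for (const id of event.removedTaskIds) this.tasks.delete(id);
    this.version = event.projectionVersion;
    return true;
  }

  snapshot(): WorkbenchTaskSummary[] {
    return [...this.tasks.values()];
  }
}
```

Because each event carries full entity state, the client never needs a cross-actor rebuild on a websocket event — it only resyncs when it detects a missed version.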
#### Likely files -- `foundry/packages/backend/src/actors/organization/actions.ts` -- `foundry/packages/backend/src/actors/repository/actions.ts` +- `foundry/packages/backend/src/actors/workspace/actions.ts` +- `foundry/packages/backend/src/actors/project/actions.ts` - `foundry/packages/backend/src/actors/task/index.ts` - `foundry/packages/backend/src/actors/task/workbench.ts` -- task/organization DB schema and migrations +- task/workspace DB schema and migrations - `foundry/packages/client/src/remote/workbench-client.ts` #### Acceptance criteria - Workbench list refresh does not call every task actor. - A websocket event does not force a full cross-actor rebuild. -- Initial task-list load time scales roughly with organization summary size, not repo count times task count times detail reads. +- Initial task-list load time scales roughly with workspace summary size, not repo count times task count times detail reads. ### 4. Task detail moves to direct actor reads and events -Heavy task detail should move out of the organization summary and into the selected task actor. +Heavy task detail should move out of the workspace summary and into the selected task actor. #### Changes @@ -258,7 +258,7 @@ Do not delete bootstrap endpoints first. Shrink them after the subscription mode 4. `06-daytona-provisioning-staged-background-flow.md` 5. App shell realtime subscription model 6. `02-repo-overview-from-cached-projection.md` -7. Organization summary projection +7. Workspace summary projection 8. `04-workbench-session-creation-without-inline-provisioning.md` 9. `05-workbench-snapshot-from-derived-state.md` 10. Task-detail direct actor reads/subscriptions @@ -270,7 +270,7 @@ Do not delete bootstrap endpoints first. Shrink them after the subscription mode - Runtime hardening removes the most dangerous correctness bug before more UI load shifts onto actor connections. - The first async workflow items reduce the biggest user-visible stalls quickly. 
- App shell realtime is smaller and lower-risk than the workbench migration, and it removes the current polling loop. -- Organization summary and task-detail split should happen after the async workflow moves so the projection model does not encode old synchronous assumptions. +- Workspace summary and task-detail split should happen after the async workflow moves so the projection model does not encode old synchronous assumptions. - Auth simplification is valuable but not required to remove the current refresh/polling/runtime problems. ## Observability Requirements @@ -291,7 +291,7 @@ Each log line should include a request id or actor/event correlation id where po 1. Ship runtime hardening and observability first. 2. Ship app-shell realtime behind a client flag while keeping snapshot bootstrap. -3. Ship organization summary projection behind a separate flag. +3. Ship workspace summary projection behind a separate flag. 4. Migrate one heavy detail pane at a time off the monolithic workbench payload. 5. Remove polling once the matching event path is proven stable. 6. Only then remove or demote the old snapshot-heavy steady-state flows. 
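The observability requirement above — every log line carries a request id or actor/event correlation id where possible — could look like the following. This is a hypothetical helper, not an existing Foundry API; the field names are illustrative.

```typescript
// Hypothetical structured-log helper; field names are illustrative.
interface LogCorrelation {
  requestId?: string;
  actorId?: string;
  eventId?: string;
}

function formatLogLine(
  level: "info" | "warn" | "error",
  message: string,
  correlation: LogCorrelation,
): string {
  // Emit JSON so log search can filter on any correlation id and a slow
  // request can be traced across actor hops.
  return JSON.stringify({ level, message, ...correlation });
}
```

A call site would pass whichever ids are in scope, e.g. `formatLogLine("info", "workbench refresh", { requestId, actorId })`.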
diff --git a/foundry/research/specs/async-action-fixes/01-task-creation-bootstrap-only.md b/foundry/research/specs/async-action-fixes/01-task-creation-bootstrap-only.md index 1eb1594..2aa9f50 100644 --- a/foundry/research/specs/async-action-fixes/01-task-creation-bootstrap-only.md +++ b/foundry/research/specs/async-action-fixes/01-task-creation-bootstrap-only.md @@ -10,8 +10,8 @@ That makes a user-facing action depend on queue-backed and provider-backed work ## Current Code Context -- Organization entry point: `foundry/packages/backend/src/actors/organization/actions.ts` -- Repository task creation path: `foundry/packages/backend/src/actors/repository/actions.ts` +- Workspace entry point: `foundry/packages/backend/src/actors/workspace/actions.ts` +- Project task creation path: `foundry/packages/backend/src/actors/project/actions.ts` - Task action surface: `foundry/packages/backend/src/actors/task/index.ts` - Task workflow: `foundry/packages/backend/src/actors/task/workflow/index.ts` - Task init/provision steps: `foundry/packages/backend/src/actors/task/workflow/init.ts` @@ -33,8 +33,8 @@ That makes a user-facing action depend on queue-backed and provider-backed work - persisting any immediately-known metadata - returning the current task record 3. After initialize completes, enqueue `task.command.provision` with `wait: false`. -4. Change `organization.createTask` to: - - create or resolve the repository +4. 
Change `workspace.createTask` to: + - create or resolve the project - create the task actor - call `task.initialize(...)` - stop awaiting `task.provision(...)` @@ -51,12 +51,12 @@ That makes a user-facing action depend on queue-backed and provider-backed work ## Files Likely To Change -- `foundry/packages/backend/src/actors/organization/actions.ts` -- `foundry/packages/backend/src/actors/repository/actions.ts` +- `foundry/packages/backend/src/actors/workspace/actions.ts` +- `foundry/packages/backend/src/actors/project/actions.ts` - `foundry/packages/backend/src/actors/task/index.ts` - `foundry/packages/backend/src/actors/task/workflow/index.ts` - `foundry/packages/backend/src/actors/task/workflow/init.ts` -- `foundry/packages/frontend/src/components/organization-dashboard.tsx` +- `foundry/packages/frontend/src/components/workspace-dashboard.tsx` - `foundry/packages/client/src/remote/workbench-client.ts` ## Client Impact diff --git a/foundry/research/specs/async-action-fixes/02-repo-overview-from-cached-projection.md b/foundry/research/specs/async-action-fixes/02-repo-overview-from-cached-projection.md index 1d31216..27afad5 100644 --- a/foundry/research/specs/async-action-fixes/02-repo-overview-from-cached-projection.md +++ b/foundry/research/specs/async-action-fixes/02-repo-overview-from-cached-projection.md @@ -15,11 +15,11 @@ The frontend polls repo overview repeatedly, so this design multiplies slow work ## Current Code Context -- Organization overview entry point: `foundry/packages/backend/src/actors/organization/actions.ts` -- Repository overview implementation: `foundry/packages/backend/src/actors/repository/actions.ts` -- Branch sync poller: `foundry/packages/backend/src/actors/repository-branch-sync/index.ts` -- PR sync poller: `foundry/packages/backend/src/actors/repository-pr-sync/index.ts` -- Repo overview client polling: `foundry/packages/frontend/src/components/organization-dashboard.tsx` +- Workspace overview entry point: 
`foundry/packages/backend/src/actors/workspace/actions.ts` +- Project overview implementation: `foundry/packages/backend/src/actors/project/actions.ts` +- Branch sync poller: `foundry/packages/backend/src/actors/project-branch-sync/index.ts` +- PR sync poller: `foundry/packages/backend/src/actors/project-pr-sync/index.ts` +- Repo overview client polling: `foundry/packages/frontend/src/components/workspace-dashboard.tsx` ## Target Contract @@ -30,27 +30,27 @@ The frontend polls repo overview repeatedly, so this design multiplies slow work ## Proposed Fix 1. Remove inline `forceProjectSync()` from `getRepoOverview`. -2. Add freshness fields to the repository projection, for example: +2. Add freshness fields to the project projection, for example: - `branchSyncAt` - `prSyncAt` - `branchSyncStatus` - `prSyncStatus` 3. Let the existing polling actors own cache refresh. -4. If the client needs a manual refresh, add a non-blocking command such as `repository.requestOverviewRefresh` that: +4. If the client needs a manual refresh, add a non-blocking command such as `project.requestOverviewRefresh` that: - enqueues refresh work - updates sync status to `queued` or `running` - returns immediately -5. Keep `getRepoOverview` as a pure read over repository SQLite state. +5. Keep `getRepoOverview` as a pure read over project SQLite state. 
## Files Likely To Change -- `foundry/packages/backend/src/actors/organization/actions.ts` -- `foundry/packages/backend/src/actors/repository/actions.ts` -- `foundry/packages/backend/src/actors/repository/db/schema.ts` -- `foundry/packages/backend/src/actors/repository/db/migrations.ts` -- `foundry/packages/backend/src/actors/repository-branch-sync/index.ts` -- `foundry/packages/backend/src/actors/repository-pr-sync/index.ts` -- `foundry/packages/frontend/src/components/organization-dashboard.tsx` +- `foundry/packages/backend/src/actors/workspace/actions.ts` +- `foundry/packages/backend/src/actors/project/actions.ts` +- `foundry/packages/backend/src/actors/project/db/schema.ts` +- `foundry/packages/backend/src/actors/project/db/migrations.ts` +- `foundry/packages/backend/src/actors/project-branch-sync/index.ts` +- `foundry/packages/backend/src/actors/project-pr-sync/index.ts` +- `foundry/packages/frontend/src/components/workspace-dashboard.tsx` ## Client Impact diff --git a/foundry/research/specs/async-action-fixes/03-repo-actions-via-background-workflow.md b/foundry/research/specs/async-action-fixes/03-repo-actions-via-background-workflow.md index 9fdd46a..2c1738c 100644 --- a/foundry/research/specs/async-action-fixes/03-repo-actions-via-background-workflow.md +++ b/foundry/research/specs/async-action-fixes/03-repo-actions-via-background-workflow.md @@ -10,20 +10,20 @@ These flows depend on repo/network state and can take minutes. They should not h ## Current Code Context -- Organization repo action entry point: `foundry/packages/backend/src/actors/organization/actions.ts` -- Repository repo action implementation: `foundry/packages/backend/src/actors/repository/actions.ts` -- Branch/task index state lives in the repository actor SQLite DB. 
+- Workspace repo action entry point: `foundry/packages/backend/src/actors/workspace/actions.ts` +- Project repo action implementation: `foundry/packages/backend/src/actors/project/actions.ts` +- Branch/task index state lives in the project actor SQLite DB. - Current forced sync uses the PR and branch polling actors before and after the action. ## Target Contract - Repo-affecting actions are accepted quickly and run in the background. -- The repository actor owns a durable action record with progress and final result. -- Clients observe status via repository/task state instead of waiting for a single response. +- The project actor owns a durable action record with progress and final result. +- Clients observe status via project/task state instead of waiting for a single response. ## Proposed Fix -1. Introduce a repository-level workflow/job model for repo actions, for example: +1. Introduce a project-level workflow/job model for repo actions, for example: - `sync_repo` - `restack_repo` - `restack_subtree` @@ -49,11 +49,11 @@ These flows depend on repo/network state and can take minutes. They should not h ## Files Likely To Change -- `foundry/packages/backend/src/actors/organization/actions.ts` -- `foundry/packages/backend/src/actors/repository/actions.ts` -- `foundry/packages/backend/src/actors/repository/db/schema.ts` -- `foundry/packages/backend/src/actors/repository/db/migrations.ts` -- `foundry/packages/frontend/src/components/organization-dashboard.tsx` +- `foundry/packages/backend/src/actors/workspace/actions.ts` +- `foundry/packages/backend/src/actors/project/actions.ts` +- `foundry/packages/backend/src/actors/project/db/schema.ts` +- `foundry/packages/backend/src/actors/project/db/migrations.ts` +- `foundry/packages/frontend/src/components/workspace-dashboard.tsx` - Any shared types in `foundry/packages/shared/src` ## Client Impact @@ -70,5 +70,5 @@ These flows depend on repo/network state and can take minutes. 
They should not h ## Implementation Notes - Keep validation cheap in the request path; expensive repo inspection belongs in the workflow. -- If job rows are added, decide whether they are repository-owned only or also mirrored into history events for UI consumption. +- If job rows are added, decide whether they are project-owned only or also mirrored into history events for UI consumption. - Fresh-agent check: branch-backed task creation and explicit repo stack actions should use the same background job/status vocabulary where possible. diff --git a/foundry/research/specs/async-action-fixes/04-workbench-session-creation-without-inline-provisioning.md b/foundry/research/specs/async-action-fixes/04-workbench-session-creation-without-inline-provisioning.md index d48e4f0..9221780 100644 --- a/foundry/research/specs/async-action-fixes/04-workbench-session-creation-without-inline-provisioning.md +++ b/foundry/research/specs/async-action-fixes/04-workbench-session-creation-without-inline-provisioning.md @@ -8,7 +8,7 @@ Creating a workbench tab currently provisions the whole task if no active sandbo ## Current Code Context -- Organization workbench action entry point: `foundry/packages/backend/src/actors/organization/actions.ts` +- Workspace workbench action entry point: `foundry/packages/backend/src/actors/workspace/actions.ts` - Task workbench behavior: `foundry/packages/backend/src/actors/task/workbench.ts` - Task provision action: `foundry/packages/backend/src/actors/task/index.ts` - Sandbox session creation path: `foundry/packages/backend/src/actors/sandbox-instance/index.ts` @@ -36,7 +36,7 @@ Creating a workbench tab currently provisions the whole task if no active sandbo ## Files Likely To Change -- `foundry/packages/backend/src/actors/organization/actions.ts` +- `foundry/packages/backend/src/actors/workspace/actions.ts` - `foundry/packages/backend/src/actors/task/workbench.ts` - `foundry/packages/backend/src/actors/task/index.ts` - 
`foundry/packages/backend/src/actors/task/db/schema.ts` diff --git a/foundry/research/specs/async-action-fixes/05-workbench-snapshot-from-derived-state.md b/foundry/research/specs/async-action-fixes/05-workbench-snapshot-from-derived-state.md index 07cc0a5..55401a7 100644 --- a/foundry/research/specs/async-action-fixes/05-workbench-snapshot-from-derived-state.md +++ b/foundry/research/specs/async-action-fixes/05-workbench-snapshot-from-derived-state.md @@ -17,7 +17,7 @@ The remote workbench client refreshes after each action and on update events, so ## Current Code Context -- Organization workbench snapshot builder: `foundry/packages/backend/src/actors/organization/actions.ts` +- Workspace workbench snapshot builder: `foundry/packages/backend/src/actors/workspace/actions.ts` - Task workbench snapshot builder: `foundry/packages/backend/src/actors/task/workbench.ts` - Sandbox session event persistence: `foundry/packages/backend/src/actors/sandbox-instance/persist.ts` - Remote workbench client refresh loop: `foundry/packages/client/src/remote/workbench-client.ts` @@ -43,7 +43,7 @@ The remote workbench client refreshes after each action and on update events, so ## Files Likely To Change -- `foundry/packages/backend/src/actors/organization/actions.ts` +- `foundry/packages/backend/src/actors/workspace/actions.ts` - `foundry/packages/backend/src/actors/task/workbench.ts` - `foundry/packages/backend/src/actors/task/db/schema.ts` - `foundry/packages/backend/src/actors/task/db/migrations.ts` diff --git a/foundry/research/specs/async-action-fixes/07-auth-identity-simplification.md b/foundry/research/specs/async-action-fixes/07-auth-identity-simplification.md index dbaf976..50f3b56 100644 --- a/foundry/research/specs/async-action-fixes/07-auth-identity-simplification.md +++ b/foundry/research/specs/async-action-fixes/07-auth-identity-simplification.md @@ -17,8 +17,8 @@ Authentication and user identity are conflated into a single `appSessions` table ## Current Code Context - 
Custom OAuth flow: `foundry/packages/backend/src/services/app-github.ts` (`buildAuthorizeUrl`, `exchangeCode`, `getViewer`) -- Session + identity management: `foundry/packages/backend/src/actors/organization/app-shell.ts` (`ensureAppSession`, `updateAppSession`, `initGithubSession`, `syncGithubOrganizations`) -- Session schema: `foundry/packages/backend/src/actors/organization/db/schema.ts` (`appSessions` table) +- Session + identity management: `foundry/packages/backend/src/actors/workspace/app-shell.ts` (`ensureAppSession`, `updateAppSession`, `initGithubSession`, `syncGithubOrganizations`) +- Session schema: `foundry/packages/backend/src/actors/workspace/db/schema.ts` (`appSessions` table) - Shared types: `foundry/packages/shared/src/app-shell.ts` (`FoundryUser`, `FoundryAppSnapshot`) - HTTP routes: `foundry/packages/backend/src/index.ts` (`resolveSessionId`, `/v1/auth/github/*`, all `/v1/app/*` routes) - Frontend session persistence: `foundry/packages/client/src/backend-client.ts` (`persistAppSessionId`, `x-foundry-session` header, `foundrySession` URL param extraction) @@ -41,7 +41,7 @@ Authentication and user identity are conflated into a single `appSessions` table - BetterAuth uses a custom adapter that routes all DB operations through RivetKit actors. - Each user has their own actor. BetterAuth's `user`, `session`, and `account` tables live in the per-user actor's SQLite via `c.db`. - The adapter resolves which actor to target based on the primary key BetterAuth passes for each operation (user ID, session ID, account ID). -- A lightweight **session index** on the app-shell organization actor maps session tokens → user actor identity, so inbound requests can be routed to the correct user actor without knowing the user ID upfront. +- A lightweight **session index** on the app-shell workspace actor maps session tokens → user actor identity, so inbound requests can be routed to the correct user actor without knowing the user ID upfront. 
### Canonical user record @@ -70,9 +70,9 @@ BetterAuth expects a single database. Foundry uses per-actor SQLite — each act When an HTTP request arrives, the backend has a session token but doesn't know the user ID yet. BetterAuth calls adapter methods like `findSession(sessionId)` to resolve this. But which actor holds that session row? -**Solution: session index on the app-shell organization actor.** +**Solution: session index on the app-shell workspace actor.** -The app-shell organization actor (which already handles auth routing) maintains a lightweight index table: +The app-shell workspace actor (which already handles auth routing) maintains a lightweight index table: ``` sessionIndex @@ -83,7 +83,7 @@ sessionIndex The adapter flow for session lookup: 1. BetterAuth calls `findSession(sessionId)`. -2. Adapter queries `sessionIndex` on the organization actor to resolve `userActorKey`. +2. Adapter queries `sessionIndex` on the workspace actor to resolve `userActorKey`. 3. Adapter gets the user actor handle and queries BetterAuth's `session` table in that actor's `c.db`. The adapter flow for user creation (OAuth callback): @@ -91,12 +91,12 @@ The adapter flow for user creation (OAuth callback): 2. Adapter resolves the GitHub numeric ID from the user data. 3. Adapter creates/gets the user actor keyed by GitHub ID. 4. Adapter inserts into BetterAuth's `user` table in that actor's `c.db`. -5. When `createSession` follows, adapter writes to the user actor's `session` table AND inserts into the organization actor's `sessionIndex`. +5. When `createSession` follows, adapter writes to the user actor's `session` table AND inserts into the workspace actor's `sessionIndex`. 
### User actor shape ```text -UserActor (key: ["ws", organizationId, "user", githubNumericId]) +UserActor (key: ["ws", workspaceId, "user", githubNumericId]) ├── BetterAuth tables: user, session, account (managed by BetterAuth schema) ├── userProfiles (app-specific: eligibleOrganizationIds, starterRepoStatus, roleLabel) └── sessionState (app-specific: activeOrganizationId per session) @@ -127,15 +127,15 @@ The adapter must inspect `model` and `where` to determine the target actor: | Model | Routing strategy | |-------|-----------------| | `user` (by id) | User actor key derived directly from user ID | -| `user` (by email) | `emailIndex` on organization actor → user actor key | -| `session` (by token) | `sessionIndex` on organization actor → user actor key | -| `session` (by id) | `sessionIndex` on organization actor → user actor key | +| `user` (by email) | `emailIndex` on workspace actor → user actor key | +| `session` (by token) | `sessionIndex` on workspace actor → user actor key | +| `session` (by id) | `sessionIndex` on workspace actor → user actor key | | `session` (by userId) | User actor key derived directly from userId | | `account` | Always has `userId` in where or data → user actor key | -| `verification` | Organization actor (not user-scoped — used for email verification, password reset) | +| `verification` | Workspace actor (not user-scoped — used for email verification, password reset) | -On `create` for `session` model: write to user actor's `session` table AND insert into organization actor's `sessionIndex`. -On `delete` for `session` model: delete from user actor's `session` table AND remove from organization actor's `sessionIndex`. +On `create` for `session` model: write to user actor's `session` table AND insert into workspace actor's `sessionIndex`. +On `delete` for `session` model: delete from user actor's `session` table AND remove from workspace actor's `sessionIndex`. 
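The routing table above can be sketched as a pure dispatch function. This shows only the routing decision; the real adapter would resolve RivetKit actor handles from the returned target, and the index shapes here (plain `Map`s) are stand-ins for the workspace actor's `sessionIndex` and `emailIndex` tables.

```typescript
// Routing decision only, per the table above; actor handle resolution omitted.
type ActorTarget = { kind: "user"; userId: string } | { kind: "workspace" };

interface RoutingIndexes {
  sessionIndex: Map<string, string>; // session token/id → user actor identity
  emailIndex: Map<string, string>;   // email → user actor identity
}

function routeOperation(
  model: "user" | "session" | "account" | "verification",
  where: Record<string, string | undefined>,
  idx: RoutingIndexes,
): ActorTarget {
  switch (model) {
    case "user": {
      if (where.id) return { kind: "user", userId: where.id };
      const byEmail = where.email ? idx.emailIndex.get(where.email) : undefined;
      if (byEmail) return { kind: "user", userId: byEmail };
      break;
    }
    case "session": {
      if (where.userId) return { kind: "user", userId: where.userId };
      const token = where.token ?? where.id;
      const byToken = token ? idx.sessionIndex.get(token) : undefined;
      if (byToken) return { kind: "user", userId: byToken };
      break;
    }
    case "account": {
      if (where.userId) return { kind: "user", userId: where.userId };
      break;
    }
    case "verification":
      // Not user-scoped: verification rows live on the workspace actor.
      return { kind: "workspace" };
  }
  throw new Error(`cannot route ${model} lookup`);
}
```

A lookup that cannot be routed (e.g. a `session` token missing from the index) throws here; in the real adapter that case would surface as "not found" to BetterAuth.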
#### Adapter construction @@ -188,14 +188,14 @@ session: { #### BetterAuth core tables -Four tables, all in the per-user actor's SQLite (except `verification` which goes on organization actor): +Four tables, all in the per-user actor's SQLite (except `verification` which goes on workspace actor): **`user`**: `id`, `name`, `email`, `emailVerified`, `image`, `createdAt`, `updatedAt` **`session`**: `id`, `token`, `userId`, `expiresAt`, `ipAddress?`, `userAgent?`, `createdAt`, `updatedAt` **`account`**: `id`, `userId`, `accountId` (GitHub numeric ID), `providerId` ("github"), `accessToken?`, `refreshToken?`, `scope?`, `createdAt`, `updatedAt` **`verification`**: `id`, `identifier`, `value`, `expiresAt`, `createdAt`, `updatedAt` -For `findUserByEmail`, a secondary index (email → user actor key) is needed on the organization actor alongside `sessionIndex`. +For `findUserByEmail`, a secondary index (email → user actor key) is needed on the workspace actor alongside `sessionIndex`. ## Implementation Plan @@ -210,12 +210,12 @@ Research confirms: 1. **Prototype the adapter + user actor end-to-end** — wire up `createAdapterFactory` with a minimal actor-routed implementation. Confirm that BetterAuth's GitHub OAuth flow completes successfully with user/session/account records landing in the correct per-user actor's SQLite. 2. **Verify `findOne` for session model** — confirm the `where` clause BetterAuth passes for session lookup includes the `token` field (not just `id`), so the adapter can route via `sessionIndex` keyed by token. -3. **Measure cookie-cached vs uncached request latency** — confirm that with cookie caching enabled, the adapter is not called on every request, and that the uncached fallback (organization actor index → user actor → session table) is acceptable. +3. 
**Measure cookie-cached vs uncached request latency** — confirm that with cookie caching enabled, the adapter is not called on every request, and that the uncached fallback (workspace actor index → user actor → session table) is acceptable. ### Phase 1: User actor + adapter infrastructure (no behavior change) 1. **Install `better-auth` package** in `packages/backend`. -2. **Define `UserActor`** with actor key `["ws", organizationId, "user", githubNumericId]`. Include BetterAuth's required tables (`user`, `session`, `account`) plus app-specific tables in its schema. +2. **Define `UserActor`** with actor key `["ws", workspaceId, "user", githubNumericId]`. Include BetterAuth's required tables (`user`, `session`, `account`) plus app-specific tables in its schema. 3. **Create `userProfiles` table** in user actor schema: ``` userProfiles @@ -237,7 +237,7 @@ Research confirms: ├── createdAt (integer) ├── updatedAt (integer) ``` -5. **Create `sessionIndex` and `emailIndex` tables** on the app-shell organization actor: +5. **Create `sessionIndex` and `emailIndex` tables** on the app-shell workspace actor: ``` sessionIndex ├── sessionId (text, PK) @@ -256,7 +256,7 @@ Research confirms: ### Phase 2: Migrate OAuth flow to BetterAuth 1. **Replace `startAppGithubAuth`** — delegate to BetterAuth's GitHub OAuth initiation instead of hand-rolling `buildAuthorizeUrl` + `oauthState` + `oauthStateExpiresAt`. -2. **Replace `completeAppGithubAuth`** — delegate to BetterAuth's callback handler. BetterAuth creates/updates the user record in the user actor and creates a signed session. The adapter writes to `sessionIndex` on the organization actor. +2. **Replace `completeAppGithubAuth`** — delegate to BetterAuth's callback handler. BetterAuth creates/updates the user record in the user actor and creates a signed session. The adapter writes to `sessionIndex` on the workspace actor. 3. 
**After BetterAuth callback completes**, populate `userProfiles` in the user actor with app-specific fields and enqueue the slow org sync (same background workflow pattern as today). 4. **Replace `signOutApp`** — delegate to BetterAuth session invalidation. Adapter removes entry from `sessionIndex`. 5. **Update `resolveSessionId`** in `index.ts` — validate the session via BetterAuth (which routes through the adapter → `sessionIndex` → user actor). BetterAuth verifies the signature and checks expiration. @@ -288,18 +288,18 @@ Research confirms: ## Constraints - **Actor-routed adapter.** BetterAuth does not natively support per-user actor databases. The custom adapter must route every DB operation to the correct actor. This adds a layer of indirection and latency (actor handle resolution + message) on adapter calls. -- **Session index cost is mitigated by cookie caching.** With `cookieCache` enabled, BetterAuth validates sessions from a signed cookie on most requests — the adapter (and thus the `sessionIndex` lookup + user actor round-trip) is only called when the cache expires or on writes. Without caching, every authenticated request would hit the organization actor's `sessionIndex` table then the user actor. -- **Two-actor write on session create/destroy.** Creating or destroying a session requires writing to both the user actor (BetterAuth's `session` table) and the organization actor (`sessionIndex`). These must be consistent — if the user actor write succeeds but the index write fails, the session exists but is unreachable. +- **Session index cost is mitigated by cookie caching.** With `cookieCache` enabled, BetterAuth validates sessions from a signed cookie on most requests — the adapter (and thus the `sessionIndex` lookup + user actor round-trip) is only called when the cache expires or on writes. Without caching, every authenticated request would hit the workspace actor's `sessionIndex` table then the user actor. 
+- **Two-actor write on session create/destroy.** Creating or destroying a session requires writing to both the user actor (BetterAuth's `session` table) and the workspace actor (`sessionIndex`). These must be consistent — if the user actor write succeeds but the index write fails, the session exists but is unreachable. - **Background org sync pattern must be preserved.** The fast-path/slow-path split (`initGithubSession` returns immediately, `syncGithubOrganizations` runs in workflow queue) is critical for avoiding proxy timeout retries. BetterAuth handles the OAuth exchange, but the org sync stays as a background workflow. - **`GitHubAppClient` is still needed.** BetterAuth replaces the OAuth user-auth flow, but installation tokens, webhook verification, repo listing, and org listing are GitHub App operations that BetterAuth does not cover. - **User ID migration.** Changing user IDs from `user-${slugify(login)}` to GitHub numeric IDs affects `organizationMembers`, `seatAssignments`, and any cross-actor references to user IDs. Existing data needs a migration path. -- **`findUserByEmail` requires a secondary index.** BetterAuth sometimes looks up users by email (e.g., account linking). An `emailIndex` table on the organization actor is needed. This must be kept in sync with the user actor's email field. +- **`findUserByEmail` requires a secondary index.** BetterAuth sometimes looks up users by email (e.g., account linking). An `emailIndex` table on the workspace actor is needed. This must be kept in sync with the user actor's email field. ## Risk Assessment - **Adapter call context — RESOLVED.** Research confirms BetterAuth adapter methods are plain async functions with no request context dependency. The adapter closes over the RivetKit registry at init time and resolves actor handles on demand. No ambient `c` context needed. 
- **Hot-path latency — MITIGATED.** Cookie caching (`cookieCache` with `strategy: "compact"`) means most authenticated requests validate the session from a signed cookie without calling the adapter at all. The adapter (and thus the actor round-trip) is only hit when the cache expires (configurable, e.g., every 5 minutes) or on writes. This makes the session index + user actor lookup acceptable. -- **Two-actor consistency.** Session create/destroy touches two actors (user actor + organization index). If either write fails, the system is in an inconsistent state. Recommended: write index first, then user actor. A dangling index entry pointing to a nonexistent session is benign — BetterAuth treats it as "session not found" and the user just re-authenticates. +- **Two-actor consistency.** Session create/destroy touches two actors (user actor + workspace index). If either write fails, the system is in an inconsistent state. Recommended: write index first, then user actor. A dangling index entry pointing to a nonexistent session is benign — BetterAuth treats it as "session not found" and the user just re-authenticates. - **Cookie vs header auth.** BetterAuth defaults to HTTP-only cookies (`better-auth.session_token`). The current system uses a custom `x-foundry-session` header with `localStorage`. BetterAuth supports `bearer` token mode for programmatic clients via its `bearer` plugin. Enable both for browser + API access. - **Dev bootstrap flow.** `bootstrapAppGithubSession` bypasses the normal OAuth flow for local development. BetterAuth supports programmatic session creation via its internal adapter — the dev path can call the adapter's `create` method directly for the `session` and `account` models. - **Actor lifecycle for users.** User actors are long-lived but low-traffic. RivetKit will idle/unload them. With cookie caching, cold-start only happens when the cache expires — not on every request. Acceptable. 
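The adapter's routing rule (which actor owns each BetterAuth model lookup) can be sketched as a pure function. This is a hypothetical sketch: the model names follow BetterAuth's defaults (`user`, `session`, `account`), but the route shapes and the `sessionIndex`/`emailIndex` names come from this plan, not from any real BetterAuth adapter API.

```typescript
// Hypothetical sketch of the actor-routed adapter's routing rule.
// "user-actor" rows live in the per-user actor's DB; "workspace-index" rows
// live in the workspace actor's sessionIndex/emailIndex tables.
type ActorRoute =
  | { kind: "user-actor"; model: "user" | "session" | "account" }
  | { kind: "workspace-index"; table: "sessionIndex" | "emailIndex" };

function routeModel(model: string, where?: { field: string }): ActorRoute {
  // Session lookups by id must start at the workspace-level sessionIndex:
  // the adapter does not yet know which user actor owns the session.
  if (model === "session" && where?.field === "id") {
    return { kind: "workspace-index", table: "sessionIndex" };
  }
  // findUserByEmail has no user id to route on, so it needs the emailIndex.
  if (model === "user" && where?.field === "email") {
    return { kind: "workspace-index", table: "emailIndex" };
  }
  if (model === "user" || model === "session" || model === "account") {
    return { kind: "user-actor", model };
  }
  throw new Error(`unknown BetterAuth model: ${model}`);
}
```

Per the consistency recommendation above, session create/destroy would then write the workspace `sessionIndex` row before the user actor's `session` row, so a partial failure leaves only a benign dangling index entry.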
diff --git a/foundry/research/specs/async-action-fixes/README.md b/foundry/research/specs/async-action-fixes/README.md index a26fd0e..1dae650 100644 --- a/foundry/research/specs/async-action-fixes/README.md +++ b/foundry/research/specs/async-action-fixes/README.md @@ -19,7 +19,7 @@ The governing policy now lives in `foundry/CLAUDE.md`: - Backend actor entry points live under `foundry/packages/backend/src/actors`. - Provider-backed long-running work lives under `foundry/packages/backend/src/providers`. - The main UI consumers are: - - `foundry/packages/frontend/src/components/organization-dashboard.tsx` + - `foundry/packages/frontend/src/components/workspace-dashboard.tsx` - `foundry/packages/frontend/src/components/mock-layout.tsx` - `foundry/packages/client/src/remote/workbench-client.ts` - Existing non-blocking examples already exist in app-shell GitHub auth/import flows. Use those as the reference pattern for request returns plus background completion. @@ -32,7 +32,7 @@ The governing policy now lives in `foundry/CLAUDE.md`: 4. `06-daytona-provisioning-staged-background-flow.md` 5. App shell realtime subscription work from `00-end-to-end-async-realtime-plan.md` 6. `02-repo-overview-from-cached-projection.md` -7. Organization summary projection work from `00-end-to-end-async-realtime-plan.md` +7. Workspace summary projection work from `00-end-to-end-async-realtime-plan.md` 8. `04-workbench-session-creation-without-inline-provisioning.md` 9. `05-workbench-snapshot-from-derived-state.md` 10. Task-detail direct subscription work from `00-end-to-end-async-realtime-plan.md` @@ -42,7 +42,7 @@ The governing policy now lives in `foundry/CLAUDE.md`: - Runtime hardening and the first async workflow items remove the highest-risk correctness and timeout issues first. - App shell realtime is a smaller migration than the workbench and removes the current polling loop early. 
-- Organization summary and task-detail subscription work are easier once long-running mutations already report durable background state. +- Workspace summary and task-detail subscription work are easier once long-running mutations already report durable background state. - Auth simplification is important, but it should not block the snapshot/polling/runtime fixes. ## Fresh Agent Checklist diff --git a/foundry/research/specs/frontend.md b/foundry/research/specs/frontend.md index 6c384ae..2eb4ce5 100644 --- a/foundry/research/specs/frontend.md +++ b/foundry/research/specs/frontend.md @@ -24,8 +24,8 @@ be thorough and careful with your implementation. this is going to be the ground - left sidebar is similar to the hf switch ui: - list each repo - under each repo, show all of the tasks - - you should see all tasks for the entire organization here grouped by repo -- the main content area shows the current organization + - you should see all tasks for the entire workspace here grouped by repo + - the main content area shows the current workspace - there is a main agent session for the main agent that's making the change, so show this by default - build a ui for interacting with sessions - see ~/sandbox-agent/frontend/packages/inspector/ for reference ui diff --git a/foundry/research/specs/github-data-actor.md b/foundry/research/specs/github-data-actor.md deleted file mode 100644 index 75a71a1..0000000 --- a/foundry/research/specs/github-data-actor.md +++ /dev/null @@ -1,169 +0,0 @@ -# Spec: GitHub Data Actor & Webhook-Driven State - -## Summary - -Replace the per-repo polling PR sync actor (`ProjectPrSyncActor`) and per-repo PR cache (`prCache` table) with a single organization-scoped `github-state` actor that owns all GitHub data (repos, PRs, members). All GitHub state updates flow exclusively through webhooks, with a one-shot full sync on initial connection. Manual reload actions are exposed per-entity (org, repo, PR) for recovery from missed webhooks.
- -Open PRs are surfaced in the left sidebar alongside tasks via a unified organization subscription topic, with lazy task/sandbox creation when a user clicks on a PR. - -## Reference Implementation - -A prior implementation of the `github-state` actor exists in git checkpoint `0aca2c7` (from PR #247 "Refactor Foundry GitHub state and sandbox runtime"). This was never merged to a branch but contains working code for: - -- `foundry/packages/backend/src/actors/github-state/index.ts` — full actor with DB, sync workflow, webhook handler, PR CRUD -- `foundry/packages/backend/src/actors/github-state/db/schema.ts` — `github_meta`, `github_repositories`, `github_members`, `github_pull_requests` tables -- `foundry/packages/backend/src/actors/organization/app-shell.ts` lines 1056-1180 — webhook dispatch to `githubState.handlePullRequestWebhook()` and `githubState.fullSync()` - -Use `git show 0aca2c7:` to read the reference files. Adapt (don't copy blindly) — the current branch structure has diverged. - -## Constraints - -1. **No polling.** Delete `ProjectPrSyncActor` (`actors/repository-pr-sync/`), all references to it in handles/keys/index, and the `prCache` table in `RepositoryActor`'s DB schema. Remove `prSyncStatus`/`prSyncAt` from `getRepoOverview`. -2. **Keep `ProjectBranchSyncActor`.** This polls the local git clone (not GitHub API) and is the sandbox git status mechanism. It stays. -3. **Webhooks are the sole live update path.** The only GitHub API calls happen during: - - Initial full sync on org connection/installation - - Manual reload actions (per-entity) -4. **GitHub does not auto-retry failed webhook deliveries** ([docs](https://docs.github.com/en/webhooks/using-webhooks/handling-failed-webhook-deliveries)). Manual reload is the recovery mechanism. -5. **No `user-github-data` actor in this spec.** OAuth/auth is already handled correctly on the current branch. Only the org-scoped `github-state` actor is in scope. 
- -## Architecture - -### Actor: `github-state` (one per organization) - -**Key:** `["org", organizationId, "github"]` - -**DB tables:** -- `github_meta` — sync status, installation info, connected account -- `github_repositories` — repos accessible via the GitHub App installation -- `github_pull_requests` — all open PRs across all repos in the org -- `github_members` — org members (existing from checkpoint, keep for completeness) - -**Actions (from checkpoint, to adapt):** -- `fullSync(input)` — one-shot fetch of repos + PRs via installation token. Enqueues as a workflow step. Used on initial connection and `installation.created`/`unsuspend` webhooks. -- `handlePullRequestWebhook(input)` — upserts a single PR from webhook payload, notifies downstream. -- `getSummary()` — returns sync meta + row counts. -- `listRepositories()` — returns all known repos. -- `listPullRequestsForRepository({ repoId })` — returns PRs for a repo. -- `getPullRequestForBranch({ repoId, branchName })` — returns PR info for a branch. -- `createPullRequest({ repoId, repoPath, branchName, title, body })` — creates PR via GitHub API, stores locally. -- `clearState(input)` — wipes all data (on `installation.deleted`, `suspend`). - -**New actions (not in checkpoint):** -- `reloadOrganization()` — re-fetches repos + members from GitHub API (not PRs). Updates `github_repositories` and `github_members`. Notifies downstream. -- `reloadRepository({ repoId })` — re-fetches metadata for a single repo from GitHub API. Updates the `github_repositories` row. Does NOT re-fetch PRs. -- `reloadPullRequest({ repoId, prNumber })` — re-fetches a single PR from GitHub API by number. Updates the `github_pull_requests` row. Notifies downstream. 
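Webhook dispatch onto these actions can be sketched as a pure mapping. The event and action strings are GitHub's; the returned labels stand in for the actual `githubState.*` actor calls and are illustrative only.

```typescript
// Hypothetical sketch of the app-shell webhook dispatch onto the
// github-state actor's actions.
type Dispatch =
  | { action: "fullSync"; force: true }
  | { action: "clearState" }
  | { action: "handlePullRequestWebhook" }
  | { action: "log" }; // push, create, delete, check_run, ... for now

function dispatchWebhook(event: string, action?: string): Dispatch {
  if (event === "installation") {
    if (action === "created" || action === "unsuspend") {
      return { action: "fullSync", force: true };
    }
    if (action === "deleted" || action === "suspend") {
      return { action: "clearState" };
    }
  }
  if (event === "installation_repositories") {
    return { action: "fullSync", force: true };
  }
  if (event === "pull_request") {
    return { action: "handlePullRequestWebhook" };
  }
  return { action: "log" };
}
```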
- -### Webhook Dispatch (in app-shell) - -Replace the current TODO at `app-shell.ts:1521` with dispatch logic adapted from checkpoint `0aca2c7:foundry/packages/backend/src/actors/organization/app-shell.ts` lines 1056-1180: - -| Webhook event | Action | -|---|---| -| `installation.created` | `githubState.fullSync({ force: true })` | -| `installation.deleted` | `githubState.clearState(...)` | -| `installation.suspend` | `githubState.clearState(...)` | -| `installation.unsuspend` | `githubState.fullSync({ force: true })` | -| `installation_repositories` | `githubState.fullSync({ force: true })` | -| `pull_request` (any action) | `githubState.handlePullRequestWebhook(...)` | -| `push`, `create`, `delete`, `check_run`, `check_suite`, `status`, `pull_request_review`, `pull_request_review_comment` | Log for now, extend later | - -### Downstream Notifications - -When `github-state` receives a PR update (webhook or manual reload), it should: - -1. Update its own `github_pull_requests` table -2. Call `notifyOrganizationUpdated()` → which broadcasts `organizationUpdated` to connected clients -3. If the PR branch matches an existing task's branch, update that task's `pullRequest` summary in the organization actor - -### Organization Summary Changes - -Extend `OrganizationSummarySnapshot` to include open PRs: - -```typescript -export interface OrganizationSummarySnapshot { - organizationId: string; - repos: WorkbenchRepoSummary[]; - taskSummaries: WorkbenchTaskSummary[]; - openPullRequests: WorkbenchOpenPrSummary[]; // NEW -} - -export interface WorkbenchOpenPrSummary { - prId: string; // "repoId#number" - repoId: string; - repoFullName: string; - number: number; - title: string; - state: string; - url: string; - headRefName: string; - baseRefName: string; - authorLogin: string | null; - isDraft: boolean; - updatedAtMs: number; -} -``` - -The organization actor fetches open PRs from the `github-state` actor when building the summary snapshot. 
PRs that already have an associated task (matched by branch name) should be excluded from `openPullRequests` (they already appear in `taskSummaries` with their `pullRequest` field populated). - -### Interest Manager - -The `organization` subscription topic already returns `OrganizationSummarySnapshot`. Adding `openPullRequests` to that type means the sidebar automatically gets PR data without a new topic. - -`organizationUpdated` events should include a new variant for PR changes: -```typescript -{ type: "pullRequestUpdated", pullRequest: WorkbenchOpenPrSummary } -{ type: "pullRequestRemoved", prId: string } -``` - -### Sidebar Changes - -The left sidebar currently renders `repositories: RepositorySection[]` where each repository has `tasks: Task[]`. Extend this to include open PRs as lightweight entries within each repository section: - -- Open PRs appear in the same list as tasks, sorted by `updatedAtMs` -- PRs should be visually distinct: show PR icon instead of task indicator, display `#number` and author -- Clicking a PR creates a task lazily (creates the task + sandbox on demand), then navigates to it -- PRs that already have a task are filtered out (they show as the task instead) - -This is similar to what `buildPrTasks()` does in the mock data (`workbench-model.ts:1154-1182`), but driven by real data from the `github-state` actor. - -### Frontend: Manual Reload - -Add a "three dots" menu button in the top-right of the sidebar header. Dropdown options: - -- **Reload organization** — calls `githubState.reloadOrganization()` via backend API -- **Reload all PRs** — calls `githubState.fullSync({ force: true })` (convenience shortcut) - -For per-repo and per-PR reload, add context menu options: -- Right-click a repository header → "Reload repository" -- Right-click a PR entry → "Reload pull request" - -These call the corresponding `reloadRepository`/`reloadPullRequest` actions on the `github-state` actor. - -## Deletions - -Files/code to remove: - -1. 
`foundry/packages/backend/src/actors/repository-pr-sync/` — entire directory -2. `foundry/packages/backend/src/actors/repository/db/schema.ts` — `prCache` table -3. `foundry/packages/backend/src/actors/repository/actions.ts` — `applyPrSyncResultMutation`, `getPullRequestForBranch` (moves to github-state), `prSyncStatus`/`prSyncAt` from `getRepoOverview` -4. `foundry/packages/backend/src/actors/handles.ts` — `getOrCreateProjectPrSync`, `selfProjectPrSync` -5. `foundry/packages/backend/src/actors/keys.ts` — any PR sync key helper -6. `foundry/packages/backend/src/actors/index.ts` — `repositoryPrSync` import and registration -7. All call sites in `RepositoryActor` that spawn or call the PR sync actor (`initProject`, `refreshProject`) - -## Migration Path - -The `prCache` table in `RepositoryActor`'s DB can simply be dropped — no data migration needed since the `github-state` actor will re-fetch everything on its first `fullSync`. Existing task `pullRequest` fields are populated from the github-state actor going forward. - -## Implementation Order - -1. Create `github-state` actor (adapt from checkpoint `0aca2c7`) -2. Wire up actor in registry, handles, keys -3. Implement webhook dispatch in app-shell (replace TODO) -4. Delete `ProjectPrSyncActor` and `prCache` from repository actor -5. Add manual reload actions to github-state -6. Extend `OrganizationSummarySnapshot` with `openPullRequests` -7. Wire through subscription manager + organization events -8. Update sidebar to render open PRs -9. Add three-dots menu with reload options -10. Update task creation flow for lazy PR→task conversion diff --git a/foundry/research/specs/remove-local-git-clone.md b/foundry/research/specs/remove-local-git-clone.md deleted file mode 100644 index 261ffc2..0000000 --- a/foundry/research/specs/remove-local-git-clone.md +++ /dev/null @@ -1,381 +0,0 @@ -# Remove Local Git Clone from Backend - -## Goal - -The Foundry backend stores zero git state. 
No clones, no refs, no working trees, no git-spice. All git operations execute inside sandboxes. Repo metadata (branches, default branch, PRs) comes from GitHub API/webhooks which we already have. - -## Terminology renames - -Rename Foundry domain terms across the entire `foundry/` directory. All changes are breaking — no backwards compatibility needed. Execute as separate atomic commits in this order. `pnpm -w typecheck && pnpm -w build && pnpm -w test` must pass between each. - -| New name | Old name (current code) | -|---|---| -| **Organization** | Workspace | -| **Repository** | Project | -| **Session** (not "tab") | Tab / Session (mixed) | -| **Subscription** | Interest | -| **SandboxProviderId** | ProviderId | - -### Rename 1: `interest` → `subscription` - -The realtime pub/sub system in `client/src/interest/`. Rename the directory, all types (`InterestManager` → `SubscriptionManager`, `MockInterestManager` → `MockSubscriptionManager`, `RemoteInterestManager` → `RemoteSubscriptionManager`, `DebugInterestTopic` → `DebugSubscriptionTopic`), the `useInterest` hook → `useSubscription`, and all imports in client + frontend. Rename `frontend/src/lib/interest.ts` → `subscription.ts`. Rename test file `client/test/interest-manager.test.ts` → `subscription-manager.test.ts`. - -### Rename 2: `tab` → `session` - -The UI "tab" concept is really a session. Rename `TabStrip` → `SessionStrip`, `tabId` → `sessionId`, `closeTab` → `closeSession`, `addTab` → `addSession`, `WorkbenchAgentTab` → `WorkbenchAgentSession`, `TaskWorkbenchTabInput` → `TaskWorkbenchSessionInput`, `TaskWorkbenchAddTabResponse` → `TaskWorkbenchAddSessionResponse`, and all related props/DOM attrs (`activeTabId` → `activeSessionId`, `onSwitchTab` → `onSwitchSession`, `onCloseTab` → `onCloseSession`, `data-tab` → `data-session`, `editingSessionTabId` → `editingSessionId`). Rename file `tab-strip.tsx` → `session-strip.tsx`. 
**Leave "diff tabs" alone** (`isDiffTab`, `diffTabId`) — those are file viewer panes, a different concept. - -### Rename 3: `ProviderId` → `SandboxProviderId` - -The `ProviderId` type (`"e2b" | "local"`) is specifically a sandbox provider. Rename the type (`ProviderId` → `SandboxProviderId`), schema (`ProviderIdSchema` → `SandboxProviderIdSchema`), and all `providerId` fields that refer to sandbox hosting (`CreateTaskInput`, `TaskRecord`, `SwitchResult`, `WorkbenchSandboxSummary`, task DB schema `task.provider_id` → `sandbox_provider_id`, `task_sandboxes.provider_id` → `sandbox_provider_id`, topic params). Rename config key `providers` → `sandboxProviders`. DB column renames need Drizzle migrations. - -**Do NOT rename**: `model.provider` (AI model provider), `auth_account_index.provider_id` (auth provider), `providerAgent()` (model→agent mapping), `WorkbenchModelGroup.provider`. - -Also **delete the `providerProfiles` table entirely** — it's written but never read (dead code). Remove the table definition from the organization actor DB schema, all writes in organization actions, and the `refreshProviderProfiles` queue command/handler/interface. - -### Rename 4: `project` → `repository` - -The "project" actor/entity is a git repository. 
Rename: -- Actor directory `actors/project/` → `actors/repository/` -- Actor directory `actors/project-branch-sync/` → `actors/repository-branch-sync/` -- Actor registry keys `project` → `repository`, `projectBranchSync` → `repositoryBranchSync` -- Actor name string `"Project"` → `"Repository"` -- All functions: `projectKey` → `repositoryKey`, `getOrCreateProject` → `getOrCreateRepository`, `getProject` → `getRepository`, `selfProject` → `selfRepository`, `projectBranchSyncKey` → `repositoryBranchSyncKey`, `projectPrSyncKey` → `repositoryPrSyncKey`, `projectWorkflowQueueName` → `repositoryWorkflowQueueName` -- Types: `ProjectInput` → `RepositoryInput`, `WorkbenchProjectSection` → `WorkbenchRepositorySection`, `PROJECT_QUEUE_NAMES` → `REPOSITORY_QUEUE_NAMES` -- Queue names: `"project.command.*"` → `"repository.command.*"` -- Actor key strings: change `"project"` to `"repository"` in key arrays (e.g. `["ws", id, "project", repoId]` → `["org", id, "repository", repoId]`) -- Frontend: `projects` → `repositories`, `collapsedProjects` → `collapsedRepositories`, `hoveredProjectId` → `hoveredRepositoryId`, `PROJECT_COLORS` → `REPOSITORY_COLORS`, `data-project-*` → `data-repository-*`, `groupWorkbenchProjects` → `groupWorkbenchRepositories` -- Client keys: `projectKey()` → `repositoryKey()`, `projectBranchSyncKey()` → `repositoryBranchSyncKey()`, `projectPrSyncKey()` → `repositoryPrSyncKey()` - -### Rename 5: `workspace` → `organization` - -The "workspace" is really an organization. 
Rename: -- Actor directory `actors/workspace/` → `actors/organization/` -- Actor registry key `workspace` → `organization` -- Actor name string `"Workspace"` → `"Organization"` -- All types: `WorkspaceIdSchema` → `OrganizationIdSchema`, `WorkspaceId` → `OrganizationId`, `WorkspaceEvent` → `OrganizationEvent`, `WorkspaceSummarySnapshot` → `OrganizationSummarySnapshot`, `WorkspaceUseInputSchema` → `OrganizationUseInputSchema`, `WorkspaceHandle` → `OrganizationHandle`, `WorkspaceTopicParams` → `OrganizationTopicParams` -- All `workspaceId` fields/params → `organizationId` (~20+ schemas in contracts.ts, plus topic params, task snapshot, etc.) -- `FoundryOrganization.workspaceId` → `FoundryOrganization.organizationId` (or just `id`) -- All functions: `workspaceKey` → `organizationKey`, `getOrCreateWorkspace` → `getOrCreateOrganization`, `selfWorkspace` → `selfOrganization`, `resolveWorkspaceId` → `resolveOrganizationId`, `defaultWorkspace` → `defaultOrganization`, `workspaceWorkflowQueueName` → `organizationWorkflowQueueName`, `WORKSPACE_QUEUE_NAMES` → `ORGANIZATION_QUEUE_NAMES` -- Actor key strings: change `"ws"` to `"org"` in key arrays (e.g. 
`["ws", id]` → `["org", id]`) -- Queue names: `"workspace.command.*"` → `"organization.command.*"` -- Topic keys: `"workspace:${id}"` → `"organization:${id}"`, event `"workspaceUpdated"` → `"organizationUpdated"` -- Methods: `connectWorkspace` → `connectOrganization`, `getWorkspaceSummary` → `getOrganizationSummary`, `useWorkspace` → `useOrganization` -- Files: `shared/src/workspace.ts` → `organization.ts`, `backend/src/config/workspace.ts` → `organization.ts` -- Config keys: `config.workspace.default` → `config.organization.default` -- URL paths: `/workspaces/$workspaceId` → `/organizations/$organizationId` -- UI strings: `"Loading workspace..."` → `"Loading organization..."` -- Tests: rename `workspace-*.test.ts` files, update `workspaceSnapshot()` → `organizationSnapshot()`, `workspaceId: "ws-1"` → `organizationId: "org-1"` - -### After all renames: update CLAUDE.md files - -Update `foundry/CLAUDE.md` and `foundry/packages/backend/CLAUDE.md` to use new terminology throughout (organization instead of workspace, repository instead of project, etc.). The rest of this spec already uses the new names. 
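The mechanical part of these renames can be scripted. A rough sketch for rename 5, with the pair list abbreviated (it runs against a throwaway demo tree so it is safe to execute as-is; point it at the real `foundry/` tree only after extending and reviewing the pair list):

```shell
# Hypothetical codemod sketch for rename 5 (workspace -> organization).
set -eu
# Demo tree standing in for the foundry/ packages being renamed.
SRC_DIR="$(mktemp -d)"
cat > "$SRC_DIR/sample.ts" <<'EOF'
const workspaceId = "ws-1";
export function getOrCreateWorkspace() {}
EOF
# Longest identifiers first so workspaceId is rewritten before bare workspace.
for pair in \
  "WorkspaceSummarySnapshot:OrganizationSummarySnapshot" \
  "getOrCreateWorkspace:getOrCreateOrganization" \
  "workspaceId:organizationId" \
  "workspace:organization" \
  "Workspace:Organization"; do
  old="${pair%%:*}"; new="${pair##*:}"
  grep -rl "$old" "$SRC_DIR" | while read -r f; do
    sed -i "s/${old}/${new}/g" "$f"   # GNU sed; BSD sed needs -i ''
  done
done
cat "$SRC_DIR/sample.ts"
```

Note that the `"ws-1"` string is deliberately untouched: the `"ws"` -> `"org"` actor key strings, queue names, and topic keys need their own reviewed pass rather than a blind substitution.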
- -## What gets deleted - -### Entire directories/files - -| Path (relative to `packages/backend/src/`) | Reason | -|---|---| -| `integrations/git/index.ts` | All local git operations | -| `integrations/git-spice/index.ts` | Stack management via git-spice | -| `actors/repository-branch-sync/` (currently `project-branch-sync/`) | Polling actor that fetches + reads local clone every 5s | -| `actors/project-pr-sync/` | Empty directory, already dead | -| `actors/repository/stack-model.ts` (currently `project/stack-model.ts`) | Stack parent/sort model (git-spice dependent) | -| `test/git-spice.test.ts` | Tests for deleted git-spice integration | -| `test/git-validate-remote.test.ts` | Tests for deleted git validation | -| `test/stack-model.test.ts` | Tests for deleted stack model | - -### Driver interfaces removed from `driver.ts` - -- `GitDriver` — entire interface deleted -- `StackDriver` — entire interface deleted -- `BackendDriver.git` — removed -- `BackendDriver.stack` — removed -- All imports from `integrations/git/` and `integrations/git-spice/` - -`BackendDriver` keeps only `github` and `tmux`. 
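The resulting driver surface can be sketched as follows. The member method lists are illustrative placeholders, not the real `GithubDriver`/`TmuxDriver` interfaces; only the top-level shape (exactly `github` and `tmux`, no `git` or `stack`) reflects this spec.

```typescript
// Hypothetical sketch of BackendDriver after GitDriver/StackDriver removal.
interface GithubDriver {
  // Placeholder method; the real interface is GitHub-API based.
  getRepo(owner: string, repo: string): Promise<{ defaultBranch: string }>;
}
interface TmuxDriver {
  // Placeholder method; unrelated to git, unchanged by this spec.
  run(sessionName: string, command: string): Promise<void>;
}
// No local-git surface remains on the driver.
interface BackendDriver {
  github: GithubDriver;
  tmux: TmuxDriver;
}
```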
- -### Test driver cleanup (`test/helpers/test-driver.ts`) - -- Delete `createTestGitDriver()` -- Delete `createTestStackDriver()` -- Remove `git` and `stack` from `createTestDriver()` - -### Docker volume removed (`compose.dev.yaml`, `compose.preview.yaml`) - -- Remove `foundry_git_repos` volume and its mount at `/root/.local/share/foundry/repos` -- Remove the CLAUDE.md note about the repos volume - -### Actor registry cleanup (`actors/index.ts`, `actors/keys.ts`, `actors/handles.ts`) - -- Remove `RepositoryBranchSyncActor` (currently `ProjectBranchSyncActor`) registration -- Remove `repositoryBranchSyncKey` (currently `projectBranchSyncKey`) -- Remove branch sync handle helpers - -### Client key cleanup (`packages/client/src/keys.ts`, `packages/client/test/keys.test.ts`) - -- Remove `repositoryBranchSyncKey` (currently `projectBranchSyncKey`) if exported - -### Dead code removal: `providerProfiles` table - -The `providerProfiles` table in the organization actor (currently workspace actor) DB is written but never read. Delete: - -- Table definition in `actors/organization/db/schema.ts` (currently `workspace/db/schema.ts`) -- All writes in `actors/organization/actions.ts` (currently `workspace/actions.ts`) -- The `refreshProviderProfiles` queue command and handler -- The `RefreshProviderProfilesCommand` interface -- Add a DB migration to drop the `provider_profiles` table - -### Ensure pattern cleanup (`actors/repository/actions.ts`, currently `project/actions.ts`) - -Delete all `ensure*` functions that block action handlers on external I/O or cross-actor fan-out: - -- **`ensureLocalClone()`** — Delete (git clone removal). -- **`ensureProjectReady()`** / **`ensureRepositoryReady()`** — Delete (wrapper around `ensureLocalClone` + sync actors). -- **`ensureProjectReadyForRead()`** / **`ensureRepositoryReadyForRead()`** — Delete (dispatches ensure with 10s wait on read path). 
-- **`ensureProjectSyncActors()`** / **`ensureRepositorySyncActors()`** — Delete (spawns branch sync actor which is being removed). -- **`forceProjectSync()`** / **`forceRepositorySync()`** — Delete (triggers branch sync actor). -- **`ensureTaskIndexHydrated()`** — Delete. This is the migration path from `HistoryActor` → `task_index` table. Since we assume fresh repositories, no migration needed. The task index is populated on write (`createTask` inserts the row). -- **`ensureTaskIndexHydratedForRead()`** — Delete (wrapper that dispatches `hydrateTaskIndex`). -- **`taskIndexHydrated` state flag** — Delete from repository actor state. - -The `ensureAskpassScript()` is fine — it's a fast local operation. - -### Dead schema tables and helpers (`actors/repository/db/schema.ts`, `actors/repository/actions.ts`) - -With the branch sync actor and git-spice stack operations deleted, these tables have no writer and should be removed: - -- **`branches` table** — populated by `RepositoryBranchSyncActor` from the local clone. Delete the table, its schema definition, and all reads from it (including `enrichTaskRecord` which reads `diffStat`, `hasUnpushed`, `conflictsWithMain`, `parentBranch` from this table). -- **`repoActionJobs` table** — populated by `runRepoStackAction()` for git-spice stack operations. Delete the table, its schema definition, and all helpers: `ensureRepoActionJobsTable()`, `writeRepoActionJob()`, `listRepoActionJobRows()`. - -## What gets modified - -### `actors/repository/actions.ts` (currently `project/actions.ts`) - -This is the biggest change. Current git operations in this file: - -1. **`createTaskMutation()`** — Currently calls `listLocalRemoteRefs` to check branch name conflicts against remote branches. Replace: branch conflict checking uses only the repository actor's `task_index` table (which branches are already taken by tasks). 
We don't need to check against remote branches — if the branch already exists on the remote, `git push` in the sandbox will handle it. -2. **`registerTaskBranch()`** — Currently does `fetch` + `remoteDefaultBaseRef` + `revParse` + git-spice stack tracking. Replace: default base branch comes from GitHub repo metadata (already stored from webhook/API at repo add time). SHA resolution is not needed at task creation — the sandbox handles it. Delete all git-spice stack tracking. -3. **`getRepoOverview()`** — Currently calls `listLocalRemoteRefs` + `remoteDefaultBaseRef` + `stack.available` + `stack.listStack`. Replace: branch data comes from GitHub API data we already store from webhooks (push/create/delete events feed branch state). Stack data is deleted. The overview returns branches from stored GitHub webhook data. -4. **`runRepoStackAction()`** — Delete entirely (all git-spice stack operations). -5. **All `normalizeBaseBranchName` imports from git-spice** — Inline or move to a simple utility if still needed. -6. **All `ensureTaskIndexHydrated*` / `ensureRepositoryReady*` call sites** — Remove. Read actions query the `task_index` table directly; if it's empty, it's empty. Write actions populate it on create. - -### `actors/repository/index.ts` (currently `project/index.ts`) - -- Remove local clone path from state/initialization -- Remove branch sync actor spawning -- Remove any `ensureLocalClone` calls in lifecycle - -### `actors/task/workbench.ts` - -- **`ensureSandboxRepo()` line 405**: Currently calls `driver.git.remoteDefaultBaseRef()` on the local clone. Replace: read default branch from repository actor state (which gets it from GitHub API/webhook data at repo add time). - -### `actors/organization/actions.ts` (currently `workspace/actions.ts`) - -- **`addRemote()` line 320**: Currently calls `driver.git.validateRemote()` which runs `git ls-remote`. Replace: validate via GitHub API — `GET /repos/{owner}/{repo}` returns 404 for invalid repos. 
We already parse the remote URL into owner/repo for GitHub operations. - -### `actors/keys.ts` / `actors/handles.ts` - -- Remove `repositoryBranchSyncKey` (currently `projectBranchSyncKey`) export -- Remove branch sync handle creation - -## What stays the same - -- `driver.github.*` — already uses GitHub API, no changes -- `driver.tmux.*` — unrelated, no changes -- `integrations/github/index.ts` — already GitHub API based, keeps working -- All sandbox execution (`executeInSandbox()`) — already correct pattern -- Webhook handlers for push/create/delete events — already feed GitHub data into backend - -## CLAUDE.md updates - -### `foundry/packages/backend/CLAUDE.md` - -Remove `RepositoryBranchSyncActor` (currently `ProjectBranchSyncActor`) from the actor hierarchy tree: - -```text -OrganizationActor -├─ HistoryActor(organization-scoped global feed) -├─ GithubDataActor -├─ RepositoryActor(repo) -│ └─ TaskActor(task) -│ ├─ TaskSessionActor(session) x N -│ │ └─ SessionStatusSyncActor(session) x 0..1 -│ └─ Task-local workbench state -└─ SandboxInstanceActor(sandboxProviderId, sandboxId) x N -``` - -Add to Ownership Rules: - -> - The backend stores no local git state. No clones, no refs, no working trees, no git-spice. Repo metadata (branches, default branch) comes from GitHub API and webhook events. All git operations that require a working tree execute inside sandboxes via `executeInSandbox()`. - -### `foundry/CLAUDE.md` - -Add a new section: - -```markdown -## Git State Policy - -- The backend stores **zero git state**. No local clones, no refs, no working trees, no git-spice. -- Repo metadata (branches, default branch, PRs) comes from GitHub API and webhook events already flowing into the system. -- All git operations that require a working tree (diff, push, conflict check, rev-parse) execute inside the task's sandbox via `executeInSandbox()`. -- Do not add local git clone paths, `git fetch`, `git for-each-ref`, or any direct git CLI calls to the backend. 
If you need git data, either read it from stored GitHub webhook/API data or run it in a sandbox. -- The `BackendDriver` has no `GitDriver` or `StackDriver`. Only `GithubDriver` and `TmuxDriver` remain. -- git-spice is not used anywhere in the system. -``` - -Remove from CLAUDE.md: - -> - Docker dev: `compose.dev.yaml` mounts a named volume at `/root/.local/share/foundry/repos` to persist backend-managed git clones across restarts. Code must still work if this volume is not present (create directories as needed). - -## Concerns - -1. **Concurrent agent work**: Another agent is currently modifying `workspace/actions.ts`, `project/actions.ts`, `task/workbench.ts`, `task/workflow/init.ts`, `task/workflow/queue.ts`, `driver.ts`, and `project-branch-sync/index.ts`. Those changes are adding `listLocalRemoteRefs` to the driver and removing polling loops/timeouts. The git clone removal work will **delete** the code the other agent is modifying. Coordinate: let the other agent's changes land first, then this spec deletes the git integration entirely. - -2. **Rename ordering**: The rename spec (workspace→organization, project→repository, etc.) should ideally land **before** this spec is executed, so the file paths and identifiers match. If not, the implementing agent should map old names → new names using the table above. - -3. **`project-pr-sync/` directory**: This is already an empty directory. Delete it as part of cleanup. - -4. **`ensureRepoActionJobsTable()`**: The current spec mentions this should stay but the `repoActionJobs` table is being deleted. Updating: both the table and the ensure function should be deleted. - -## Validation - -After implementation, run: - -```bash -pnpm -w typecheck -pnpm -w build -pnpm -w test -``` - -Then restart the dev stack and run the main user flow end-to-end: - -```bash -just foundry-dev-down && just foundry-dev -``` - -Verify: -1. Add a repo to an organization -2. Create a task (should return immediately with taskId) -3. 
Task appears in sidebar with pending status
-4. Task provisions and transitions to ready
-5. Session is created and initial message is sent
-6. Agent responds in the session transcript
-
-This must work against a real GitHub repo (`rivet-dev/sandbox-agent-testing`) with the dev environment credentials.
-
-### Codebase grep validation
-
-After implementation, verify no local git operations or git-spice references remain in the backend:
-
-```bash
-# No local git CLI calls (excludes integrations/github which is GitHub API, not local git)
-rg -l 'execFileAsync\("git"' foundry/packages/backend/src/ && echo "FAIL: local git CLI calls found" || echo "PASS"
-
-# No git-spice references
-rg -l 'git.spice|gitSpice|git_spice' foundry/packages/backend/src/ && echo "FAIL: git-spice references found" || echo "PASS"
-
-# No GitDriver or StackDriver references
-rg -l 'GitDriver|StackDriver' foundry/packages/backend/src/ && echo "FAIL: deleted driver interfaces still referenced" || echo "PASS"
-
-# No local clone path references
-rg -l 'localPath|ensureCloned|ensureLocalClone|foundryRepoClonePath' foundry/packages/backend/src/ && echo "FAIL: local clone references found" || echo "PASS"
-
-# No branch sync actor references
-rg -l 'BranchSync|branchSync|branch.sync' foundry/packages/backend/src/ && echo "FAIL: branch sync references found" || echo "PASS"
-
-# No deleted ensure patterns
-rg -l 'ensureProjectReady|ensureTaskIndexHydrated|taskIndexHydrated' foundry/packages/backend/src/ && echo "FAIL: deleted ensure patterns found" || echo "PASS"
-
-# integrations/git/ and integrations/git-spice/ directories should not exist
-ls foundry/packages/backend/src/integrations/git/index.ts 2>/dev/null && echo "FAIL: git integration not deleted" || echo "PASS"
-ls foundry/packages/backend/src/integrations/git-spice/index.ts 2>/dev/null && echo "FAIL: git-spice integration not deleted" || echo "PASS"
-```
-
-All checks must pass before the change is considered complete.
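Each check above prints PASS or FAIL but always exits 0, so a CI job running them would never actually fail. A minimal aggregation sketch (using `grep -rE` as a portable stand-in for `rg`; the `check` helper and its wiring are illustrative, not part of the spec):

```shell
#!/bin/sh
# Sketch: run each forbidden-pattern check and record an overall failure flag.
# check ROOT PATTERN DESC prints PASS/FAIL for one pattern under one root.
fail=0

check() {
  root="$1"; pattern="$2"; desc="$3"
  # grep -rEl: recursive search, extended regex, list matching files only.
  if grep -rEl "$pattern" "$root" >/dev/null 2>&1; then
    echo "FAIL: $desc"
    fail=1
  else
    echo "PASS: $desc"
  fi
}

check foundry/packages/backend/src 'execFileAsync\("git"' "local git CLI calls found"
check foundry/packages/backend/src 'git.spice|gitSpice|git_spice' "git-spice references found"
check foundry/packages/backend/src 'GitDriver|StackDriver' "deleted driver interfaces still referenced"

# In CI, end the script with: exit "$fail"
```

The same pattern extends to the remaining greps; only the pattern/description pairs change.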
-
-### Rename verification
-
-After the rename spec has landed, verify no old names remain anywhere in `foundry/`:
-
-```bash
-# --- workspace → organization ---
-# No "WorkspaceActor", "WorkspaceEvent", "WorkspaceId", "WorkspaceSummary", etc. (exclude pnpm-workspace.yaml, node_modules, .turbo)
-rg -l 'WorkspaceActor|WorkspaceEvent|WorkspaceId|WorkspaceSummary|WorkspaceHandle|WorkspaceUseInput|WorkspaceTopicParams' foundry/packages/ --glob '!node_modules' && echo "FAIL: workspace type references remain" || echo "PASS"
-
-# No workspaceId in domain code (exclude pnpm-workspace, node_modules, .turbo, this spec file)
-rg -l 'workspaceId' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: workspaceId references remain" || echo "PASS"
-
-# No workspace actor directory
-ls foundry/packages/backend/src/actors/workspace/ 2>/dev/null && echo "FAIL: workspace actor directory not renamed" || echo "PASS"
-
-# No workspaceKey function
-rg 'workspaceKey|selfWorkspace|getOrCreateWorkspace|resolveWorkspaceId|defaultWorkspace' foundry/packages/ --glob '!node_modules' && echo "FAIL: workspace function references remain" || echo "PASS"
-
-# No "ws" actor key string (the old key prefix)
-rg '"\\"ws\\""|\["ws"' foundry/packages/ --glob '!node_modules' && echo "FAIL: old 'ws' actor key strings remain" || echo "PASS"
-
-# No workspace queue names
-rg 'workspace\.command\.' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: workspace queue names remain" || echo "PASS"
-
-# No /workspaces/ URL paths
-rg '/workspaces/' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: /workspaces/ URL paths remain" || echo "PASS"
-
-# No config.workspace
-rg 'config\.workspace' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: config.workspace references remain" || echo "PASS"
-
-# --- project → repository ---
-# No ProjectActor, ProjectInput, ProjectSection, etc.
-rg -l 'ProjectActor|ProjectInput|ProjectSection|PROJECT_QUEUE|PROJECT_COLORS' foundry/packages/ --glob '!node_modules' && echo "FAIL: project type references remain" || echo "PASS"
-
-# No project actor directory
-ls foundry/packages/backend/src/actors/project/ 2>/dev/null && echo "FAIL: project actor directory not renamed" || echo "PASS"
-
-# No projectKey, selfProject, getOrCreateProject, etc.
-rg 'projectKey|selfProject|getOrCreateProject|getProject\b|projectBranchSync|projectPrSync|projectWorkflow' foundry/packages/ --glob '!node_modules' && echo "FAIL: project function references remain" || echo "PASS"
-
-# No "project" actor key string
-rg '"\\"project\\""|\[".*"project"' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: old project actor key strings remain" || echo "PASS"
-
-# No project.command.* queue names
-rg 'project\.command\.' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: project queue names remain" || echo "PASS"
-
-# --- tab → session ---
-# No WorkbenchAgentTab, TaskWorkbenchTabInput, TabStrip, tabId (in workbench context)
-rg -l 'WorkbenchAgentTab|TaskWorkbenchTabInput|TaskWorkbenchAddTabResponse|TabStrip' foundry/packages/ --glob '!node_modules' && echo "FAIL: tab type references remain" || echo "PASS"
-
-# No tabId (should be sessionId now)
-rg '\btabId\b' foundry/packages/ --glob '!node_modules' && echo "FAIL: tabId references remain" || echo "PASS"
-
-# No tab-strip.tsx file
-ls foundry/packages/frontend/src/components/mock-layout/tab-strip.tsx 2>/dev/null && echo "FAIL: tab-strip.tsx not renamed" || echo "PASS"
-
-# No closeTab/addTab (should be closeSession/addSession)
-rg '\bcloseTab\b|\baddTab\b' foundry/packages/ --glob '!node_modules' && echo "FAIL: closeTab/addTab references remain" || echo "PASS"
-
-# --- interest → subscription ---
-# No InterestManager, useInterest, etc.
-rg -l 'InterestManager|useInterest|DebugInterestTopic' foundry/packages/ --glob '!node_modules' && echo "FAIL: interest type references remain" || echo "PASS"
-
-# No interest/ directory
-ls foundry/packages/client/src/interest/ 2>/dev/null && echo "FAIL: interest directory not renamed" || echo "PASS"
-
-# --- ProviderId → SandboxProviderId ---
-# No bare ProviderId/ProviderIdSchema (but allow sandboxProviderId, model.provider, auth provider_id)
-rg '\bProviderIdSchema\b|\bProviderId\b' foundry/packages/shared/src/contracts.ts && echo "FAIL: bare ProviderId in contracts.ts" || echo "PASS"
-
-# No bare providerId for sandbox context (check task schema)
-rg '\bproviderId\b' foundry/packages/backend/src/actors/task/db/schema.ts && echo "FAIL: bare providerId in task schema" || echo "PASS"
-
-# No providerProfiles table (dead code, should be deleted)
-rg 'providerProfiles|provider_profiles|refreshProviderProfiles' foundry/packages/ --glob '!node_modules' --glob '!*.md' && echo "FAIL: providerProfiles references remain" || echo "PASS"
-
-# --- Verify new names exist (grep . keeps the file list and fails the pipeline when rg finds nothing) ---
-rg -l 'OrganizationActor|OrganizationEvent|OrganizationId' foundry/packages/ --glob '!node_modules' | head -3 | grep . || echo "WARN: new organization names not found"
-rg -l 'RepositoryActor|RepositoryInput|RepositorySection' foundry/packages/ --glob '!node_modules' | head -3 | grep . || echo "WARN: new repository names not found"
-rg -l 'SubscriptionManager|useSubscription' foundry/packages/ --glob '!node_modules' | head -3 | grep . || echo "WARN: new subscription names not found"
-rg -l 'SandboxProviderIdSchema|SandboxProviderId' foundry/packages/ --glob '!node_modules' | head -3 | grep . || echo "WARN: new sandbox provider names not found"
-```
-
-All checks must pass. False positives from markdown files, comments referencing old names in migration context, or `node_modules` should be excluded via the globs above.
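The `ProviderId` → `SandboxProviderId` checks above enforce a naming distinction in `contracts.ts`. A sketch of the idea being enforced, as a branded type with a hypothetical constructor (the brand and the validation regex are illustrative; the real codebase uses `SandboxProviderIdSchema`):

```typescript
// Hypothetical branded id: renaming ProviderId to SandboxProviderId keeps
// sandbox provider ids from being confused with model or auth provider ids.
export type SandboxProviderId = string & { readonly __brand: "SandboxProviderId" };

// Illustrative constructor; accepts lowercase kebab-case ids only.
export function sandboxProviderId(raw: string): SandboxProviderId {
  if (!/^[a-z][a-z0-9-]*$/.test(raw)) {
    throw new Error(`invalid sandbox provider id: ${raw}`);
  }
  return raw as SandboxProviderId;
}
```

A function typed to take `SandboxProviderId` then rejects a bare `string` at compile time, which is what makes the grep-level rename worth enforcing.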
diff --git a/foundry/research/specs/rivetkit-opentui-migration-plan.md b/foundry/research/specs/rivetkit-opentui-migration-plan.md index 78acccc..d078c9a 100644 --- a/foundry/research/specs/rivetkit-opentui-migration-plan.md +++ b/foundry/research/specs/rivetkit-opentui-migration-plan.md @@ -6,19 +6,19 @@ Date: 2026-02-08 ## Locked Decisions 1. Entire rewrite is TypeScript. All Rust code will be deleted at cutover. -2. Repo stays a single monorepo, managed with `pnpm` organizations + Turborepo. +2. Repo stays a single monorepo, managed with `pnpm` workspaces + Turborepo. 3. `core` package is renamed to `shared`. 4. `integrations` and `providers` live inside the backend package (not top-level packages). 5. Rivet-backed state uses SQLite + Drizzle only. 6. RivetKit dependencies come from local `../rivet` builds only; no published npm packages. -7. Everything is organization-scoped. Organization is configurable from CLI. -8. `ControlPlaneActor` is renamed to `OrganizationActor` (organization coordinator). -9. Every actor key is prefixed by organization. -10. `--organization` is optional; commands resolve organization via flag -> config default -> `default`. +7. Everything is workspace-scoped. Workspace is configurable from CLI. +8. `ControlPlaneActor` is renamed to `WorkspaceActor` (workspace coordinator). +9. Every actor key is prefixed by workspace. +10. `--workspace` is optional; commands resolve workspace via flag -> config default -> `default`. 11. RivetKit local dependency wiring is `link:`-based. 12. Keep the existing config file path (`~/.config/foundry/config.toml`) and evolve keys in place. 13. `.agents` and skill files are in scope for migration updates. -14. Parent orchestration actors (`organization`, `repository`, `task`) use command-only loops with no timeout. +14. Parent orchestration actors (`workspace`, `project`, `task`) use command-only loops with no timeout. 15. 
Periodic syncing/polling runs in dedicated child actors, each with a single timeout cadence. 16. For each actor, define the main loop and exactly what data it mutates; keep single-writer ownership strict. @@ -38,10 +38,10 @@ The core architecture changes from "worktree-per-task" to "provider-selected san 1. Rust binaries/backend removed. 2. Existing IPC replaced by new TypeScript transport. -3. Configuration schema changes for organization selection and sandbox provider defaults. -4. Runtime model changes from global control plane to organization coordinator actor. -5. Database schema migrates to organization + provider + sandbox identity model. -6. Command options evolve to include organization and provider selection. +3. Configuration schema changes for workspace selection and sandbox provider defaults. +4. Runtime model changes from global control plane to workspace coordinator actor. +5. Database schema migrates to workspace + provider + sandbox identity model. +6. Command options evolve to include workspace and provider selection. 
## Monorepo and Build Tooling @@ -49,7 +49,7 @@ Root tooling is standardized: - `pnpm-workspace.yaml` - `turbo.json` -- organization scripts through `pnpm` + `turbo run ...` +- workspace scripts through `pnpm` + `turbo run ...` Target package layout: @@ -59,13 +59,13 @@ packages/ backend/ src/ actors/ - organization.ts - repository.ts + workspace.ts + project.ts task.ts sandbox-instance.ts history.ts - repository-pr-sync.ts - repository-branch-sync.ts + project-pr-sync.ts + project-branch-sync.ts task-status-sync.ts keys.ts events.ts @@ -88,13 +88,13 @@ packages/ server.ts types.ts config/ - organization.ts + workspace.ts backend.ts cli/ # hf command surface src/ commands/ client/ # backend transport client - organization/ # organization selection resolver + workspace/ # workspace selection resolver tui/ # OpenTUI app src/ app/ @@ -111,13 +111,13 @@ CLI and TUI are separate packages in the same monorepo, not separate repositorie Backend actor files and responsibilities: -1. `packages/backend/src/actors/organization.ts` -- `OrganizationActor` implementation. -- Provider profile resolution and organization-level coordination. -- Spawns/routes to `RepositoryActor` handles. +1. `packages/backend/src/actors/workspace.ts` +- `WorkspaceActor` implementation. +- Provider profile resolution and workspace-level coordination. +- Spawns/routes to `ProjectActor` handles. -2. `packages/backend/src/actors/repository.ts` -- `RepositoryActor` implementation. +2. `packages/backend/src/actors/project.ts` +- `ProjectActor` implementation. - Branch snapshot refresh, PR cache orchestration, stream publication. - Routes task actions to `TaskActor`. @@ -134,7 +134,7 @@ Backend actor files and responsibilities: - Writes workflow events to SQLite via Drizzle. 6. `packages/backend/src/actors/keys.ts` -- Organization-prefixed actor key builders/parsers. +- Workspace-prefixed actor key builders/parsers. 7. 
`packages/backend/src/actors/events.ts` - Internal actor event envelopes and stream payload types. @@ -145,13 +145,13 @@ Backend actor files and responsibilities: 9. `packages/backend/src/actors/index.ts` - Actor exports and composition wiring. -10. `packages/backend/src/actors/repository-pr-sync.ts` +10. `packages/backend/src/actors/project-pr-sync.ts` - Read-only PR polling loop (single timeout cadence). -- Sends sync results back to `RepositoryActor`. +- Sends sync results back to `ProjectActor`. -11. `packages/backend/src/actors/repository-branch-sync.ts` +11. `packages/backend/src/actors/project-branch-sync.ts` - Read-only branch snapshot polling loop (single timeout cadence). -- Sends sync results back to `RepositoryActor`. +- Sends sync results back to `ProjectActor`. 12. `packages/backend/src/actors/task-status-sync.ts` - Read-only session/sandbox status polling loop (single timeout cadence). @@ -169,17 +169,17 @@ pnpm build -F rivetkit 2. Consume via local `link:` dependencies to built artifacts. 3. Keep dependency wiring deterministic and documented in repo scripts. -## Organization Model +## Workspace Model -Every command executes against a resolved organization context. +Every command executes against a resolved workspace context. -Organization selection: +Workspace selection: -1. CLI flag: `--organization ` -2. Config default organization +1. CLI flag: `--workspace ` +2. Config default workspace 3. Fallback to `default` -Organization controls: +Workspace controls: 1. provider profile defaults 2. sandbox policy @@ -188,45 +188,45 @@ Organization controls: ## New Actor Implementation Overview -RivetKit registry actor keys are organization-prefixed: +RivetKit registry actor keys are workspace-prefixed: -1. `OrganizationActor` (organization coordinator) -- Key: `["ws", organizationId]` -- Owns organization config/runtime coordination, provider registry, organization health. -- Resolves provider defaults and organization-level policies. +1. 
`WorkspaceActor` (workspace coordinator) +- Key: `["ws", workspaceId]` +- Owns workspace config/runtime coordination, provider registry, workspace health. +- Resolves provider defaults and workspace-level policies. -2. `RepositoryActor` -- Key: `["ws", organizationId, "repository", repoId]` +2. `ProjectActor` +- Key: `["ws", workspaceId, "project", repoId]` - Owns repo snapshot cache and PR cache refresh orchestration. - Routes branch/task commands to task actors. -- Streams repository updates to CLI/TUI subscribers. +- Streams project updates to CLI/TUI subscribers. 3. `TaskActor` -- Key: `["ws", organizationId, "repository", repoId, "task", taskId]` +- Key: `["ws", workspaceId, "project", repoId, "task", taskId]` - Owns task metadata/runtime state. - Creates/resumes sandbox + session through provider adapter. - Handles attach/push/sync/merge/archive/kill and post-idle automation. 4. `SandboxInstanceActor` (optional but recommended) -- Key: `["ws", organizationId, "provider", providerId, "sandbox", sandboxId]` +- Key: `["ws", workspaceId, "provider", providerId, "sandbox", sandboxId]` - Owns sandbox lifecycle, heartbeat, endpoint readiness, recovery. 5. `HistoryActor` -- Key: `["ws", organizationId, "repository", repoId, "history"]` +- Key: `["ws", workspaceId, "project", repoId, "history"]` - Owns `events` writes and workflow timeline completeness. 6. `ProjectPrSyncActor` (child poller) -- Key: `["ws", organizationId, "repository", repoId, "pr-sync"]` -- Polls PR state on interval and emits results to `RepositoryActor`. +- Key: `["ws", workspaceId, "project", repoId, "pr-sync"]` +- Polls PR state on interval and emits results to `ProjectActor`. - Does not write DB directly. 7. `ProjectBranchSyncActor` (child poller) -- Key: `["ws", organizationId, "repository", repoId, "branch-sync"]` -- Polls branch/worktree state on interval and emits results to `RepositoryActor`. 
+- Key: `["ws", workspaceId, "project", repoId, "branch-sync"]` +- Polls branch/worktree state on interval and emits results to `ProjectActor`. - Does not write DB directly. 8. `TaskStatusSyncActor` (child poller) -- Key: `["ws", organizationId, "repository", repoId, "task", taskId, "status-sync"]` +- Key: `["ws", workspaceId, "project", repoId, "task", taskId, "status-sync"]` - Polls agent/session/sandbox health on interval and emits results to `TaskActor`. - Does not write DB directly. @@ -236,10 +236,10 @@ Ownership rule: each table/row has one actor writer. Always define actor run-loop + mutated state together: -1. `OrganizationActor` -- Mutates: `organizations`, `workspace_provider_profiles`. +1. `WorkspaceActor` +- Mutates: `workspaces`, `workspace_provider_profiles`. -2. `RepositoryActor` +2. `ProjectActor` - Mutates: `repos`, `branches`, `pr_cache` (applies child poller results). 3. `TaskActor` @@ -251,30 +251,30 @@ Always define actor run-loop + mutated state together: 5. `HistoryActor` - Mutates: `events`. -6. Child sync actors (`repository-pr-sync`, `repository-branch-sync`, `task-status-sync`) +6. Child sync actors (`project-pr-sync`, `project-branch-sync`, `task-status-sync`) - Mutates: none (read-only pollers; publish result messages only). ## Run Loop Patterns (Required) Parent orchestration actors: no timeout, command-only queue loops. 
-### `OrganizationActor` (no timeout) +### `WorkspaceActor` (no timeout) ```ts run: async (c) => { while (true) { - const msg = await c.queue.next("organization.command"); - await handleOrganizationCommand(c, msg); // writes organization-owned tables only + const msg = await c.queue.next("workspace.command"); + await handleWorkspaceCommand(c, msg); // writes workspace-owned tables only } }; ``` -### `RepositoryActor` (no timeout) +### `ProjectActor` (no timeout) ```ts run: async (c) => { while (true) { - const msg = await c.queue.next("repository.command"); + const msg = await c.queue.next("project.command"); await handleProjectCommand(c, msg); // includes applying sync results to branches/pr_cache } }; @@ -321,10 +321,10 @@ Child sync actors: one timeout each, one cadence each. run: async (c) => { const intervalMs = 30_000; while (true) { - const msg = await c.queue.next("repository.pr_sync.command", { timeout: intervalMs }); + const msg = await c.queue.next("project.pr_sync.command", { timeout: intervalMs }); if (!msg) { const result = await pollPrState(); - await sendToProject({ name: "repository.pr_sync.result", result }); + await sendToProject({ name: "project.pr_sync.result", result }); continue; } await handlePrSyncControl(c, msg); // force/stop/update-interval @@ -338,10 +338,10 @@ run: async (c) => { run: async (c) => { const intervalMs = 5_000; while (true) { - const msg = await c.queue.next("repository.branch_sync.command", { timeout: intervalMs }); + const msg = await c.queue.next("project.branch_sync.command", { timeout: intervalMs }); if (!msg) { const result = await pollBranchState(); - await sendToProject({ name: "repository.branch_sync.result", result }); + await sendToProject({ name: "project.branch_sync.result", result }); continue; } await handleBranchSyncControl(c, msg); @@ -368,7 +368,7 @@ run: async (c) => { ## Sandbox Provider Interface -Provider contract lives under `packages/backend/src/providers/provider-api` and is consumed by 
organization/repository/task actors. +Provider contract lives under `packages/backend/src/providers/provider-api` and is consumed by workspace/project/task actors. ```ts interface SandboxProvider { @@ -398,26 +398,26 @@ Initial providers: - Boots/ensures Sandbox Agent inside sandbox. - Returns endpoint/token for session operations. -## Command Surface (Organization + Provider Aware) +## Command Surface (Workspace + Provider Aware) -1. `hf create ... --organization --provider ` -2. `hf switch --organization [target]` -3. `hf attach --organization [task]` -4. `hf list --organization ` -5. `hf kill|archive|merge|push|sync --organization ...` -6. `hf organization use ` to set default organization +1. `hf create ... --workspace --provider ` +2. `hf switch --workspace [target]` +3. `hf attach --workspace [task]` +4. `hf list --workspace ` +5. `hf kill|archive|merge|push|sync --workspace ...` +6. `hf workspace use ` to set default workspace List/TUI include provider and sandbox health metadata. -`--organization` remains optional; omitted values use the standard resolution order. +`--workspace` remains optional; omitted values use the standard resolution order. ## Data Model v2 (SQLite + Drizzle) All persistent state is SQLite via Drizzle schema + migrations. -Tables (organization-scoped): +Tables (workspace-scoped): -1. `organizations` +1. `workspaces` 2. `workspace_provider_profiles` 3. `repos` (`workspace_id`, `repo_id`, ...) 4. `branches` (`workspace_id`, `repo_id`, ...) @@ -433,10 +433,10 @@ Migration approach: one-way migration from existing schema during TS backend boo 1. TypeScript backend exposes local control API (socket or localhost HTTP). 2. CLI/TUI are thin clients; all mutations go through backend actors. -3. OpenTUI subscribes to repository streams from organization-scoped repository actors. -4. Organization is required context on all backend mutation requests. +3. OpenTUI subscribes to project streams from workspace-scoped project actors. +4. 
Workspace is required context on all backend mutation requests. -CLI/TUI are responsible for resolving organization context before calling backend mutations. +CLI/TUI are responsible for resolving workspace context before calling backend mutations. ## CLI + TUI Packaging @@ -451,10 +451,10 @@ The package still calls the same backend API and shares contracts from `packages ## Implementation Phases -## Phase 0: Contracts and Organization Spec +## Phase 0: Contracts and Workspace Spec -1. Freeze organization model, provider contract, and actor ownership map. -2. Freeze command flags for organization + provider selection. +1. Freeze workspace model, provider contract, and actor ownership map. +2. Freeze command flags for workspace + provider selection. 3. Define Drizzle schema draft and migration plan. Exit criteria: @@ -462,7 +462,7 @@ Exit criteria: ## Phase 1: TypeScript Monorepo Bootstrap -1. Add `pnpm` organization + Turborepo pipeline. +1. Add `pnpm` workspace + Turborepo pipeline. 2. Create `shared`, `backend`, and `cli` packages (with TUI integrated into CLI). 3. Add strict TypeScript config and CI checks. @@ -473,10 +473,10 @@ Exit criteria: 1. Wire local RivetKit dependency from `../rivet`. 2. Add SQLite + Drizzle migrations and query layer. -3. Implement actor registry with organization-prefixed keys. +3. Implement actor registry with workspace-prefixed keys. Exit criteria: -- Backend boot + organization actor health checks pass. +- Backend boot + workspace actor health checks pass. ## Phase 3: Provider Layer in Backend @@ -487,9 +487,9 @@ Exit criteria: Exit criteria: - `create/list/switch/attach/push/sync/kill` pass on worktree provider. -## Phase 4: Organization/Task Lifecycle +## Phase 4: Workspace/Task Lifecycle -1. Implement organization coordinator flows. +1. Implement workspace coordinator flows. 2. Implement TaskActor full lifecycle + post-idle automation. 3. Implement history events and PR/CI/review change tracking. 
@@ -509,7 +509,7 @@ Exit criteria: 1. Build interactive list/switch UI in OpenTUI. 2. Implement key actions (attach/open PR/archive/merge/sync). -3. Add organization switcher UX and provider/sandbox indicators. +3. Add workspace switcher UX and provider/sandbox indicators. Exit criteria: - TUI parity and responsive streaming updates. @@ -534,7 +534,7 @@ Exit criteria: 2. Integration tests - backend + sqlite + provider fakes -- organization isolation boundaries +- workspace isolation boundaries - session recovery and restart handling 3. E2E tests diff --git a/foundry/scripts/build-test-image.sh b/foundry/scripts/build-test-image.sh index a8cae9b..284c8bc 100755 --- a/foundry/scripts/build-test-image.sh +++ b/foundry/scripts/build-test-image.sh @@ -2,7 +2,7 @@ set -euo pipefail echo "Docker integration test image is not part of the TypeScript migration baseline." -echo "Use monorepo tests instead:" +echo "Use workspace tests instead:" echo " pnpm -w typecheck" echo " pnpm -w build" echo " pnpm -w test" diff --git a/foundry/scripts/data/rivet-dev.json b/foundry/scripts/data/rivet-dev.json index 3534cac..2b1b6f0 100644 --- a/foundry/scripts/data/rivet-dev.json +++ b/foundry/scripts/data/rivet-dev.json @@ -1060,7 +1060,7 @@ }, { "number": 222, - "title": "Recover wellington organization state", + "title": "Recover wellington workspace state", "state": "open", "draft": false, "headRefName": "recovery/wellington-20260309", @@ -1070,7 +1070,7 @@ }, { "number": 220, - "title": "Recover lisbon organization state", + "title": "Recover lisbon workspace state", "state": "open", "draft": false, "headRefName": "recovery/lisbon-20260309", @@ -1080,7 +1080,7 @@ }, { "number": 219, - "title": "Recover karachi-v2 organization state", + "title": "Recover karachi-v2 workspace state", "state": "open", "draft": false, "headRefName": "recovery/karachi-v2-20260309", @@ -1090,7 +1090,7 @@ }, { "number": 218, - "title": "Recover hamburg organization state", + "title": "Recover hamburg 
workspace state", "state": "open", "draft": false, "headRefName": "recovery/hamburg-20260309", @@ -1100,7 +1100,7 @@ }, { "number": 217, - "title": "Recover geneva organization state", + "title": "Recover geneva workspace state", "state": "open", "draft": false, "headRefName": "recovery/geneva-20260309", @@ -1110,7 +1110,7 @@ }, { "number": 216, - "title": "Recover edinburgh organization state", + "title": "Recover edinburgh workspace state", "state": "open", "draft": false, "headRefName": "recovery/edinburgh-20260309", diff --git a/foundry/scripts/publish-foundry-base.sh b/foundry/scripts/publish-foundry-base.sh deleted file mode 100755 index c6f1b15..0000000 --- a/foundry/scripts/publish-foundry-base.sh +++ /dev/null @@ -1,65 +0,0 @@ -#!/usr/bin/env bash -# -# Build and push the Foundry base sandbox image to Docker Hub. -# -# Usage: -# ./foundry/scripts/publish-foundry-base.sh # build + push -# ./foundry/scripts/publish-foundry-base.sh --dry-run # build only, no push -# -# Prerequisites: -# - docker login to Docker Hub (rivetdev org) -# - Docker buildx available (ships with Docker Desktop / modern Docker) -# -# The image is tagged: -# rivetdev/sandbox-agent:foundry-base-TZ -# rivetdev/sandbox-agent:foundry-base-latest -# -set -euo pipefail - -REPO_ROOT="$(cd "$(dirname "$0")/../.." 
&& pwd)" -IMAGE="rivetdev/sandbox-agent" -TIMESTAMP="$(date -u '+%Y%m%dT%H%M%SZ')" -TAG_DATED="${IMAGE}:foundry-base-${TIMESTAMP}" -TAG_LATEST="${IMAGE}:foundry-base-latest" -DRY_RUN=false - -for arg in "$@"; do - case "$arg" in - --dry-run) DRY_RUN=true ;; - *) echo "Unknown argument: $arg" >&2; exit 1 ;; - esac -done - -echo "==> Building ${TAG_DATED}" -echo " (also tagged ${TAG_LATEST})" -echo " Platform: linux/amd64" -echo "" - -docker build \ - --platform linux/amd64 \ - -f "${REPO_ROOT}/foundry/docker/foundry-base.Dockerfile" \ - -t "${TAG_DATED}" \ - -t "${TAG_LATEST}" \ - "${REPO_ROOT}" - -echo "" -echo "==> Build complete" -echo " ${TAG_DATED}" -echo " ${TAG_LATEST}" - -if [ "$DRY_RUN" = true ]; then - echo "" - echo "==> Dry run — skipping push" - exit 0 -fi - -echo "" -echo "==> Pushing ${TAG_DATED}" -docker push "${TAG_DATED}" - -echo "==> Pushing ${TAG_LATEST}" -docker push "${TAG_LATEST}" - -echo "" -echo "==> Done" -echo " ${TAG_DATED}" diff --git a/foundry/scripts/pull-org-data.ts b/foundry/scripts/pull-org-data.ts index 3580baa..1759cad 100644 --- a/foundry/scripts/pull-org-data.ts +++ b/foundry/scripts/pull-org-data.ts @@ -2,8 +2,8 @@ /** * Pull public GitHub organization data into a JSON fixture file. * - * This script mirrors the sync logic in the backend organization actor - * (see: packages/backend/src/actors/organization/app-shell.ts — syncGithubOrganizations + * This script mirrors the sync logic in the backend workspace actor + * (see: packages/backend/src/actors/workspace/app-shell.ts — syncGithubOrganizations * and syncGithubOrganizationRepos). Keep the two in sync: when the backend * sync workflow changes what data it fetches or how it structures organizations, * update this script to match. @@ -205,8 +205,8 @@ async function pullOrgData(orgLogin: string): Promise { console.log(` ${members.length} public members`); // 4. 
Fetch open PRs across all public repos
-  // Backend equivalent: open PR metadata is pulled from GitHub and merged into
-  // the organization/repository projections used by the UI.
+  // Backend equivalent: ProjectPrSyncActor polls GitHub for open PRs per repo
+  // and stores them in the pr_cache table on the project actor
   const openPullRequests: OrgFixturePullRequest[] = [];
   for (const repo of repos) {
     const rawPrs = await githubPaginate<{
diff --git a/frontend/packages/inspector/index.html b/frontend/packages/inspector/index.html
index 6a5d064..5893717 100644
--- a/frontend/packages/inspector/index.html
+++ b/frontend/packages/inspector/index.html
@@ -2889,181 +2889,6 @@
       gap: 20px;
     }

-    .desktop-panel {
-      display: flex;
-      flex-direction: column;
-      gap: 16px;
-    }
-
-    .desktop-state-grid {
-      display: grid;
-      grid-template-columns: repeat(3, minmax(0, 1fr));
-      gap: 12px;
-      margin-bottom: 12px;
-    }
-
-    .desktop-start-controls {
-      display: grid;
-      grid-template-columns: repeat(3, minmax(0, 1fr));
-      gap: 10px;
-    }
-
-    .desktop-screenshot-controls {
-      display: flex;
-      align-items: flex-end;
-      gap: 10px;
-      flex-wrap: wrap;
-      margin-bottom: 12px;
-    }
-
-    .desktop-checkbox-label {
-      display: flex;
-      align-items: center;
-      gap: 6px;
-      font-size: 12px;
-      cursor: pointer;
-      white-space: nowrap;
-      padding-bottom: 4px;
-    }
-
-    .desktop-advanced-grid {
-      display: grid;
-      grid-template-columns: repeat(3, minmax(0, 1fr));
-      gap: 10px;
-      margin-top: 8px;
-    }
-
-    .desktop-input-group {
-      display: flex;
-      flex-direction: column;
-      gap: 4px;
-    }
-
-    .desktop-chip-list {
-      display: flex;
-      flex-wrap: wrap;
-      gap: 8px;
-    }
-
-    .desktop-command {
-      margin-top: 6px;
-      padding: 8px 10px;
-      border-radius: var(--radius);
-      border: 1px solid var(--border);
-      background: var(--surface);
-      overflow-x: auto;
-    }
-
-    .desktop-diagnostic-block + .desktop-diagnostic-block {
-      margin-top: 14px;
-    }
-
-    .desktop-process-list {
-      display: flex;
-      flex-direction: column;
-      gap: 10px;
-      margin-top: 8px;
-    }
-
-    .desktop-process-item {
-      padding: 10px;
-      border-radius: var(--radius);
-      border: 1px solid var(--border);
-      background: var(--surface);
-      display: flex;
-      flex-direction: column;
-      gap: 4px;
-    }
-
-    .desktop-clipboard-text {
-      margin: 4px 0 0;
-      padding: 8px 10px;
-      border-radius: var(--radius);
-      border: 1px solid var(--border);
-      background: var(--surface);
-      font-size: 12px;
-      white-space: pre-wrap;
-      word-break: break-all;
-      max-height: 120px;
-      overflow-y: auto;
-    }
-
-    .desktop-window-item {
-      padding: 10px;
-      border-radius: var(--radius);
-      border: 1px solid var(--border);
-      background: var(--surface);
-      display: flex;
-      flex-direction: column;
-      gap: 6px;
-    }
-
-    .desktop-window-focused {
-      border-color: var(--success);
-      box-shadow: inset 0 0 0 1px var(--success);
-    }
-
-    .desktop-window-editor {
-      display: flex;
-      align-items: center;
-      gap: 6px;
-      margin-top: 6px;
-      padding-top: 6px;
-      border-top: 1px solid var(--border);
-    }
-
-    .desktop-launch-row {
-      display: flex;
-      align-items: center;
-      gap: 8px;
-      margin-top: 6px;
-      flex-wrap: wrap;
-    }
-
-    .desktop-mouse-pos {
-      display: flex;
-      align-items: center;
-      gap: 8px;
-      margin-top: 8px;
-    }
-
-    .desktop-stream-hint {
-      display: flex;
-      align-items: center;
-      justify-content: space-between;
-      gap: 12px;
-      margin-bottom: 8px;
-      font-size: 11px;
-      color: var(--muted);
-    }
-
-    .desktop-screenshot-empty {
-      padding: 18px;
-      border: 1px dashed var(--border);
-      border-radius: var(--radius);
-      color: var(--muted);
-      background: var(--surface);
-      text-align: center;
-    }
-
-    .desktop-screenshot-frame {
-      border-radius: calc(var(--radius) + 2px);
-      overflow: hidden;
-      border: 1px solid var(--border);
-      background:
-        linear-gradient(135deg, rgba(15, 23, 42, 0.9), rgba(30, 41, 59, 0.92)),
-        radial-gradient(circle at top right, rgba(56, 189, 248, 0.12), transparent 40%);
-      padding: 10px;
-    }
-
-    .desktop-screenshot-image {
-      display: block;
-      width: 100%;
-      height: auto;
-      border-radius: var(--radius);
-      background: rgba(0, 0, 0, 0.24);
-    }
-
     .processes-section {
       display: flex;
       flex-direction: column;
@@ -3726,12 +3551,6 @@
       grid-template-columns: 1fr;
     }

-    .desktop-state-grid,
-    .desktop-start-controls,
-    .desktop-advanced-grid {
-      grid-template-columns: 1fr;
-    }
-
     .session-sidebar {
       display: none;
     }
diff --git a/frontend/packages/inspector/package.json b/frontend/packages/inspector/package.json
index 6d86357..9671ecb 100644
--- a/frontend/packages/inspector/package.json
+++ b/frontend/packages/inspector/package.json
@@ -6,24 +6,24 @@
   "type": "module",
   "scripts": {
     "dev": "vite",
-    "build": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vite build",
+    "build": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && vite build",
     "preview": "vite preview",
-    "typecheck": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && tsc --noEmit",
-    "test": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/react build && vitest run"
+    "typecheck": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && tsc --noEmit",
+    "test": "SKIP_OPENAPI_GEN=1 pnpm --filter @sandbox-agent/persist-indexeddb build && pnpm --filter @sandbox-agent/react build && vitest run"
   },
   "devDependencies": {
     "@sandbox-agent/react": "workspace:*",
     "sandbox-agent": "workspace:*",
-    "@types/react": "^19.1.12",
-    "@types/react-dom": "^19.1.6",
+    "@types/react": "^18.3.3",
+    "@types/react-dom": "^18.3.0",
     "@vitejs/plugin-react": "^4.3.1",
     "fake-indexeddb": "^6.2.4",
-    "jsdom": "^26.1.0",
     "typescript": "^5.7.3",
     "vite": "^5.4.7",
     "vitest": "^3.0.0"
   },
   "dependencies": {
+    "@sandbox-agent/persist-indexeddb": "workspace:*",
     "lucide-react": "^0.469.0",
     "react": "^18.3.1",
     "react-dom": "^18.3.1"
diff --git a/frontend/packages/inspector/src/App.tsx b/frontend/packages/inspector/src/App.tsx
index f6e319c..a829ae6 100644
--- a/frontend/packages/inspector/src/App.tsx
+++ b/frontend/packages/inspector/src/App.tsx
@@ -24,7 +24,7 @@ type ConfigOption = {
 };
 type AgentModeInfo = { id: string; name: string; description: string };
 type AgentModelInfo = { id: string; name?: string };
-import { IndexedDbSessionPersistDriver } from "./persist-indexeddb";
+import { IndexedDbSessionPersistDriver } from "@sandbox-agent/persist-indexeddb";
 import ChatPanel from "./components/chat/ChatPanel";
 import ConnectScreen from "./components/ConnectScreen";
 import DebugPanel, { type DebugTab } from "./components/debug/DebugPanel";
@@ -286,7 +286,7 @@
   const [highlightedEventId, setHighlightedEventId] = useState(null);
   const [debugPanelCollapsed, setDebugPanelCollapsed] = useState(false);

-  const transcriptScrollRef = useRef(null);
+  const messagesEndRef = useRef(null);
   const clientRef = useRef(null);
   const activeSessionRef = useRef(null);
@@ -1434,6 +1434,10 @@
     });
   }, [connected, sessionId, sessions, getClient, subscribeToSession]);

+  useEffect(() => {
+    messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
+  }, [transcriptEntries]);
+
   const currentAgent = agents.find((agent) => agent.id === agentId);
   const agentLabel = agentDisplayNames[agentId] ?? agentId;
   const selectedSession = sessions.find((s) => s.sessionId === sessionId);
@@ -1739,7 +1743,7 @@
               }
               agentsLoading={agentsLoading}
               agentsError={agentsError}
-              scrollRef={transcriptScrollRef}
+              messagesEndRef={messagesEndRef}
               agentLabel={agentLabel}
               modelLabel={modelPillLabel}
               currentAgentVersion={currentAgent?.version ?? null}
diff --git a/frontend/packages/inspector/src/components/chat/ChatPanel.tsx b/frontend/packages/inspector/src/components/chat/ChatPanel.tsx
index 9203105..5d77a93 100644
--- a/frontend/packages/inspector/src/components/chat/ChatPanel.tsx
+++ b/frontend/packages/inspector/src/components/chat/ChatPanel.tsx
@@ -1,6 +1,6 @@
 import type { TranscriptEntry } from "@sandbox-agent/react";
 import { AlertTriangle, Archive, CheckSquare, MessageSquare, Plus, Square, Terminal } from "lucide-react";
-import { useEffect, useRef, useState, type RefObject } from "react";
+import { useEffect, useRef, useState } from "react";
 import type { AgentInfo } from "sandbox-agent";
 import { formatShortId } from "../../utils/format";
@@ -40,7 +40,7 @@ const ChatPanel = ({
   agents,
   agentsLoading,
   agentsError,
-  scrollRef,
+  messagesEndRef,
   agentLabel,
   modelLabel,
   currentAgentVersion,
@@ -71,7 +71,7 @@
   agents: AgentInfo[];
   agentsLoading: boolean;
   agentsError: string | null;
-  scrollRef: RefObject;
+  messagesEndRef: React.RefObject;
   agentLabel: string;
   modelLabel?: string | null;
   currentAgentVersion?: string | null;
@@ -233,7 +233,7 @@
       entries={transcriptEntries}
       sessionError={sessionError}
       eventError={null}
-      scrollRef={scrollRef}
+      messagesEndRef={messagesEndRef}
       onEventClick={onEventClick}
       isThinking={isThinking}
       agentId={agentId}
diff --git a/frontend/packages/inspector/src/components/chat/InspectorConversation.tsx b/frontend/packages/inspector/src/components/chat/InspectorConversation.tsx
index f14e39e..cb3c1af 100644
--- a/frontend/packages/inspector/src/components/chat/InspectorConversation.tsx
+++ b/frontend/packages/inspector/src/components/chat/InspectorConversation.tsx
@@ -7,7 +7,7 @@
   type TranscriptEntry,
 } from "@sandbox-agent/react";
 import { AlertTriangle, Brain, Check, ChevronDown, ChevronRight, ExternalLink, Info, PlayCircle, Send, Shield, Wrench, X } from "lucide-react";
-import type { ReactNode, RefObject } from "react";
+import type { ReactNode } from "react";
 import MarkdownText from "./MarkdownText";

 const agentLogos: Record = {
@@ -84,7 +84,7 @@ export interface InspectorConversationProps {
   entries: TranscriptEntry[];
   sessionError: string | null;
   eventError?: string | null;
-  scrollRef: RefObject;
+  messagesEndRef: React.RefObject;
   onEventClick?: (eventId: string) => void;
   isThinking?: boolean;
   agentId?: string;
@@ -102,7 +102,7 @@ const InspectorConversation = ({
   entries,
   sessionError,
   eventError,
-  scrollRef,
+  messagesEndRef,
   onEventClick,
   isThinking,
   agentId,
@@ -119,13 +119,12 @@
[The remaining hunks, beginning with an "Agents" sidebar hunk, delete the inspector's desktop debug panel JSX: screenshot controls (format, quality, scale), a Desktop Runtime section (Display/Resolution/Started state grid, width/height/DPI start inputs, and advanced settings for stream frame rate, WebRTC port range, and default recording FPS), a missing-dependencies chip list with install command, a Live View section with screenshot fallback and mouse-position readout, and clipboard read/write controls. The markup was mangled during extraction and only these text fragments survive.]