diff --git a/.claude/commands/post-release-testing.md b/.claude/commands/post-release-testing.md index 09e2b6a..10cf6ff 100644 --- a/.claude/commands/post-release-testing.md +++ b/.claude/commands/post-release-testing.md @@ -43,7 +43,7 @@ Manually verify the install script works in a fresh environment: ```bash docker run --rm alpine:latest sh -c " apk add --no-cache curl ca-certificates libstdc++ libgcc bash && - curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh && + curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && sandbox-agent --version " ``` diff --git a/.gitignore b/.gitignore index 7b6c859..de4d863 100644 --- a/.gitignore +++ b/.gitignore @@ -59,3 +59,4 @@ sdks/cli/platforms/*/bin/ # Foundry desktop app build artifacts foundry/packages/desktop/frontend-dist/ foundry/packages/desktop/src-tauri/sidecars/ +.context/ diff --git a/CLAUDE.md b/CLAUDE.md index 4935aa5..248f075 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -20,20 +20,7 @@ - For HTTP/CLI docs/examples, source of truth is: - `server/packages/sandbox-agent/src/router.rs` - `server/packages/sandbox-agent/src/cli.rs` -- Keep docs aligned to implemented endpoints/commands only (for example ACP under `/v1/acp`, not legacy `/v1/sessions` APIs). - -## E2E Agent Testing - -- When asked to test agents e2e and you do not have the API tokens/credentials required, always stop and ask the user where to find the tokens before proceeding. - -## ACP Adapter Audit - -- `scripts/audit-acp-deps/adapters.json` is the single source of truth for ACP adapter npm packages, pinned versions, and the `@agentclientprotocol/sdk` pin. -- The Rust fallback install path in `server/packages/agent-management/src/agents.rs` reads adapter entries from `adapters.json` at compile time via `include_str!`. -- Run `cd scripts/audit-acp-deps && npx tsx audit.ts` to compare our pinned versions against the ACP registry and npm latest. 
-- When bumping an adapter version, update `adapters.json` only — the Rust code picks it up automatically. -- When adding a new agent, add an entry to `adapters.json` (the `_` fallback arm in `install_agent_process_fallback` handles it). -- When updating the `@agentclientprotocol/sdk` pin, update both `adapters.json` (sdkDeps) and `sdks/acp-http-client/package.json`. +- Keep docs aligned to implemented endpoints/commands only (for example ACP under `/v1/acp`, not legacy session REST APIs). ## Change Tracking @@ -43,41 +30,22 @@ - Regenerate `docs/openapi.json` when HTTP contracts change. - Keep `docs/inspector.mdx` and `docs/sdks/typescript.mdx` aligned with implementation. - Append blockers/decisions to `research/acp/friction.md` during ACP work. -- Each agent has its own doc page at `docs/agents/.mdx` listing models, modes, and thought levels. Update the relevant page when changing `fallback_config_options`. To regenerate capability data, run `cd scripts/agent-configs && npx tsx dump.ts`. Source data: `scripts/agent-configs/resources/*.json` and hardcoded entries in `server/packages/sandbox-agent/src/router/support.rs` (`fallback_config_options`). +- `docs/agent-capabilities.mdx` lists models/modes/thought levels per agent. Update it when adding a new agent or changing `fallback_config_options`. If its "Last updated" date is >2 weeks old, re-run `cd scripts/agent-configs && npx tsx dump.ts` and update the doc to match. Source data: `scripts/agent-configs/resources/*.json` and hardcoded entries in `server/packages/sandbox-agent/src/router/support.rs` (`fallback_config_options`). - Some agent models are gated by subscription (e.g. Claude `opus`). The live report only shows models available to the current credentials. The static doc and JSON resource files should list all known models regardless of subscription tier. 
-## Adding Providers +## Docker Test Image -When adding a new sandbox provider, update all of the following: +- Docker-backed Rust and TypeScript tests build `docker/test-agent/Dockerfile` directly in-process and cache the image tag only in memory (`OnceLock` in Rust, module-level variable in TypeScript). +- Do not add cross-process image-build scripts unless there is a concrete need for them. -- `sdks/typescript/src/providers/.ts` — provider implementation -- `sdks/typescript/package.json` — add `./` export, peerDependencies, peerDependenciesMeta, devDependencies -- `sdks/typescript/tsup.config.ts` — add entry point and external -- `sdks/typescript/tests/providers.test.ts` — add test entry -- `examples//` — create example with `src/index.ts` and `tests/.test.ts` -- `docs/deploy/.mdx` — create deploy guide -- `docs/docs.json` — add to Deploy pages navigation -- `docs/quickstart.mdx` — add tab in "Start the sandbox" step, add credentials entry in "Passing LLM credentials" accordion +## Common Software Sync -## Adding Agents - -When adding a new agent, update all of the following: - -- `docs/agents/.mdx` — create agent page with usage snippet and capabilities table -- `docs/docs.json` — add to the Agents group under Agent -- `docs/quickstart.mdx` — add tab in the "Create a session and send a prompt" CodeGroup - -## Persist Packages (Deprecated) - -- The `@sandbox-agent/persist-*` npm packages (`persist-sqlite`, `persist-postgres`, `persist-indexeddb`, `persist-rivet`) are deprecated stubs. They still publish to npm but throw a deprecation error at import time. 
-- Driver implementations now live inline in examples and consuming packages: - - SQLite: `examples/persist-sqlite/src/persist.ts` - - Postgres: `examples/persist-postgres/src/persist.ts` - - IndexedDB: `frontend/packages/inspector/src/persist-indexeddb.ts` - - Rivet: inlined in `docs/multiplayer.mdx` - - In-memory: built into the main `sandbox-agent` SDK (`InMemorySessionPersistDriver`) -- Docs (`docs/session-persistence.mdx`) link to the example implementations on GitHub instead of referencing the packages. -- Do not re-add `@sandbox-agent/persist-*` as dependencies anywhere. New persist drivers should be copied into the consuming project directly. +- These three files must stay in sync: + - `docs/common-software.mdx` (user-facing documentation) + - `docker/test-common-software/Dockerfile` (packages installed in the test image) + - `server/packages/sandbox-agent/tests/common_software.rs` (test assertions) +- When adding or removing software from `docs/common-software.mdx`, also add/remove the corresponding `apt-get install` line in the Dockerfile and add/remove the test in `common_software.rs`. +- Run `cargo test -p sandbox-agent --test common_software` to verify. 
## Install Version References @@ -93,28 +61,20 @@ When adding a new agent, update all of the following: - `docs/sdk-overview.mdx` - `docs/react-components.mdx` - `docs/session-persistence.mdx` - - `docs/architecture.mdx` - `docs/deploy/local.mdx` - `docs/deploy/cloudflare.mdx` - `docs/deploy/vercel.mdx` - `docs/deploy/daytona.mdx` - `docs/deploy/e2b.mdx` - `docs/deploy/docker.mdx` - - `docs/deploy/boxlite.mdx` - - `docs/deploy/modal.mdx` - - `docs/deploy/computesdk.mdx` - `frontend/packages/website/src/components/GetStarted.tsx` - `.claude/commands/post-release-testing.md` - `examples/cloudflare/Dockerfile` - - `examples/boxlite/Dockerfile` - - `examples/boxlite-python/Dockerfile` - `examples/daytona/src/index.ts` - `examples/shared/src/docker.ts` - `examples/docker/src/index.ts` - `examples/e2b/src/index.ts` - `examples/vercel/src/index.ts` - - `sdks/typescript/src/providers/shared.ts` - `scripts/release/main.ts` - `scripts/release/promote-artifacts.ts` - `scripts/release/sdk.ts` - - `scripts/sandbox-testing/test-sandbox.ts` diff --git a/Cargo.toml b/Cargo.toml index de0ff12..0fc4dc8 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -4,7 +4,7 @@ members = ["server/packages/*", "gigacode"] exclude = ["factory/packages/desktop/src-tauri", "foundry/packages/desktop/src-tauri"] [workspace.package] -version = "0.4.0" +version = "0.4.2" edition = "2021" authors = [ "Rivet Gaming, LLC " ] license = "Apache-2.0" @@ -13,13 +13,13 @@ description = "Universal API for automatic coding agents in sandboxes. 
Supports [workspace.dependencies] # Internal crates -sandbox-agent = { version = "0.4.0", path = "server/packages/sandbox-agent" } -sandbox-agent-error = { version = "0.4.0", path = "server/packages/error" } -sandbox-agent-agent-management = { version = "0.4.0", path = "server/packages/agent-management" } -sandbox-agent-agent-credentials = { version = "0.4.0", path = "server/packages/agent-credentials" } -sandbox-agent-opencode-adapter = { version = "0.4.0", path = "server/packages/opencode-adapter" } -sandbox-agent-opencode-server-manager = { version = "0.4.0", path = "server/packages/opencode-server-manager" } -acp-http-adapter = { version = "0.4.0", path = "server/packages/acp-http-adapter" } +sandbox-agent = { version = "0.4.2", path = "server/packages/sandbox-agent" } +sandbox-agent-error = { version = "0.4.2", path = "server/packages/error" } +sandbox-agent-agent-management = { version = "0.4.2", path = "server/packages/agent-management" } +sandbox-agent-agent-credentials = { version = "0.4.2", path = "server/packages/agent-credentials" } +sandbox-agent-opencode-adapter = { version = "0.4.2", path = "server/packages/opencode-adapter" } +sandbox-agent-opencode-server-manager = { version = "0.4.2", path = "server/packages/opencode-server-manager" } +acp-http-adapter = { version = "0.4.2", path = "server/packages/acp-http-adapter" } # Serialization serde = { version = "1.0", features = ["derive"] } diff --git a/README.md b/README.md index eb427d7..cf9b933 100644 --- a/README.md +++ b/README.md @@ -80,11 +80,11 @@ Import the SDK directly into your Node or browser application. Full type safety **Install** ```bash -npm install sandbox-agent@0.3.x +npm install sandbox-agent@0.4.x ``` ```bash -bun add sandbox-agent@0.3.x +bun add sandbox-agent@0.4.x # Optional: allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). 
bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -135,7 +135,7 @@ Run as an HTTP server and connect from any language. Deploy to E2B, Daytona, Ver ```bash # Install it -curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh # Run it sandbox-agent server --token "$SANDBOX_TOKEN" --host 127.0.0.1 --port 2468 ``` @@ -159,12 +159,12 @@ sandbox-agent server --no-token --host 127.0.0.1 --port 2468 Install the CLI wrapper (optional but convenient): ```bash -npm install -g @sandbox-agent/cli@0.3.x +npm install -g @sandbox-agent/cli@0.4.x ``` ```bash # Allow Bun to run postinstall scripts for native binaries. -bun add -g @sandbox-agent/cli@0.3.x +bun add -g @sandbox-agent/cli@0.4.x bun pm -g trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -179,11 +179,11 @@ sandbox-agent api sessions send-message-stream my-session --message "Hello" --en You can also use npx like: ```bash -npx @sandbox-agent/cli@0.3.x --help +npx @sandbox-agent/cli@0.4.x --help ``` ```bash -bunx @sandbox-agent/cli@0.3.x --help +bunx @sandbox-agent/cli@0.4.x --help ``` [CLI documentation](https://sandboxagent.dev/docs/cli) diff --git a/docker/inspector-dev/Dockerfile b/docker/inspector-dev/Dockerfile new file mode 100644 index 0000000..b55923f --- /dev/null +++ b/docker/inspector-dev/Dockerfile @@ -0,0 +1,7 @@ +FROM node:22-bookworm-slim + +RUN npm install -g pnpm@10.28.2 + +WORKDIR /app + +CMD ["bash", "-lc", "pnpm install --filter @sandbox-agent/inspector... 
&& cd frontend/packages/inspector && exec pnpm vite --host 0.0.0.0 --port 5173"] diff --git a/docker/runtime/Dockerfile b/docker/runtime/Dockerfile index e0a3335..85473be 100644 --- a/docker/runtime/Dockerfile +++ b/docker/runtime/Dockerfile @@ -149,7 +149,8 @@ FROM debian:bookworm-slim RUN apt-get update && apt-get install -y \ ca-certificates \ curl \ - git && \ + git \ + ffmpeg && \ rm -rf /var/lib/apt/lists/* # Copy the binary from builder diff --git a/docker/test-agent/Dockerfile b/docker/test-agent/Dockerfile new file mode 100644 index 0000000..67888b3 --- /dev/null +++ b/docker/test-agent/Dockerfile @@ -0,0 +1,61 @@ +FROM rust:1.88.0-bookworm AS builder +WORKDIR /build + +COPY Cargo.toml Cargo.lock ./ +COPY server/ ./server/ +COPY gigacode/ ./gigacode/ +COPY resources/agent-schemas/artifacts/ ./resources/agent-schemas/artifacts/ +COPY scripts/agent-configs/ ./scripts/agent-configs/ +COPY scripts/audit-acp-deps/ ./scripts/audit-acp-deps/ + +ENV SANDBOX_AGENT_SKIP_INSPECTOR=1 + +RUN --mount=type=cache,target=/usr/local/cargo/registry \ + --mount=type=cache,target=/usr/local/cargo/git \ + --mount=type=cache,target=/build/target \ + cargo build -p sandbox-agent --release && \ + cp target/release/sandbox-agent /sandbox-agent + +# Extract neko binary from the official image for WebRTC desktop streaming. +# Using neko v3 base image from GHCR which provides multi-arch support (amd64, arm64). +# Pinned by digest to prevent breaking changes from upstream. 
+# Reference client: https://github.com/demodesk/neko-client/blob/37f93eae6bd55b333c94bd009d7f2b079075a026/src/component/internal/webrtc.ts +FROM ghcr.io/m1k1o/neko/base@sha256:0c384afa56268aaa2d5570211d284763d0840dcdd1a7d9a24be3081d94d3dfce AS neko-base + +FROM node:22-bookworm-slim +RUN apt-get update -qq && \ + apt-get install -y -qq --no-install-recommends \ + ca-certificates \ + bash \ + libstdc++6 \ + xvfb \ + openbox \ + xdotool \ + imagemagick \ + ffmpeg \ + gstreamer1.0-tools \ + gstreamer1.0-plugins-base \ + gstreamer1.0-plugins-good \ + gstreamer1.0-plugins-bad \ + gstreamer1.0-plugins-ugly \ + gstreamer1.0-nice \ + gstreamer1.0-x \ + gstreamer1.0-pulseaudio \ + libxcvt0 \ + x11-xserver-utils \ + dbus-x11 \ + xauth \ + fonts-dejavu-core \ + xterm \ + > /dev/null 2>&1 && \ + rm -rf /var/lib/apt/lists/* + +COPY --from=builder /sandbox-agent /usr/local/bin/sandbox-agent +COPY --from=neko-base /usr/bin/neko /usr/local/bin/neko + +EXPOSE 3000 +# Expose UDP port range for WebRTC media transport +EXPOSE 59050-59070/udp + +ENTRYPOINT ["/usr/local/bin/sandbox-agent"] +CMD ["server", "--host", "0.0.0.0", "--port", "3000", "--no-token"] diff --git a/docker/test-common-software/Dockerfile b/docker/test-common-software/Dockerfile new file mode 100644 index 0000000..7a03abc --- /dev/null +++ b/docker/test-common-software/Dockerfile @@ -0,0 +1,37 @@ +# Extends the base test-agent image with common software pre-installed. +# Used by the common_software integration test to verify that all documented +# software in docs/common-software.mdx works correctly inside the sandbox. 
+# +# KEEP IN SYNC with docs/common-software.mdx + +ARG BASE_IMAGE=sandbox-agent-test:dev +FROM ${BASE_IMAGE} + +USER root + +RUN apt-get update -qq && \ + apt-get install -y -qq --no-install-recommends \ + # Browsers + chromium \ + firefox-esr \ + # Languages + python3 python3-pip python3-venv \ + default-jdk \ + ruby-full \ + # Databases + sqlite3 \ + redis-server \ + # Build tools + build-essential cmake pkg-config \ + # CLI tools + git jq tmux \ + # Media and graphics + imagemagick \ + poppler-utils \ + # Desktop apps + gimp \ + > /dev/null 2>&1 && \ + rm -rf /var/lib/apt/lists/* + +ENTRYPOINT ["/usr/local/bin/sandbox-agent"] +CMD ["server", "--host", "0.0.0.0", "--port", "3000", "--no-token"] diff --git a/docs/agent-sessions.mdx b/docs/agent-sessions.mdx index 0f9e2ab..0154537 100644 --- a/docs/agent-sessions.mdx +++ b/docs/agent-sessions.mdx @@ -51,6 +51,108 @@ await session.prompt([ unsubscribe(); ``` +### Event types + +Each event's `payload` contains a session update. The `sessionUpdate` field identifies the type. + + + +Streamed text or content from the agent's response. + +```json +{ + "sessionUpdate": "agent_message_chunk", + "content": { "type": "text", "text": "Here's how the repository is structured..." } +} +``` + + + +Internal reasoning from the agent (chain-of-thought / extended thinking). + +```json +{ + "sessionUpdate": "agent_thought_chunk", + "content": { "type": "text", "text": "I should start by looking at the project structure..." } +} +``` + + + +Echo of the user's prompt being processed. + +```json +{ + "sessionUpdate": "user_message_chunk", + "content": { "type": "text", "text": "Summarize the repository structure." } +} +``` + + + +The agent invoked a tool (file edit, terminal command, etc.). + +```json +{ + "sessionUpdate": "tool_call", + "toolCallId": "tc_abc123", + "title": "Read file", + "status": "in_progress", + "rawInput": { "path": "/src/index.ts" } +} +``` + + + +Progress or result update for an in-progress tool call. 
+ +```json +{ + "sessionUpdate": "tool_call_update", + "toolCallId": "tc_abc123", + "status": "completed", + "content": [{ "type": "text", "text": "import express from 'express';\n..." }] +} +``` + + + +The agent's execution plan for the current task. + +```json +{ + "sessionUpdate": "plan", + "entries": [ + { "content": "Read the project structure", "status": "completed" }, + { "content": "Identify main entrypoints", "status": "in_progress" }, + { "content": "Write summary", "status": "pending" } + ] +} +``` + + + +Token usage metrics for the current turn. + +```json +{ + "sessionUpdate": "usage_update" +} +``` + + + +Session metadata changed (e.g. agent-generated title). + +```json +{ + "sessionUpdate": "session_info_update", + "title": "Repository structure analysis" +} +``` + + + ## Fetch persisted event history ```ts diff --git a/docs/architecture.mdx b/docs/architecture.mdx index 467a71f..61b4689 100644 --- a/docs/architecture.mdx +++ b/docs/architecture.mdx @@ -56,7 +56,7 @@ Agents are installed lazily on first use. To avoid the cold-start delay, pre-ins sandbox-agent install-agent --all ``` -The `rivetdev/sandbox-agent:0.4.0-full` Docker image ships with all agents pre-installed. +The `rivetdev/sandbox-agent:0.4.2-full` Docker image ships with all agents pre-installed. ## Production-ready agent orchestration diff --git a/docs/cli.mdx b/docs/cli.mdx index 2ad3b08..362de49 100644 --- a/docs/cli.mdx +++ b/docs/cli.mdx @@ -37,6 +37,36 @@ Notes: - Set `SANDBOX_AGENT_LOG_STDOUT=1` to force stdout/stderr logging. - Use `SANDBOX_AGENT_LOG_DIR` to override log directory. +## install + +Install first-party runtime dependencies. + +### install desktop + +Install the Linux desktop runtime packages required by `/v1/desktop/*`. 
+ +```bash +sandbox-agent install desktop [OPTIONS] +``` + +| Option | Description | +|--------|-------------| +| `--yes` | Skip the confirmation prompt | +| `--print-only` | Print the package-manager command without executing it | +| `--package-manager ` | Override package-manager detection | +| `--no-fonts` | Skip the default DejaVu font package | + +```bash +sandbox-agent install desktop --yes +sandbox-agent install desktop --print-only +``` + +Notes: + +- Supported on Linux only. +- The command detects `apt`, `dnf`, or `apk`. +- If the host is not already running as root, the command requires `sudo`. + ## install-agent Install or reinstall a single agent, or every supported agent with `--all`. diff --git a/docs/common-software.mdx b/docs/common-software.mdx new file mode 100644 index 0000000..7997a92 --- /dev/null +++ b/docs/common-software.mdx @@ -0,0 +1,560 @@ +--- +title: "Common Software" +description: "Install browsers, languages, databases, and other tools inside the sandbox." +sidebarTitle: "Common Software" +icon: "box-open" +--- + +The sandbox runs a Debian/Ubuntu base image. You can install software with `apt-get` via the [Process API](/processes) or by customizing your Docker image. This page covers commonly needed packages and how to install them. + +## Browsers + +### Chromium + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "chromium", "chromium-sandbox"], +}); + +// Launch headless +await sdk.runProcess({ + command: "chromium", + args: ["--headless", "--no-sandbox", "--disable-gpu", "https://example.com"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","chromium","chromium-sandbox"]}' +``` + + + +Use `--no-sandbox` when running Chromium inside a container. The container itself provides isolation. 
+ + +### Firefox + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "firefox-esr"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","firefox-esr"]}' +``` + + +### Playwright browsers + +Playwright bundles its own browser binaries. Install the Playwright CLI and let it download browsers for you. + + +```ts TypeScript +await sdk.runProcess({ + command: "npx", + args: ["playwright", "install", "--with-deps", "chromium"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"npx","args":["playwright","install","--with-deps","chromium"]}' +``` + + +--- + +## Languages and runtimes + +### Node.js + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "nodejs", "npm"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","nodejs","npm"]}' +``` + + +For a specific version, use [nvm](https://github.com/nvm-sh/nvm): + +```ts TypeScript +await sdk.runProcess({ + command: "bash", + args: ["-c", "curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash && . ~/.nvm/nvm.sh && nvm install 22"], +}); +``` + +### Python + +Python 3 is typically pre-installed. 
To add pip and common packages: + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "python3", "python3-pip", "python3-venv"], +}); + +await sdk.runProcess({ + command: "pip3", + args: ["install", "numpy", "pandas", "matplotlib"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","python3","python3-pip","python3-venv"]}' + +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"pip3","args":["install","numpy","pandas","matplotlib"]}' +``` + + +### Go + + +```ts TypeScript +await sdk.runProcess({ + command: "bash", + args: ["-c", "curl -fsSL https://go.dev/dl/go1.23.6.linux-amd64.tar.gz | tar -C /usr/local -xz"], +}); + +// Add to PATH for subsequent commands +await sdk.runProcess({ + command: "bash", + args: ["-c", "export PATH=$PATH:/usr/local/go/bin && go version"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"bash","args":["-c","curl -fsSL https://go.dev/dl/go1.23.6.linux-amd64.tar.gz | tar -C /usr/local -xz"]}' +``` + + +### Rust + + +```ts TypeScript +await sdk.runProcess({ + command: "bash", + args: ["-c", "curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"bash","args":["-c","curl --proto =https --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"]}' +``` + + +### Java (OpenJDK) + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "default-jdk"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","default-jdk"]}' +``` + + 
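After installing the JDK, a quick way to verify the toolchain is to compile and run a one-file program in a single `bash -c` invocation. This is a sketch under the same assumptions as the examples above (`sdk` is a connected SDK instance); the request body is plain data, so it could equally be sent via cURL:

```typescript
// Sketch: a JDK smoke test as a single Process API request. Writes a
// one-file Java program, compiles it with javac, and runs it with java.
const javaSmokeTest = {
  command: "bash",
  args: [
    "-c",
    [
      "cat > /tmp/Hello.java <<'EOF'",
      'public class Hello { public static void main(String[] a) { System.out.println("ok"); } }',
      "EOF",
      "cd /tmp && javac Hello.java && java Hello",
    ].join("\n"),
  ],
};

// Usage: await sdk.runProcess(javaSmokeTest);
console.log(javaSmokeTest.args[1]);
```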
+### Ruby + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "ruby-full"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","ruby-full"]}' +``` + + +--- + +## Databases + +### PostgreSQL + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "postgresql", "postgresql-client"], +}); + +// Start the service +const proc = await sdk.createProcess({ + command: "bash", + args: ["-c", "su - postgres -c 'pg_ctlcluster 15 main start'"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","postgresql","postgresql-client"]}' +``` + + +### SQLite + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "sqlite3"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","sqlite3"]}' +``` + + +### Redis + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "redis-server"], +}); + +const proc = await sdk.createProcess({ + command: "redis-server", + args: ["--daemonize", "no"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","redis-server"]}' + +curl -X POST "http://127.0.0.1:2468/v1/processes" \ + -H "Content-Type: application/json" \ + -d '{"command":"redis-server","args":["--daemonize","no"]}' +``` + + +### MySQL / MariaDB + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "mariadb-server", "mariadb-client"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H 
"Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","mariadb-server","mariadb-client"]}' +``` + + +--- + +## Build tools + +### Essential build toolchain + +Most compiled software needs the standard build toolchain: + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "build-essential", "cmake", "pkg-config"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","build-essential","cmake","pkg-config"]}' +``` + + +This installs `gcc`, `g++`, `make`, `cmake`, and related tools. + +--- + +## Desktop applications + +These require the [Computer Use](/computer-use) desktop to be started first. + +### LibreOffice + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "libreoffice"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","libreoffice"]}' +``` + + +### GIMP + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "gimp"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","gimp"]}' +``` + + +### VLC + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "vlc"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","vlc"]}' +``` + + +### VS Code (code-server) + + +```ts TypeScript +await sdk.runProcess({ + command: "bash", + args: ["-c", "curl -fsSL https://code-server.dev/install.sh | sh"], +}); + +const proc = await sdk.createProcess({ + command: "code-server", + args: ["--bind-addr", "0.0.0.0:8080", 
"--auth", "none"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"bash","args":["-c","curl -fsSL https://code-server.dev/install.sh | sh"]}' + +curl -X POST "http://127.0.0.1:2468/v1/processes" \ + -H "Content-Type: application/json" \ + -d '{"command":"code-server","args":["--bind-addr","0.0.0.0:8080","--auth","none"]}' +``` + + +--- + +## CLI tools + +### Git + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "git"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","git"]}' +``` + + +### Docker + + +```ts TypeScript +await sdk.runProcess({ + command: "bash", + args: ["-c", "curl -fsSL https://get.docker.com | sh"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"bash","args":["-c","curl -fsSL https://get.docker.com | sh"]}' +``` + + +### jq + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "jq"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","jq"]}' +``` + + +### tmux + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "tmux"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","tmux"]}' +``` + + +--- + +## Media and graphics + +### FFmpeg + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "ffmpeg"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d 
'{"command":"apt-get","args":["install","-y","ffmpeg"]}' +``` + + +### ImageMagick + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "imagemagick"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","imagemagick"]}' +``` + + +### Poppler (PDF utilities) + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "poppler-utils"], +}); + +// Convert PDF to images +await sdk.runProcess({ + command: "pdftoppm", + args: ["-png", "document.pdf", "output"], +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","poppler-utils"]}' +``` + + +--- + +## Pre-installing in a Docker image + +For production use, install software in your Dockerfile instead of at runtime. This avoids repeated downloads and makes startup faster. + +```dockerfile +FROM ubuntu:22.04 + +RUN apt-get update && apt-get install -y \ + chromium \ + firefox-esr \ + nodejs npm \ + python3 python3-pip \ + git curl wget \ + build-essential \ + sqlite3 \ + ffmpeg \ + imagemagick \ + jq \ + && rm -rf /var/lib/apt/lists/* + +RUN pip3 install numpy pandas matplotlib +``` + +See [Docker deployment](/deploy/docker) for how to use custom images with Sandbox Agent. diff --git a/docs/computer-use.mdx b/docs/computer-use.mdx new file mode 100644 index 0000000..fc6b7d0 --- /dev/null +++ b/docs/computer-use.mdx @@ -0,0 +1,859 @@ +--- +title: "Computer Use" +description: "Control a virtual desktop inside the sandbox with mouse, keyboard, screenshots, recordings, and live streaming." +sidebarTitle: "Computer Use" +icon: "desktop" +--- + +Sandbox Agent provides a managed virtual desktop (Xvfb + openbox) that you can control programmatically. 
This is useful for browser automation, GUI testing, and AI computer-use workflows. + +## Start and stop + + +```ts TypeScript +import { SandboxAgent } from "sandbox-agent"; + +const sdk = await SandboxAgent.connect({ + baseUrl: "http://127.0.0.1:2468", +}); + +const status = await sdk.startDesktop({ + width: 1920, + height: 1080, + dpi: 96, +}); + +console.log(status.state); // "active" +console.log(status.display); // ":99" + +// When done +await sdk.stopDesktop(); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/start" \ + -H "Content-Type: application/json" \ + -d '{"width":1920,"height":1080,"dpi":96}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/stop" +``` + + +All fields in the start request are optional. Defaults are 1440x900 at 96 DPI. + +### Start request options + +| Field | Type | Default | Description | +|-------|------|---------|-------------| +| `width` | number | 1440 | Desktop width in pixels | +| `height` | number | 900 | Desktop height in pixels | +| `dpi` | number | 96 | Display DPI | +| `displayNum` | number | 99 | Starting X display number. The runtime probes from this number upward to find an available display. | +| `stateDir` | string | (auto) | Desktop state directory for home, logs, recordings | +| `streamVideoCodec` | string | `"vp8"` | WebRTC video codec (`vp8`, `vp9`, `h264`) | +| `streamAudioCodec` | string | `"opus"` | WebRTC audio codec (`opus`, `g722`) | +| `streamFrameRate` | number | 30 | Streaming frame rate (1-60) | +| `webrtcPortRange` | string | `"59050-59070"` | UDP port range for WebRTC media | +| `recordingFps` | number | 30 | Default recording FPS when not specified in `startDesktopRecording` (1-60) | + +The streaming and recording options configure defaults for the desktop session. They take effect when streaming or recording is started later. 
+ + +```ts TypeScript +const status = await sdk.startDesktop({ + width: 1920, + height: 1080, + streamVideoCodec: "h264", + streamFrameRate: 60, + webrtcPortRange: "59100-59120", + recordingFps: 15, +}); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/start" \ + -H "Content-Type: application/json" \ + -d '{ + "width": 1920, + "height": 1080, + "streamVideoCodec": "h264", + "streamFrameRate": 60, + "webrtcPortRange": "59100-59120", + "recordingFps": 15 + }' +``` + + +## Status + + +```ts TypeScript +const status = await sdk.getDesktopStatus(); +console.log(status.state); // "inactive" | "active" | "failed" | ... +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/status" +``` + + +## Screenshots + +Capture the full desktop or a specific region. Optionally include the cursor position. + + +```ts TypeScript +// Full screenshot (PNG by default) +const png = await sdk.takeDesktopScreenshot(); + +// JPEG at 70% quality, half scale +const jpeg = await sdk.takeDesktopScreenshot({ + format: "jpeg", + quality: 70, + scale: 0.5, +}); + +// Include cursor overlay +const withCursor = await sdk.takeDesktopScreenshot({ + showCursor: true, +}); + +// Region screenshot +const region = await sdk.takeDesktopRegionScreenshot({ + x: 100, + y: 100, + width: 400, + height: 300, +}); +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/screenshot" --output screenshot.png + +curl "http://127.0.0.1:2468/v1/desktop/screenshot?format=jpeg&quality=70&scale=0.5" \ + --output screenshot.jpg + +# Include cursor overlay +curl "http://127.0.0.1:2468/v1/desktop/screenshot?show_cursor=true" \ + --output with_cursor.png + +curl "http://127.0.0.1:2468/v1/desktop/screenshot/region?x=100&y=100&width=400&height=300" \ + --output region.png +``` + + +### Screenshot options + +| Param | Type | Default | Description | +|-------|------|---------|-------------| +| `format` | string | `"png"` | Output format: `png`, `jpeg`, or `webp` | +| `quality` | number | 85 | 
Compression quality (1-100, JPEG/WebP only) | +| `scale` | number | 1.0 | Scale factor (0.1-1.0) | +| `showCursor` | boolean | `false` | Composite a crosshair at the cursor position | + +When `showCursor` is enabled, the cursor position is captured at the moment of the screenshot and a red crosshair is drawn at that location. This is useful for AI agents that need to see where the cursor is in the screenshot. + +## Mouse + + +```ts TypeScript +// Get current position +const pos = await sdk.getDesktopMousePosition(); +console.log(pos.x, pos.y); + +// Move +await sdk.moveDesktopMouse({ x: 500, y: 300 }); + +// Click (left by default) +await sdk.clickDesktop({ x: 500, y: 300 }); + +// Right click +await sdk.clickDesktop({ x: 500, y: 300, button: "right" }); + +// Double click +await sdk.clickDesktop({ x: 500, y: 300, clickCount: 2 }); + +// Drag +await sdk.dragDesktopMouse({ + startX: 100, startY: 100, + endX: 400, endY: 400, +}); + +// Scroll +await sdk.scrollDesktop({ x: 500, y: 300, deltaY: -3 }); +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/mouse/position" + +curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/click" \ + -H "Content-Type: application/json" \ + -d '{"x":500,"y":300}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/drag" \ + -H "Content-Type: application/json" \ + -d '{"startX":100,"startY":100,"endX":400,"endY":400}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/mouse/scroll" \ + -H "Content-Type: application/json" \ + -d '{"x":500,"y":300,"deltaY":-3}' +``` + + +## Keyboard + + +```ts TypeScript +// Type text +await sdk.typeDesktopText({ text: "Hello, world!" 
}); + +// Press a key with modifiers +await sdk.pressDesktopKey({ + key: "c", + modifiers: { ctrl: true }, +}); + +// Low-level key down/up +await sdk.keyDownDesktop({ key: "Shift_L" }); +await sdk.keyUpDesktop({ key: "Shift_L" }); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/keyboard/type" \ + -H "Content-Type: application/json" \ + -d '{"text":"Hello, world!"}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/keyboard/press" \ + -H "Content-Type: application/json" \ + -d '{"key":"c","modifiers":{"ctrl":true}}' +``` + + +## Clipboard + +Read and write the X11 clipboard programmatically. + + +```ts TypeScript +// Read clipboard +const clipboard = await sdk.getDesktopClipboard(); +console.log(clipboard.text); + +// Read primary selection (mouse-selected text) +const primary = await sdk.getDesktopClipboard({ selection: "primary" }); + +// Write to clipboard +await sdk.setDesktopClipboard({ text: "Pasted via API" }); + +// Write to both clipboard and primary selection +await sdk.setDesktopClipboard({ + text: "Synced text", + selection: "both", +}); +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/clipboard" + +curl "http://127.0.0.1:2468/v1/desktop/clipboard?selection=primary" + +curl -X POST "http://127.0.0.1:2468/v1/desktop/clipboard" \ + -H "Content-Type: application/json" \ + -d '{"text":"Pasted via API"}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/clipboard" \ + -H "Content-Type: application/json" \ + -d '{"text":"Synced text","selection":"both"}' +``` + + +The `selection` parameter controls which X11 selection to read or write: + +| Value | Description | +|-------|-------------| +| `clipboard` (default) | The standard clipboard (Ctrl+C / Ctrl+V) | +| `primary` | The primary selection (text selected with the mouse) | +| `both` | Write to both clipboard and primary selection (write only) | + +## Display and windows + + +```ts TypeScript +const display = await sdk.getDesktopDisplayInfo(); +console.log(display.resolution); 
// { width: 1920, height: 1080, dpi: 96 } + +const { windows } = await sdk.listDesktopWindows(); +for (const win of windows) { + console.log(win.title, win.x, win.y, win.width, win.height); +} +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/display/info" + +curl "http://127.0.0.1:2468/v1/desktop/windows" +``` + + +The windows endpoint filters out noise automatically: window manager internals (Openbox), windows with empty titles, and tiny helper windows (under 120x80) are excluded. The currently active/focused window is always included regardless of filters. + +### Focused window + +Get the currently focused window without listing all windows. + + +```ts TypeScript +const focused = await sdk.getDesktopFocusedWindow(); +console.log(focused.title, focused.id); +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/windows/focused" +``` + + +Returns 404 if no window currently has focus. + +### Window management + +Focus, move, and resize windows by their X11 window ID. + + +```ts TypeScript +const { windows } = await sdk.listDesktopWindows(); +const win = windows[0]; + +// Bring window to foreground +await sdk.focusDesktopWindow(win.id); + +// Move window +await sdk.moveDesktopWindow(win.id, { x: 100, y: 50 }); + +// Resize window +await sdk.resizeDesktopWindow(win.id, { width: 1280, height: 720 }); +``` + +```bash cURL +# Focus a window +curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/focus" + +# Move a window +curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/move" \ + -H "Content-Type: application/json" \ + -d '{"x":100,"y":50}' + +# Resize a window +curl -X POST "http://127.0.0.1:2468/v1/desktop/windows/12345/resize" \ + -H "Content-Type: application/json" \ + -d '{"width":1280,"height":720}' +``` + + +All three endpoints return the updated window info so you can verify the operation took effect. The window manager may adjust the requested position or size. 
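Because the window manager may adjust a requested position or size, it can help to compare the window info returned by a move/resize call against what you asked for. A minimal sketch — the `geometryMatches` helper below is illustrative and not part of the SDK; it only assumes the `x`/`y`/`width`/`height` fields shown above:

```typescript
// Hypothetical helper: checks whether a window ended up where we asked,
// within a small pixel tolerance, since the window manager may nudge
// the requested position or size.
interface WindowGeometry {
  x: number;
  y: number;
  width: number;
  height: number;
}

function geometryMatches(
  requested: Partial<WindowGeometry>,
  actual: WindowGeometry,
  tolerance = 2,
): boolean {
  // Only compare the fields that were actually requested.
  return (Object.keys(requested) as (keyof WindowGeometry)[]).every(
    (key) => Math.abs((requested[key] as number) - actual[key]) <= tolerance,
  );
}

// A resize request the window manager honored exactly.
const ok = geometryMatches(
  { width: 1280, height: 720 },
  { x: 0, y: 0, width: 1280, height: 720 },
);

// A move request the window manager shifted by 30px — flagged as a mismatch.
const shifted = geometryMatches(
  { x: 100, y: 50 },
  { x: 130, y: 50, width: 800, height: 600 },
);
```

In practice you would pass the window info returned by `moveDesktopWindow` or `resizeDesktopWindow` as `actual`, and retry or log when the helper returns `false`.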
+ +## App launching + +Launch applications or open files/URLs on the desktop without needing to shell out. + + +```ts TypeScript +// Launch an app by name +const result = await sdk.launchDesktopApp({ + app: "firefox", + args: ["--private"], +}); +console.log(result.processId); // "proc_7" + +// Launch and wait for the window to appear +const withWindow = await sdk.launchDesktopApp({ + app: "xterm", + wait: true, +}); +console.log(withWindow.windowId); // "12345" or null if timed out + +// Open a URL with the default handler +const opened = await sdk.openDesktopTarget({ + target: "https://example.com", +}); +console.log(opened.processId); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/launch" \ + -H "Content-Type: application/json" \ + -d '{"app":"firefox","args":["--private"]}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/launch" \ + -H "Content-Type: application/json" \ + -d '{"app":"xterm","wait":true}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/open" \ + -H "Content-Type: application/json" \ + -d '{"target":"https://example.com"}' +``` + + +The returned `processId` can be used with the [Process API](/processes) to read logs (`GET /v1/processes/{id}/logs`) or stop the application (`POST /v1/processes/{id}/stop`). + +When `wait` is `true`, the API polls for up to 5 seconds for a window to appear. If the window appears, its ID is returned in `windowId`. If it times out, `windowId` is `null` but the process is still running. + + +**Launch/Open vs the Process API:** Both `launch` and `open` are convenience wrappers around the [Process API](/processes). They create managed processes (with `owner: "desktop"`) that you can inspect, log, and stop through the same Process endpoints. The difference is that `launch` validates the binary exists in PATH first and can optionally wait for a window to appear, while `open` delegates to the system default handler (`xdg-open`). 
Use the Process API directly when you need full control over command, environment, working directory, or restart policies. + + +## Recording + +Record the desktop to MP4. + + +```ts TypeScript +const recording = await sdk.startDesktopRecording({ fps: 30 }); +console.log(recording.id); + +// ... do things ... + +const stopped = await sdk.stopDesktopRecording(); + +// List all recordings +const { recordings } = await sdk.listDesktopRecordings(); + +// Download +const mp4 = await sdk.downloadDesktopRecording(recording.id); + +// Clean up +await sdk.deleteDesktopRecording(recording.id); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/recording/start" \ + -H "Content-Type: application/json" \ + -d '{"fps":30}' + +curl -X POST "http://127.0.0.1:2468/v1/desktop/recording/stop" + +curl "http://127.0.0.1:2468/v1/desktop/recordings" + +curl "http://127.0.0.1:2468/v1/desktop/recordings/rec_1/download" --output recording.mp4 + +curl -X DELETE "http://127.0.0.1:2468/v1/desktop/recordings/rec_1" +``` + + +## Desktop processes + +The desktop runtime manages several background processes (Xvfb, openbox, neko, ffmpeg). These are all registered with the general [Process API](/processes) under the `desktop` owner, so you can inspect logs, check status, and troubleshoot using the same tools you use for any other managed process. 
+ + +```ts TypeScript +// List all processes, including desktop-owned ones +const { processes } = await sdk.listProcesses(); + +const desktopProcs = processes.filter((p) => p.owner === "desktop"); +for (const p of desktopProcs) { + console.log(p.id, p.command, p.status); +} + +// Read logs from a specific desktop process +const logs = await sdk.getProcessLogs(desktopProcs[0].id, { tail: 50 }); +for (const entry of logs.entries) { + console.log(entry.stream, atob(entry.data)); +} +``` + +```bash cURL +# List all processes (desktop processes have owner: "desktop") +curl "http://127.0.0.1:2468/v1/processes" + +# Get logs from a specific desktop process +curl "http://127.0.0.1:2468/v1/processes/proc_1/logs?tail=50" +``` + + +The desktop status endpoint also includes a summary of running processes: + + +```ts TypeScript +const status = await sdk.getDesktopStatus(); +for (const proc of status.processes) { + console.log(proc.name, proc.pid, proc.running); +} +``` + +```bash cURL +curl "http://127.0.0.1:2468/v1/desktop/status" +# Response includes: processes: [{ name: "Xvfb", pid: 123, running: true }, ...] +``` + + +| Process | Role | Restart policy | +|---------|------|---------------| +| Xvfb | Virtual X11 framebuffer | Auto-restart while desktop is active | +| openbox | Window manager | Auto-restart while desktop is active | +| neko | WebRTC streaming server (started by `startDesktopStream`) | No auto-restart | +| ffmpeg | Screen recorder (started by `startDesktopRecording`) | No auto-restart | + +## Live streaming + +Start a WebRTC stream for real-time desktop viewing in a browser. 
+ + +```ts TypeScript +await sdk.startDesktopStream(); + +// Check stream status +const status = await sdk.getDesktopStreamStatus(); +console.log(status.active); // true +console.log(status.processId); // "proc_5" + +// Connect via the React DesktopViewer component or +// use the WebSocket signaling endpoint directly +// at ws://127.0.0.1:2468/v1/desktop/stream/signaling + +await sdk.stopDesktopStream(); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/desktop/stream/start" + +# Check stream status +curl "http://127.0.0.1:2468/v1/desktop/stream/status" + +# Connect to ws://127.0.0.1:2468/v1/desktop/stream/signaling for WebRTC signaling + +curl -X POST "http://127.0.0.1:2468/v1/desktop/stream/stop" +``` + + +For a drop-in React component, see [React Components](/react-components). + +## API reference + +### Endpoints + +| Method | Path | Description | +|--------|------|-------------| +| `POST` | `/v1/desktop/start` | Start the desktop runtime | +| `POST` | `/v1/desktop/stop` | Stop the desktop runtime | +| `GET` | `/v1/desktop/status` | Get desktop runtime status | +| `GET` | `/v1/desktop/screenshot` | Capture full desktop screenshot | +| `GET` | `/v1/desktop/screenshot/region` | Capture a region screenshot | +| `GET` | `/v1/desktop/mouse/position` | Get current mouse position | +| `POST` | `/v1/desktop/mouse/move` | Move the mouse | +| `POST` | `/v1/desktop/mouse/click` | Click the mouse | +| `POST` | `/v1/desktop/mouse/down` | Press mouse button down | +| `POST` | `/v1/desktop/mouse/up` | Release mouse button | +| `POST` | `/v1/desktop/mouse/drag` | Drag from one point to another | +| `POST` | `/v1/desktop/mouse/scroll` | Scroll at a position | +| `POST` | `/v1/desktop/keyboard/type` | Type text | +| `POST` | `/v1/desktop/keyboard/press` | Press a key with optional modifiers | +| `POST` | `/v1/desktop/keyboard/down` | Press a key down (hold) | +| `POST` | `/v1/desktop/keyboard/up` | Release a key | +| `GET` | `/v1/desktop/display/info` | Get display 
info | +| `GET` | `/v1/desktop/windows` | List visible windows | +| `GET` | `/v1/desktop/windows/focused` | Get focused window info | +| `POST` | `/v1/desktop/windows/{id}/focus` | Focus a window | +| `POST` | `/v1/desktop/windows/{id}/move` | Move a window | +| `POST` | `/v1/desktop/windows/{id}/resize` | Resize a window | +| `GET` | `/v1/desktop/clipboard` | Read clipboard contents | +| `POST` | `/v1/desktop/clipboard` | Write to clipboard | +| `POST` | `/v1/desktop/launch` | Launch an application | +| `POST` | `/v1/desktop/open` | Open a file or URL | +| `POST` | `/v1/desktop/recording/start` | Start recording | +| `POST` | `/v1/desktop/recording/stop` | Stop recording | +| `GET` | `/v1/desktop/recordings` | List recordings | +| `GET` | `/v1/desktop/recordings/{id}` | Get recording metadata | +| `GET` | `/v1/desktop/recordings/{id}/download` | Download recording | +| `DELETE` | `/v1/desktop/recordings/{id}` | Delete recording | +| `POST` | `/v1/desktop/stream/start` | Start WebRTC streaming | +| `POST` | `/v1/desktop/stream/stop` | Stop WebRTC streaming | +| `GET` | `/v1/desktop/stream/status` | Get stream status | +| `GET` | `/v1/desktop/stream/signaling` | WebSocket for WebRTC signaling | + +### TypeScript SDK methods + +| Method | Returns | Description | +|--------|---------|-------------| +| `startDesktop(request?)` | `DesktopStatusResponse` | Start the desktop | +| `stopDesktop()` | `DesktopStatusResponse` | Stop the desktop | +| `getDesktopStatus()` | `DesktopStatusResponse` | Get desktop status | +| `takeDesktopScreenshot(query?)` | `Uint8Array` | Capture screenshot | +| `takeDesktopRegionScreenshot(query)` | `Uint8Array` | Capture region screenshot | +| `getDesktopMousePosition()` | `DesktopMousePositionResponse` | Get mouse position | +| `moveDesktopMouse(request)` | `DesktopMousePositionResponse` | Move mouse | +| `clickDesktop(request)` | `DesktopMousePositionResponse` | Click mouse | +| `mouseDownDesktop(request)` | `DesktopMousePositionResponse` | 
Mouse button down | +| `mouseUpDesktop(request)` | `DesktopMousePositionResponse` | Mouse button up | +| `dragDesktopMouse(request)` | `DesktopMousePositionResponse` | Drag mouse | +| `scrollDesktop(request)` | `DesktopMousePositionResponse` | Scroll | +| `typeDesktopText(request)` | `DesktopActionResponse` | Type text | +| `pressDesktopKey(request)` | `DesktopActionResponse` | Press key | +| `keyDownDesktop(request)` | `DesktopActionResponse` | Key down | +| `keyUpDesktop(request)` | `DesktopActionResponse` | Key up | +| `getDesktopDisplayInfo()` | `DesktopDisplayInfoResponse` | Get display info | +| `listDesktopWindows()` | `DesktopWindowListResponse` | List windows | +| `getDesktopFocusedWindow()` | `DesktopWindowInfo` | Get focused window | +| `focusDesktopWindow(id)` | `DesktopWindowInfo` | Focus a window | +| `moveDesktopWindow(id, request)` | `DesktopWindowInfo` | Move a window | +| `resizeDesktopWindow(id, request)` | `DesktopWindowInfo` | Resize a window | +| `getDesktopClipboard(query?)` | `DesktopClipboardResponse` | Read clipboard | +| `setDesktopClipboard(request)` | `DesktopActionResponse` | Write clipboard | +| `launchDesktopApp(request)` | `DesktopLaunchResponse` | Launch an app | +| `openDesktopTarget(request)` | `DesktopOpenResponse` | Open file/URL | +| `startDesktopRecording(request?)` | `DesktopRecordingInfo` | Start recording | +| `stopDesktopRecording()` | `DesktopRecordingInfo` | Stop recording | +| `listDesktopRecordings()` | `DesktopRecordingListResponse` | List recordings | +| `getDesktopRecording(id)` | `DesktopRecordingInfo` | Get recording | +| `downloadDesktopRecording(id)` | `Uint8Array` | Download recording | +| `deleteDesktopRecording(id)` | `void` | Delete recording | +| `startDesktopStream()` | `DesktopStreamStatusResponse` | Start streaming | +| `stopDesktopStream()` | `DesktopStreamStatusResponse` | Stop streaming | +| `getDesktopStreamStatus()` | `DesktopStreamStatusResponse` | Stream status | + +## Customizing the desktop 
environment
+
+The desktop runs inside the sandbox filesystem, so you can customize it using the [File System](/file-system) API before or after starting the desktop. The desktop HOME directory is located at `~/.local/state/sandbox-agent/desktop/home` (or `$XDG_STATE_HOME/sandbox-agent/desktop/home` if `XDG_STATE_HOME` is set).
+
+All configuration files below are written to paths relative to this HOME directory.
+
+### Window manager (openbox)
+
+The desktop uses [openbox](http://openbox.org/) as its window manager. You can customize its behavior, theme, and keyboard shortcuts by writing an `rc.xml` config file.
+
+
+```ts TypeScript
+const openboxConfig = `<?xml version="1.0" encoding="UTF-8"?>
+<openbox_config xmlns="http://openbox.org/3.4/rc">
+  <theme>
+    <name>Clearlooks</name>
+    <titleLayout>NLIMC</titleLayout>
+    <font place="ActiveWindow"><name>DejaVu Sans</name><size>10</size></font>
+  </theme>
+  <desktops>
+    <number>1</number>
+  </desktops>
+</openbox_config>
+`;
+
+await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" });
+await sdk.writeFsFile(
+  { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/rc.xml" },
+  openboxConfig,
+);
+```
+
+```bash cURL
+curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox"
+
+curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/rc.xml" \
+  -H "Content-Type: application/octet-stream" \
+  --data-binary @rc.xml
+```
+
+
+### Autostart programs
+
+Openbox runs scripts in `~/.config/openbox/autostart` on startup. Use this to launch applications, set the background, or configure the environment.
+ + +```ts TypeScript +const autostart = `#!/bin/sh +# Set a solid background color +xsetroot -solid "#1e1e2e" & + +# Launch a terminal +xterm -geometry 120x40+50+50 & + +# Launch a browser +firefox --no-remote & +`; + +await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" }); +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" }, + autostart, +); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox" + +curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" \ + -H "Content-Type: application/octet-stream" \ + --data-binary @autostart.sh +``` + + + +The autostart script runs when openbox starts, which happens during `startDesktop()`. Write the autostart file before calling `startDesktop()` for it to take effect. + + +### Background + +There is no wallpaper set by default (the background is the X root window default). 
You can set it using `xsetroot` in the autostart script (as shown above), or use `feh` if you need an image: + + +```ts TypeScript +// Upload a wallpaper image +import fs from "node:fs"; + +const wallpaper = await fs.promises.readFile("./wallpaper.png"); +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/wallpaper.png" }, + wallpaper, +); + +// Set the autostart to apply it +const autostart = `#!/bin/sh +feh --bg-fill ~/wallpaper.png & +`; + +await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox" }); +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" }, + autostart, +); +``` + +```bash cURL +curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/wallpaper.png" \ + -H "Content-Type: application/octet-stream" \ + --data-binary @wallpaper.png + +curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/openbox/autostart" \ + -H "Content-Type: application/octet-stream" \ + --data-binary @autostart.sh +``` + + + +`feh` is not installed by default. Install it via the [Process API](/processes) before starting the desktop: `await sdk.runProcess({ command: "apt-get", args: ["install", "-y", "feh"] })`. + + +### Fonts + +Only `fonts-dejavu-core` is installed by default. 
To add more fonts, install them with your system package manager or copy font files into the sandbox: + + +```ts TypeScript +// Install a font package +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "fonts-noto", "fonts-liberation"], +}); + +// Or copy a custom font file +import fs from "node:fs"; + +const font = await fs.promises.readFile("./CustomFont.ttf"); +await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.local/share/fonts" }); +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/.local/share/fonts/CustomFont.ttf" }, + font, +); + +// Rebuild the font cache +await sdk.runProcess({ command: "fc-cache", args: ["-fv"] }); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","fonts-noto","fonts-liberation"]}' + +curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.local/share/fonts" + +curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.local/share/fonts/CustomFont.ttf" \ + -H "Content-Type: application/octet-stream" \ + --data-binary @CustomFont.ttf + +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"fc-cache","args":["-fv"]}' +``` + + +### Cursor theme + + +```ts TypeScript +await sdk.runProcess({ + command: "apt-get", + args: ["install", "-y", "dmz-cursor-theme"], +}); + +const xresources = `Xcursor.theme: DMZ-White\nXcursor.size: 24\n`; +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/.Xresources" }, + xresources, +); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/processes/run" \ + -H "Content-Type: application/json" \ + -d '{"command":"apt-get","args":["install","-y","dmz-cursor-theme"]}' + +curl -X PUT 
"http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.Xresources" \ + -H "Content-Type: application/octet-stream" \ + --data-binary 'Xcursor.theme: DMZ-White\nXcursor.size: 24' +``` + + + +Run `xrdb -merge ~/.Xresources` (via the autostart or process API) after writing the file for changes to take effect. + + +### Shell and terminal + +No terminal emulator or shell is launched by default. Add one to the openbox autostart: + +```sh +# In ~/.config/openbox/autostart +xterm -geometry 120x40+50+50 & +``` + +To use a different shell, set the `SHELL` environment variable in your Dockerfile or install your preferred shell and configure the terminal to use it. + +### GTK theme + +Applications using GTK will pick up settings from `~/.config/gtk-3.0/settings.ini`: + + +```ts TypeScript +const gtkSettings = `[Settings] +gtk-theme-name=Adwaita +gtk-icon-theme-name=Adwaita +gtk-font-name=DejaVu Sans 10 +gtk-cursor-theme-name=DMZ-White +gtk-cursor-theme-size=24 +`; + +await sdk.mkdirFs({ path: "~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0" }); +await sdk.writeFsFile( + { path: "~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0/settings.ini" }, + gtkSettings, +); +``` + +```bash cURL +curl -X POST "http://127.0.0.1:2468/v1/fs/mkdir?path=~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0" + +curl -X PUT "http://127.0.0.1:2468/v1/fs/file?path=~/.local/state/sandbox-agent/desktop/home/.config/gtk-3.0/settings.ini" \ + -H "Content-Type: application/octet-stream" \ + --data-binary @settings.ini +``` + + +### Summary of configuration paths + +All paths are relative to the desktop HOME directory (`~/.local/state/sandbox-agent/desktop/home`). 
+ +| What | Path | Notes | +|------|------|-------| +| Openbox config | `.config/openbox/rc.xml` | Window manager theme, keybindings, behavior | +| Autostart | `.config/openbox/autostart` | Shell script run on desktop start | +| Custom fonts | `.local/share/fonts/` | TTF/OTF files, run `fc-cache -fv` after | +| Cursor theme | `.Xresources` | Requires `xrdb -merge` to apply | +| GTK 3 settings | `.config/gtk-3.0/settings.ini` | Theme, icons, fonts for GTK apps | +| Wallpaper | Any path, referenced from autostart | Requires `feh` or similar tool | diff --git a/docs/deploy/boxlite.mdx b/docs/deploy/boxlite.mdx index 115d8b8..8c02bb4 100644 --- a/docs/deploy/boxlite.mdx +++ b/docs/deploy/boxlite.mdx @@ -20,7 +20,7 @@ that BoxLite can load directly (BoxLite has its own image store separate from Do ```dockerfile FROM node:22-bookworm-slim RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/* -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh RUN sandbox-agent install-agent claude RUN sandbox-agent install-agent codex ``` diff --git a/docs/deploy/cloudflare.mdx b/docs/deploy/cloudflare.mdx index 1cecdd7..c0370e4 100644 --- a/docs/deploy/cloudflare.mdx +++ b/docs/deploy/cloudflare.mdx @@ -25,7 +25,7 @@ cd my-sandbox ```dockerfile FROM cloudflare/sandbox:0.7.0 -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh RUN sandbox-agent install-agent claude && sandbox-agent install-agent codex EXPOSE 8000 @@ -36,7 +36,7 @@ EXPOSE 8000 For standalone scripts, use the `cloudflare` provider: ```bash -npm install sandbox-agent@0.3.x @cloudflare/sandbox +npm install sandbox-agent@0.4.x @cloudflare/sandbox ``` ```typescript diff --git a/docs/deploy/computesdk.mdx b/docs/deploy/computesdk.mdx index 1adfffe..601d9c7 100644 --- 
a/docs/deploy/computesdk.mdx +++ b/docs/deploy/computesdk.mdx @@ -14,7 +14,7 @@ description: "Deploy Sandbox Agent using ComputeSDK's provider-agnostic sandbox ## TypeScript example ```bash -npm install sandbox-agent@0.3.x computesdk +npm install sandbox-agent@0.4.x computesdk ``` ```typescript @@ -27,7 +27,11 @@ if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY const sdk = await SandboxAgent.start({ sandbox: computesdk({ - create: { envs }, + create: { + envs, + image: process.env.COMPUTESDK_IMAGE, + templateId: process.env.COMPUTESDK_TEMPLATE_ID, + }, }), }); @@ -43,6 +47,7 @@ try { ``` The `computesdk` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. ComputeSDK routes to your configured provider behind the scenes. +The `create` option now forwards the full ComputeSDK sandbox-create payload, including provider-specific fields such as `image` and `templateId` when the selected provider supports them. Before calling `SandboxAgent.start()`, configure ComputeSDK with your provider: diff --git a/docs/deploy/daytona.mdx b/docs/deploy/daytona.mdx index 42dad40..e546bef 100644 --- a/docs/deploy/daytona.mdx +++ b/docs/deploy/daytona.mdx @@ -16,7 +16,7 @@ See [Daytona network limits](https://www.daytona.io/docs/en/network-limits/). ## TypeScript example ```bash -npm install sandbox-agent@0.3.x @daytonaio/sdk +npm install sandbox-agent@0.4.x @daytonaio/sdk ``` ```typescript @@ -44,7 +44,7 @@ try { } ``` -The `daytona` provider uses the `rivetdev/sandbox-agent:0.4.0-full` image by default and starts the server automatically. +The `daytona` provider uses the `rivetdev/sandbox-agent:0.4.2-full` image by default and starts the server automatically. 
## Using snapshots for faster startup @@ -61,7 +61,7 @@ if (!hasSnapshot) { name: SNAPSHOT, image: Image.base("ubuntu:22.04").runCommands( "apt-get update && apt-get install -y curl ca-certificates", - "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh", + "curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh", "sandbox-agent install-agent claude", "sandbox-agent install-agent codex", ), diff --git a/docs/deploy/docker.mdx b/docs/deploy/docker.mdx index 7c9d2e3..c5a3432 100644 --- a/docs/deploy/docker.mdx +++ b/docs/deploy/docker.mdx @@ -15,43 +15,64 @@ Run the published full image with all supported agents pre-installed: docker run --rm -p 3000:3000 \ -e ANTHROPIC_API_KEY="$ANTHROPIC_API_KEY" \ -e OPENAI_API_KEY="$OPENAI_API_KEY" \ - rivetdev/sandbox-agent:0.4.0-full \ + rivetdev/sandbox-agent:0.4.2-full \ server --no-token --host 0.0.0.0 --port 3000 ``` -The `0.4.0-full` tag pins the exact version. The moving `full` tag is also published for contributors who want the latest full image. +The `0.4.2-full` tag pins the exact version. The moving `full` tag is also published for contributors who want the latest full image. 
-## TypeScript with the Docker provider +If you also want the desktop API inside the container, install desktop dependencies before starting the server: ```bash -npm install sandbox-agent@0.3.x dockerode get-port +docker run --rm -p 3000:3000 \ + -e ANTHROPIC_API_KEY="$ANTHROPIC_API_KEY" \ + -e OPENAI_API_KEY="$OPENAI_API_KEY" \ + node:22-bookworm-slim sh -c "\ + apt-get update && \ + DEBIAN_FRONTEND=noninteractive apt-get install -y curl ca-certificates bash libstdc++6 && \ + rm -rf /var/lib/apt/lists/* && \ + curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && \ + sandbox-agent install desktop --yes && \ + sandbox-agent server --no-token --host 0.0.0.0 --port 3000" ``` -```typescript -import { SandboxAgent } from "sandbox-agent"; -import { docker } from "sandbox-agent/docker"; +In a Dockerfile: -const sdk = await SandboxAgent.start({ - sandbox: docker({ - env: [ - `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}`, - `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}`, - ].filter(Boolean), - }), +```dockerfile +RUN sandbox-agent install desktop --yes +``` + +## TypeScript with dockerode + +```typescript +import Docker from "dockerode"; +import { SandboxAgent } from "sandbox-agent"; + +const docker = new Docker(); +const PORT = 3000; + +const container = await docker.createContainer({ + Image: "rivetdev/sandbox-agent:0.4.2-full", + Cmd: ["server", "--no-token", "--host", "0.0.0.0", "--port", `${PORT}`], + Env: [ + `ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY}`, + `OPENAI_API_KEY=${process.env.OPENAI_API_KEY}`, + `CODEX_API_KEY=${process.env.CODEX_API_KEY}`, + ].filter(Boolean), + ExposedPorts: { [`${PORT}/tcp`]: {} }, + HostConfig: { + AutoRemove: true, + PortBindings: { [`${PORT}/tcp`]: [{ HostPort: `${PORT}` }] }, + }, }); -try { - const session = await sdk.createSession({ agent: "codex" }); - await session.prompt([{ type: "text", text: "Summarize this repository." 
}]); -} finally { - await sdk.destroySandbox(); -} -``` +await container.start(); -The `docker` provider uses the `rivetdev/sandbox-agent:0.4.0-full` image by default. Override with `image`: +const baseUrl = `http://127.0.0.1:${PORT}`; +const sdk = await SandboxAgent.connect({ baseUrl }); -```typescript -docker({ image: "my-custom-image:latest" }) +const session = await sdk.createSession({ agent: "codex" }); +await session.prompt([{ type: "text", text: "Summarize this repository." }]); ``` ## Building a custom image with everything preinstalled @@ -65,7 +86,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \ bash ca-certificates curl git && \ rm -rf /var/lib/apt/lists/* -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh && \ +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh && \ sandbox-agent install-agent --all RUN useradd -m -s /bin/bash sandbox diff --git a/docs/deploy/e2b.mdx b/docs/deploy/e2b.mdx index 4e056ee..225cfdc 100644 --- a/docs/deploy/e2b.mdx +++ b/docs/deploy/e2b.mdx @@ -11,7 +11,7 @@ description: "Deploy Sandbox Agent inside an E2B sandbox." ## TypeScript example ```bash -npm install sandbox-agent@0.3.x @e2b/code-interpreter +npm install sandbox-agent@0.4.x @e2b/code-interpreter ``` ```typescript @@ -21,9 +21,11 @@ import { e2b } from "sandbox-agent/e2b"; const envs: Record = {}; if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY; +const template = process.env.E2B_TEMPLATE; const sdk = await SandboxAgent.start({ sandbox: e2b({ + template, create: { envs }, }), }); @@ -39,9 +41,12 @@ try { } ``` -The `e2b` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. +The `e2b` provider handles sandbox creation, Sandbox Agent installation, agent setup, and server startup automatically. 
Sandboxes pause by default instead of being deleted, and reconnecting with the same `sandboxId` resumes them automatically. + +Pass `template` when you want to start from a custom E2B template alias or template ID. E2B base-image selection happens when you build the template; `sandbox-agent/e2b` then uses that template at sandbox creation time. ## Faster cold starts For faster startup, create a custom E2B template with Sandbox Agent and target agents pre-installed. -See [E2B Custom Templates](https://e2b.dev/docs/sandbox-template). +Build System 2.0 also lets you choose the template's base image in code. +See [E2B Custom Templates](https://e2b.dev/docs/sandbox-template) and [E2B Base Images](https://e2b.dev/docs/template/base-image). diff --git a/docs/deploy/foundry-self-hosting.mdx b/docs/deploy/foundry-self-hosting.mdx deleted file mode 100644 index 8fd43ae..0000000 --- a/docs/deploy/foundry-self-hosting.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: "Foundry Self-Hosting" -description: "Environment, credentials, and deployment setup for Sandbox Agent Foundry auth, GitHub, and billing." ---- - -This guide documents the deployment contract for the Foundry product surface: app auth, GitHub onboarding, repository import, and billing. - -It also covers the local-development bootstrap that uses `.env.development` only when `NODE_ENV=development`. - -## Local Development - -For backend local development, the Foundry backend now supports a development-only dotenv bootstrap: - -- It loads `.env.development.local` and `.env.development` -- It does this **only** when `NODE_ENV=development` -- It does **not** load dotenv files in production - -The example file lives at [`/.env.development.example`](https://github.com/rivet-dev/sandbox-agent/blob/main/.env.development.example).
- -To use it locally: - -```bash -cp .env.development.example .env.development -``` - -Run the backend with: - -```bash -just foundry-backend-start -``` - -That recipe sets `NODE_ENV=development`, which enables the dotenv loader. - -### Local Defaults - -These values can be safely defaulted for local development: - -- `APP_URL=http://localhost:4173` -- `BETTER_AUTH_URL=http://localhost:7741` -- `BETTER_AUTH_SECRET=sandbox-agent-foundry-development-only-change-me` -- `GITHUB_REDIRECT_URI=http://localhost:7741/v1/auth/callback/github` - -These should be treated as development-only values. - -## Production Environment - -For production or self-hosting, set these as real environment variables in your deployment platform. Do not rely on dotenv file loading. - -### App/Auth - -| Variable | Required | Notes | -|---|---:|---| -| `APP_URL` | Yes | Public frontend origin | -| `BETTER_AUTH_URL` | Yes | Public auth base URL | -| `BETTER_AUTH_SECRET` | Yes | Strong random secret for auth/session signing | - -### GitHub OAuth - -| Variable | Required | Notes | -|---|---:|---| -| `GITHUB_CLIENT_ID` | Yes | GitHub OAuth app client id | -| `GITHUB_CLIENT_SECRET` | Yes | GitHub OAuth app client secret | -| `GITHUB_REDIRECT_URI` | Yes | GitHub OAuth callback URL | - -Use GitHub OAuth for: - -- user sign-in -- user identity -- org selection -- access to the signed-in user’s GitHub context - -## GitHub App - -If your Foundry deployment uses GitHub App-backed organization install and repo import, also configure: - -| Variable | Required | Notes | -|---|---:|---| -| `GITHUB_APP_ID` | Yes | GitHub App id | -| `GITHUB_APP_CLIENT_ID` | Yes | GitHub App client id | -| `GITHUB_APP_CLIENT_SECRET` | Yes | GitHub App client secret | -| `GITHUB_APP_PRIVATE_KEY` | Yes | PEM private key for installation auth | - -For `.env.development` and `.env.development.local`, store `GITHUB_APP_PRIVATE_KEY` as a quoted single-line value with `\n` escapes instead of raw multi-line PEM text. 
- -Recommended GitHub App permissions: - -- Repository `Metadata: Read` -- Repository `Contents: Read & Write` -- Repository `Pull requests: Read & Write` -- Repository `Checks: Read` -- Repository `Commit statuses: Read` - -Set the webhook URL to `https:///v1/webhooks/github` and generate a webhook secret. Store the secret as `GITHUB_WEBHOOK_SECRET`. - -This is required, not optional. Foundry depends on GitHub App webhook delivery for installation lifecycle changes, repo access changes, and ongoing repo / pull request sync. If the GitHub App is not installed for the workspace, or webhook delivery is misconfigured, Foundry will remain in an install / reconnect state and core GitHub-backed functionality will not work correctly. - -Recommended webhook subscriptions: - -- `installation` -- `installation_repositories` -- `pull_request` -- `pull_request_review` -- `pull_request_review_comment` -- `push` -- `create` -- `delete` -- `check_suite` -- `check_run` -- `status` - -Use the GitHub App for: - -- installation/reconnect state -- org repo import -- repository sync -- PR creation and updates - -Use GitHub OAuth for: - -- who the user is -- which orgs they can choose - -## Stripe - -For live billing, configure: - -| Variable | Required | Notes | -|---|---:|---| -| `STRIPE_SECRET_KEY` | Yes | Server-side Stripe secret key | -| `STRIPE_PUBLISHABLE_KEY` | Yes | Client-side Stripe publishable key | -| `STRIPE_WEBHOOK_SECRET` | Yes | Signing secret for billing webhooks | -| `STRIPE_PRICE_TEAM` | Yes | Stripe price id for the Team plan checkout session | - -Stripe should own: - -- hosted checkout -- billing portal -- subscription status -- invoice history -- webhook-driven state sync - -## Mock Invariant - -Foundry’s mock client path should continue to work end to end even when the real auth/GitHub/Stripe path exists. 
- -That includes: - -- sign-in -- org selection/import -- settings -- billing UI -- workspace/task/session flow -- seat accrual - -Use mock mode for deterministic UI review and local product development. Use the real env-backed path for integration and self-hosting. diff --git a/docs/deploy/local.mdx b/docs/deploy/local.mdx index 90e2ba6..6ecdb09 100644 --- a/docs/deploy/local.mdx +++ b/docs/deploy/local.mdx @@ -9,7 +9,7 @@ For local development, run Sandbox Agent directly on your machine. ```bash # Install -curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh # Run sandbox-agent server --no-token --host 127.0.0.1 --port 2468 @@ -20,12 +20,12 @@ Or with npm/Bun: ```bash - npx @sandbox-agent/cli@0.3.x server --no-token --host 127.0.0.1 --port 2468 + npx @sandbox-agent/cli@0.4.x server --no-token --host 127.0.0.1 --port 2468 ``` ```bash - bunx @sandbox-agent/cli@0.3.x server --no-token --host 127.0.0.1 --port 2468 + bunx @sandbox-agent/cli@0.4.x server --no-token --host 127.0.0.1 --port 2468 ``` diff --git a/docs/deploy/modal.mdx b/docs/deploy/modal.mdx index 02a3828..5850fd8 100644 --- a/docs/deploy/modal.mdx +++ b/docs/deploy/modal.mdx @@ -11,7 +11,7 @@ description: "Deploy Sandbox Agent inside a Modal sandbox." ## TypeScript example ```bash -npm install sandbox-agent@0.3.x modal +npm install sandbox-agent@0.4.x modal ``` ```typescript @@ -21,9 +21,11 @@ import { modal } from "sandbox-agent/modal"; const secrets: Record = {}; if (process.env.ANTHROPIC_API_KEY) secrets.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; if (process.env.OPENAI_API_KEY) secrets.OPENAI_API_KEY = process.env.OPENAI_API_KEY; +const baseImage = process.env.MODAL_BASE_IMAGE ?? 
"node:22-slim"; const sdk = await SandboxAgent.start({ sandbox: modal({ + image: baseImage, create: { secrets }, }), }); @@ -40,6 +42,7 @@ try { ``` The `modal` provider handles app creation, image building, sandbox provisioning, agent installation, server startup, and tunnel networking automatically. +Set `image` to change the base Docker image before Sandbox Agent and its agent binaries are layered on top. You can also pass a prebuilt Modal `Image` object. ## Faster cold starts diff --git a/docs/deploy/vercel.mdx b/docs/deploy/vercel.mdx index db97236..ec931d8 100644 --- a/docs/deploy/vercel.mdx +++ b/docs/deploy/vercel.mdx @@ -11,7 +11,7 @@ description: "Deploy Sandbox Agent inside a Vercel Sandbox." ## TypeScript example ```bash -npm install sandbox-agent@0.3.x @vercel/sandbox +npm install sandbox-agent@0.4.x @vercel/sandbox ``` ```typescript diff --git a/docs/docs.json b/docs/docs.json index 16620fe..dbcc407 100644 --- a/docs/docs.json +++ b/docs/docs.json @@ -1,6 +1,6 @@ { "$schema": "https://mintlify.com/docs.json", - "theme": "willow", + "theme": "mint", "name": "Sandbox Agent SDK", "appearance": { "default": "dark", @@ -8,8 +8,8 @@ }, "colors": { "primary": "#ff4f00", - "light": "#ff4f00", - "dark": "#ff4f00" + "light": "#ff6a2a", + "dark": "#cc3f00" }, "favicon": "/favicon.svg", "logo": { @@ -25,17 +25,13 @@ }, "navbar": { "links": [ - { - "label": "Gigacode", - "icon": "terminal", - "href": "https://github.com/rivet-dev/sandbox-agent/tree/main/gigacode" - }, { "label": "Discord", "icon": "discord", "href": "https://discord.gg/auCecybynK" }, { + "label": "GitHub", "type": "github", "href": "https://github.com/rivet-dev/sandbox-agent" } @@ -87,15 +83,12 @@ }, { "group": "System", - "pages": ["file-system", "processes"] - }, - { - "group": "Orchestration", - "pages": ["orchestration-architecture", "session-persistence", "observability", "multiplayer", "security"] + "pages": ["file-system", "processes", "computer-use", "common-software"] }, { "group": 
"Reference", "pages": [ + "troubleshooting", "architecture", "cli", "inspector", @@ -127,5 +120,11 @@ ] } ] - } + }, + "__removed": [ + { + "group": "Orchestration", + "pages": ["orchestration-architecture", "session-persistence", "observability", "multiplayer", "security"] + } + ] } diff --git a/docs/gigacode.mdx b/docs/gigacode.mdx deleted file mode 100644 index ccc9e39..0000000 --- a/docs/gigacode.mdx +++ /dev/null @@ -1,6 +0,0 @@ ---- -title: Gigacode -url: "https://github.com/rivet-dev/sandbox-agent/tree/main/gigacode" ---- - - diff --git a/docs/inspector.mdx b/docs/inspector.mdx index cc5f3d0..1412c21 100644 --- a/docs/inspector.mdx +++ b/docs/inspector.mdx @@ -35,6 +35,7 @@ console.log(url); - Prompt testing - Request/response debugging - Interactive permission prompts (approve, always-allow, or reject tool-use requests) +- Desktop panel for status, remediation, start/stop, and screenshot refresh - Process management (create, stop, kill, delete, view logs) - Interactive PTY terminal for tty processes - One-shot command execution @@ -50,3 +51,16 @@ console.log(url); The Inspector includes an embedded Ghostty-based terminal for interactive tty processes. The UI uses the SDK's high-level `connectProcessTerminal(...)` wrapper via the shared `@sandbox-agent/react` `ProcessTerminal` component. + +## Desktop panel + +The `Desktop` panel shows the current desktop runtime state, missing dependencies, +the suggested install command, last error details, process/log paths, and the +latest captured screenshot. 
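The panel drives the `/v1/desktop/*` HTTP endpoints documented in `docs/openapi.json` (for example `GET /v1/desktop/display/info` and `POST /v1/desktop/mouse/click`), so the same actions can be scripted directly. A minimal sketch — the `desktop_url` helper is illustrative, and the click body fields (`x`, `y`) are an assumption to verify against the `DesktopMouseClickRequest` schema:

```shell
# Base URL of a running sandbox-agent server (adjust host/port as needed).
BASE_URL="${SANDBOX_AGENT_URL:-http://127.0.0.1:3000}"

# desktop_url: build the full URL for a desktop endpoint path.
desktop_url() {
  printf '%s/v1/desktop/%s' "$BASE_URL" "$1"
}

# With a server running, query the display, then click at (100, 200):
# curl -fsS "$(desktop_url display/info)"
# curl -fsS -X POST "$(desktop_url mouse/click)" \
#   -H 'content-type: application/json' \
#   -d '{"x": 100, "y": 200}'
```

A `409` response means the desktop runtime is not ready yet; start it from the panel (or the install command it suggests) first.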
+ +Use it to: + +- Check whether desktop dependencies are installed +- Start or stop the managed desktop runtime +- Refresh desktop status +- Capture a fresh screenshot on demand diff --git a/docs/openapi.json b/docs/openapi.json index e432a84..3624707 100644 --- a/docs/openapi.json +++ b/docs/openapi.json @@ -10,7 +10,7 @@ "license": { "name": "Apache-2.0" }, - "version": "0.4.0" + "version": "0.4.2" }, "servers": [ { @@ -628,6 +628,1814 @@ } } }, + "/v1/desktop/clipboard": { + "get": { + "tags": ["v1"], + "summary": "Read the desktop clipboard.", + "description": "Returns the current text content of the X11 clipboard.", + "operationId": "get_v1_desktop_clipboard", + "parameters": [ + { + "name": "selection", + "in": "query", + "required": false, + "schema": { + "type": "string", + "nullable": true + } + } + ], + "responses": { + "200": { + "description": "Clipboard contents", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopClipboardResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "500": { + "description": "Clipboard read failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + }, + "post": { + "tags": ["v1"], + "summary": "Write to the desktop clipboard.", + "description": "Sets the text content of the X11 clipboard.", + "operationId": "post_v1_desktop_clipboard", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopClipboardWriteRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Clipboard updated", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopActionResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": 
{ + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "500": { + "description": "Clipboard write failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/display/info": { + "get": { + "tags": ["v1"], + "summary": "Get desktop display information.", + "description": "Performs a health-gated display query against the managed desktop and\nreturns the current display identifier and resolution.", + "operationId": "get_v1_desktop_display_info", + "responses": { + "200": { + "description": "Desktop display information", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopDisplayInfoResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "503": { + "description": "Desktop runtime health or display query failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/keyboard/down": { + "post": { + "tags": ["v1"], + "summary": "Press and hold a desktop keyboard key.", + "description": "Performs a health-gated `xdotool keydown` operation against the managed\ndesktop.", + "operationId": "post_v1_desktop_keyboard_down", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopKeyboardDownRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop keyboard action result", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopActionResponse" + } + } + } + }, + "400": { + "description": "Invalid keyboard down request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + 
} + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/keyboard/press": { + "post": { + "tags": ["v1"], + "summary": "Press a desktop keyboard shortcut.", + "description": "Performs a health-gated `xdotool key` operation against the managed\ndesktop.", + "operationId": "post_v1_desktop_keyboard_press", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopKeyboardPressRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop keyboard action result", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopActionResponse" + } + } + } + }, + "400": { + "description": "Invalid keyboard press request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/keyboard/type": { + "post": { + "tags": ["v1"], + "summary": "Type desktop keyboard text.", + "description": "Performs a health-gated `xdotool type` operation against the managed\ndesktop.", + "operationId": "post_v1_desktop_keyboard_type", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopKeyboardTypeRequest" + } + } + }, + "required": 
true + }, + "responses": { + "200": { + "description": "Desktop keyboard action result", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopActionResponse" + } + } + } + }, + "400": { + "description": "Invalid keyboard type request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/keyboard/up": { + "post": { + "tags": ["v1"], + "summary": "Release a desktop keyboard key.", + "description": "Performs a health-gated `xdotool keyup` operation against the managed\ndesktop.", + "operationId": "post_v1_desktop_keyboard_up", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopKeyboardUpRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop keyboard action result", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopActionResponse" + } + } + } + }, + "400": { + "description": "Invalid keyboard up request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/launch": { + "post": { + 
"tags": ["v1"], + "summary": "Launch a desktop application.", + "description": "Launches an application by name on the managed desktop, optionally waiting\nfor its window to appear.", + "operationId": "post_v1_desktop_launch", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopLaunchRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Application launched", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopLaunchResponse" + } + } + } + }, + "404": { + "description": "Application not found", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/click": { + "post": { + "tags": ["v1"], + "summary": "Click on the desktop.", + "description": "Performs a health-gated pointer move and click against the managed desktop\nand returns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_click", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseClickRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after click", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse click request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": 
"Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/down": { + "post": { + "tags": ["v1"], + "summary": "Press and hold a desktop mouse button.", + "description": "Performs a health-gated optional pointer move followed by `xdotool mousedown`\nand returns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_down", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseDownRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after button press", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse down request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/drag": { + "post": { + "tags": ["v1"], + "summary": "Drag the desktop mouse.", + "description": "Performs a health-gated drag gesture against the managed desktop and\nreturns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_drag", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseDragRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after drag", + "content": { + "application/json": { + "schema": { + 
"$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse drag request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/move": { + "post": { + "tags": ["v1"], + "summary": "Move the desktop mouse.", + "description": "Performs a health-gated absolute pointer move on the managed desktop and\nreturns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_move", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseMoveRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after move", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse move request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/position": { + "get": { + "tags": ["v1"], + "summary": "Get the current desktop mouse position.", + "description": "Performs a 
health-gated mouse position query against the managed desktop.", + "operationId": "get_v1_desktop_mouse_position", + "responses": { + "200": { + "description": "Desktop mouse position", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input check failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/scroll": { + "post": { + "tags": ["v1"], + "summary": "Scroll the desktop mouse wheel.", + "description": "Performs a health-gated scroll gesture at the requested coordinates and\nreturns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_scroll", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseScrollRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after scroll", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse scroll request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/mouse/up": { + "post": { + "tags": ["v1"], + 
"summary": "Release a desktop mouse button.", + "description": "Performs a health-gated optional pointer move followed by `xdotool mouseup`\nand returns the resulting mouse position.", + "operationId": "post_v1_desktop_mouse_up", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMouseUpRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop mouse position after button release", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopMousePositionResponse" + } + } + } + }, + "400": { + "description": "Invalid mouse up request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or input failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/open": { + "post": { + "tags": ["v1"], + "summary": "Open a file or URL with the default handler.", + "description": "Opens a file path or URL using xdg-open on the managed desktop.", + "operationId": "post_v1_desktop_open", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopOpenRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Target opened", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopOpenResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/recording/start": { + "post": { + 
"tags": ["v1"], + "summary": "Start desktop recording.", + "description": "Starts an ffmpeg x11grab recording against the managed desktop and returns\nthe created recording metadata.", + "operationId": "post_v1_desktop_recording_start", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopRecordingStartRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop recording started", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopRecordingInfo" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready or a recording is already active", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop recording failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/recording/stop": { + "post": { + "tags": ["v1"], + "summary": "Stop desktop recording.", + "description": "Stops the active desktop recording and returns the finalized recording\nmetadata.", + "operationId": "post_v1_desktop_recording_stop", + "responses": { + "200": { + "description": "Desktop recording stopped", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopRecordingInfo" + } + } + } + }, + "409": { + "description": "No active desktop recording", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop recording stop failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/recordings": { + "get": { + "tags": ["v1"], + "summary": "List desktop recordings.", + "description": "Returns the current desktop recording catalog.", + 
"operationId": "get_v1_desktop_recordings", + "responses": { + "200": { + "description": "Desktop recordings", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopRecordingListResponse" + } + } + } + }, + "502": { + "description": "Desktop recordings query failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/recordings/{id}": { + "get": { + "tags": ["v1"], + "summary": "Get desktop recording metadata.", + "description": "Returns metadata for a single desktop recording.", + "operationId": "get_v1_desktop_recording", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "Desktop recording ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "responses": { + "200": { + "description": "Desktop recording metadata", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopRecordingInfo" + } + } + } + }, + "404": { + "description": "Unknown desktop recording", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + }, + "delete": { + "tags": ["v1"], + "summary": "Delete a desktop recording.", + "description": "Removes a completed desktop recording and its file from disk.", + "operationId": "delete_v1_desktop_recording", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "Desktop recording ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "responses": { + "204": { + "description": "Desktop recording deleted" + }, + "404": { + "description": "Unknown desktop recording", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop recording is still active", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } 
+ } + } + } + } + }, + "/v1/desktop/recordings/{id}/download": { + "get": { + "tags": ["v1"], + "summary": "Download a desktop recording.", + "description": "Serves the recorded MP4 bytes for a completed desktop recording.", + "operationId": "get_v1_desktop_recording_download", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "Desktop recording ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "responses": { + "200": { + "description": "Desktop recording as MP4 bytes" + }, + "404": { + "description": "Unknown desktop recording", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/screenshot": { + "get": { + "tags": ["v1"], + "summary": "Capture a full desktop screenshot.", + "description": "Performs a health-gated full-frame screenshot of the managed desktop and\nreturns the requested image bytes.", + "operationId": "get_v1_desktop_screenshot", + "parameters": [ + { + "name": "format", + "in": "query", + "required": false, + "schema": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopScreenshotFormat" + } + ], + "nullable": true + } + }, + { + "name": "quality", + "in": "query", + "required": false, + "schema": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + } + }, + { + "name": "scale", + "in": "query", + "required": false, + "schema": { + "type": "number", + "format": "float", + "nullable": true + } + }, + { + "name": "showCursor", + "in": "query", + "required": false, + "schema": { + "type": "boolean", + "nullable": true + } + } + ], + "responses": { + "200": { + "description": "Desktop screenshot as image bytes" + }, + "400": { + "description": "Invalid screenshot query", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { 
+ "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or screenshot capture failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/screenshot/region": { + "get": { + "tags": ["v1"], + "summary": "Capture a desktop screenshot region.", + "description": "Performs a health-gated screenshot crop against the managed desktop and\nreturns the requested region image bytes.", + "operationId": "get_v1_desktop_screenshot_region", + "parameters": [ + { + "name": "x", + "in": "query", + "required": true, + "schema": { + "type": "integer", + "format": "int32" + } + }, + { + "name": "y", + "in": "query", + "required": true, + "schema": { + "type": "integer", + "format": "int32" + } + }, + { + "name": "width", + "in": "query", + "required": true, + "schema": { + "type": "integer", + "format": "int32", + "minimum": 0 + } + }, + { + "name": "height", + "in": "query", + "required": true, + "schema": { + "type": "integer", + "format": "int32", + "minimum": 0 + } + }, + { + "name": "format", + "in": "query", + "required": false, + "schema": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopScreenshotFormat" + } + ], + "nullable": true + } + }, + { + "name": "quality", + "in": "query", + "required": false, + "schema": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + } + }, + { + "name": "scale", + "in": "query", + "required": false, + "schema": { + "type": "number", + "format": "float", + "nullable": true + } + }, + { + "name": "showCursor", + "in": "query", + "required": false, + "schema": { + "type": "boolean", + "nullable": true + } + } + ], + "responses": { + "200": { + "description": "Desktop screenshot region as image bytes" + }, + "400": { + "description": "Invalid screenshot region", + "content": { + "application/json": { + "schema": { + "$ref": 
"#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop runtime health or screenshot capture failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/start": { + "post": { + "tags": ["v1"], + "summary": "Start the private desktop runtime.", + "description": "Lazily launches the managed Xvfb/openbox stack, validates display health,\nand returns the resulting desktop status snapshot.", + "operationId": "post_v1_desktop_start", + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStartRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Desktop runtime status after start", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStatusResponse" + } + } + } + }, + "400": { + "description": "Invalid desktop start request", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is already transitioning", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "501": { + "description": "Desktop API unsupported on this platform", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "503": { + "description": "Desktop runtime could not be started", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/status": { + "get": { + "tags": ["v1"], + "summary": "Get desktop runtime status.", + "description": "Returns the 
current desktop runtime state, dependency status, active\ndisplay metadata, and supervised process information.", + "operationId": "get_v1_desktop_status", + "responses": { + "200": { + "description": "Desktop runtime status", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStatusResponse" + } + } + } + }, + "401": { + "description": "Authentication required", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/stop": { + "post": { + "tags": ["v1"], + "summary": "Stop the private desktop runtime.", + "description": "Terminates the managed openbox/Xvfb/dbus processes owned by the desktop\nruntime and returns the resulting status snapshot.", + "operationId": "post_v1_desktop_stop", + "responses": { + "200": { + "description": "Desktop runtime status after stop", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStatusResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is already transitioning", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/stream/signaling": { + "get": { + "tags": ["v1"], + "summary": "Open a desktop WebRTC signaling session.", + "description": "Upgrades the connection to a WebSocket used for WebRTC signaling between\nthe browser client and the desktop streaming process. 
Also accepts mouse\nand keyboard input frames as a fallback transport.", + "operationId": "get_v1_desktop_stream_ws", + "parameters": [ + { + "name": "access_token", + "in": "query", + "description": "Bearer token alternative for WS auth", + "required": false, + "schema": { + "type": "string", + "nullable": true + } + } + ], + "responses": { + "101": { + "description": "WebSocket upgraded" + }, + "409": { + "description": "Desktop runtime or streaming session is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "502": { + "description": "Desktop stream failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/stream/start": { + "post": { + "tags": ["v1"], + "summary": "Start desktop streaming.", + "description": "Enables desktop websocket streaming for the managed desktop.", + "operationId": "post_v1_desktop_stream_start", + "responses": { + "200": { + "description": "Desktop streaming started", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStreamStatusResponse" + } + } + } + } + } + } + }, + "/v1/desktop/stream/status": { + "get": { + "tags": ["v1"], + "summary": "Get desktop stream status.", + "description": "Returns the current state of the desktop WebRTC streaming session.", + "operationId": "get_v1_desktop_stream_status", + "responses": { + "200": { + "description": "Desktop stream status", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStreamStatusResponse" + } + } + } + } + } + } + }, + "/v1/desktop/stream/stop": { + "post": { + "tags": ["v1"], + "summary": "Stop desktop streaming.", + "description": "Disables desktop websocket streaming for the managed desktop.", + "operationId": "post_v1_desktop_stream_stop", + "responses": { + "200": { + "description": "Desktop streaming stopped", + 
"content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopStreamStatusResponse" + } + } + } + } + } + } + }, + "/v1/desktop/windows": { + "get": { + "tags": ["v1"], + "summary": "List visible desktop windows.", + "description": "Performs a health-gated visible-window enumeration against the managed\ndesktop and returns the current window metadata.", + "operationId": "get_v1_desktop_windows", + "responses": { + "200": { + "description": "Visible desktop windows", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowListResponse" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "503": { + "description": "Desktop runtime health or window query failed", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/windows/focused": { + "get": { + "tags": ["v1"], + "summary": "Get the currently focused desktop window.", + "description": "Returns information about the window that currently has input focus.", + "operationId": "get_v1_desktop_windows_focused", + "responses": { + "200": { + "description": "Focused window info", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowInfo" + } + } + } + }, + "404": { + "description": "No window is focused", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/windows/{id}/focus": { + "post": { + "tags": ["v1"], + "summary": "Focus a desktop window.", + "description": "Brings the specified window to the foreground and 
gives it input focus.", + "operationId": "post_v1_desktop_window_focus", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "X11 window ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "responses": { + "200": { + "description": "Window info after focus", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowInfo" + } + } + } + }, + "404": { + "description": "Window not found", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/windows/{id}/move": { + "post": { + "tags": ["v1"], + "summary": "Move a desktop window.", + "description": "Moves the specified window to the given position.", + "operationId": "post_v1_desktop_window_move", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "X11 window ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowMoveRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Window info after move", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowInfo" + } + } + } + }, + "404": { + "description": "Window not found", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, + "/v1/desktop/windows/{id}/resize": { + "post": { + "tags": ["v1"], + "summary": "Resize a desktop window.", + "description": 
"Resizes the specified window to the given dimensions.", + "operationId": "post_v1_desktop_window_resize", + "parameters": [ + { + "name": "id", + "in": "path", + "description": "X11 window ID", + "required": true, + "schema": { + "type": "string" + } + } + ], + "requestBody": { + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowResizeRequest" + } + } + }, + "required": true + }, + "responses": { + "200": { + "description": "Window info after resize", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/DesktopWindowInfo" + } + } + } + }, + "404": { + "description": "Window not found", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + }, + "409": { + "description": "Desktop runtime is not ready", + "content": { + "application/json": { + "schema": { + "$ref": "#/components/schemas/ProblemDetails" + } + } + } + } + } + } + }, "/v1/fs/entries": { "get": { "tags": ["v1"], @@ -911,6 +2719,21 @@ "summary": "List all managed processes.", "description": "Returns a list of all processes (running and exited) currently tracked\nby the runtime, sorted by process ID.", "operationId": "get_v1_processes", + "parameters": [ + { + "name": "owner", + "in": "query", + "required": false, + "schema": { + "allOf": [ + { + "$ref": "#/components/schemas/ProcessOwner" + } + ], + "nullable": true + } + } + ], "responses": { "200": { "description": "List processes", @@ -1934,6 +3757,769 @@ } } }, + "DesktopActionResponse": { + "type": "object", + "required": ["ok"], + "properties": { + "ok": { + "type": "boolean" + } + } + }, + "DesktopClipboardQuery": { + "type": "object", + "properties": { + "selection": { + "type": "string", + "nullable": true + } + } + }, + "DesktopClipboardResponse": { + "type": "object", + "required": ["text", "selection"], + "properties": { + "selection": { + "type": "string" + }, + "text": { + "type": "string" + } + } + }, + 
"DesktopClipboardWriteRequest": { + "type": "object", + "required": ["text"], + "properties": { + "selection": { + "type": "string", + "nullable": true + }, + "text": { + "type": "string" + } + } + }, + "DesktopDisplayInfoResponse": { + "type": "object", + "required": ["display", "resolution"], + "properties": { + "display": { + "type": "string" + }, + "resolution": { + "$ref": "#/components/schemas/DesktopResolution" + } + } + }, + "DesktopErrorInfo": { + "type": "object", + "required": ["code", "message"], + "properties": { + "code": { + "type": "string" + }, + "message": { + "type": "string" + } + } + }, + "DesktopKeyModifiers": { + "type": "object", + "properties": { + "alt": { + "type": "boolean", + "nullable": true + }, + "cmd": { + "type": "boolean", + "nullable": true + }, + "ctrl": { + "type": "boolean", + "nullable": true + }, + "shift": { + "type": "boolean", + "nullable": true + } + } + }, + "DesktopKeyboardDownRequest": { + "type": "object", + "required": ["key"], + "properties": { + "key": { + "type": "string" + } + } + }, + "DesktopKeyboardPressRequest": { + "type": "object", + "required": ["key"], + "properties": { + "key": { + "type": "string" + }, + "modifiers": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopKeyModifiers" + } + ], + "nullable": true + } + } + }, + "DesktopKeyboardTypeRequest": { + "type": "object", + "required": ["text"], + "properties": { + "delayMs": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "text": { + "type": "string" + } + } + }, + "DesktopKeyboardUpRequest": { + "type": "object", + "required": ["key"], + "properties": { + "key": { + "type": "string" + } + } + }, + "DesktopLaunchRequest": { + "type": "object", + "required": ["app"], + "properties": { + "app": { + "type": "string" + }, + "args": { + "type": "array", + "items": { + "type": "string" + }, + "nullable": true + }, + "wait": { + "type": "boolean", + "nullable": true + } + } + }, + "DesktopLaunchResponse": { + 
"type": "object", + "required": ["processId"], + "properties": { + "pid": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "processId": { + "type": "string" + }, + "windowId": { + "type": "string", + "nullable": true + } + } + }, + "DesktopMouseButton": { + "type": "string", + "enum": ["left", "middle", "right"] + }, + "DesktopMouseClickRequest": { + "type": "object", + "required": ["x", "y"], + "properties": { + "button": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopMouseButton" + } + ], + "nullable": true + }, + "clickCount": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopMouseDownRequest": { + "type": "object", + "properties": { + "button": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopMouseButton" + } + ], + "nullable": true + }, + "x": { + "type": "integer", + "format": "int32", + "nullable": true + }, + "y": { + "type": "integer", + "format": "int32", + "nullable": true + } + } + }, + "DesktopMouseDragRequest": { + "type": "object", + "required": ["startX", "startY", "endX", "endY"], + "properties": { + "button": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopMouseButton" + } + ], + "nullable": true + }, + "endX": { + "type": "integer", + "format": "int32" + }, + "endY": { + "type": "integer", + "format": "int32" + }, + "startX": { + "type": "integer", + "format": "int32" + }, + "startY": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopMouseMoveRequest": { + "type": "object", + "required": ["x", "y"], + "properties": { + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopMousePositionResponse": { + "type": "object", + "required": ["x", "y"], + "properties": { + "screen": { + "type": "integer", + "format": "int32", + "nullable": true + 
}, + "window": { + "type": "string", + "nullable": true + }, + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopMouseScrollRequest": { + "type": "object", + "required": ["x", "y"], + "properties": { + "deltaX": { + "type": "integer", + "format": "int32", + "nullable": true + }, + "deltaY": { + "type": "integer", + "format": "int32", + "nullable": true + }, + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopMouseUpRequest": { + "type": "object", + "properties": { + "button": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopMouseButton" + } + ], + "nullable": true + }, + "x": { + "type": "integer", + "format": "int32", + "nullable": true + }, + "y": { + "type": "integer", + "format": "int32", + "nullable": true + } + } + }, + "DesktopOpenRequest": { + "type": "object", + "required": ["target"], + "properties": { + "target": { + "type": "string" + } + } + }, + "DesktopOpenResponse": { + "type": "object", + "required": ["processId"], + "properties": { + "pid": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "processId": { + "type": "string" + } + } + }, + "DesktopProcessInfo": { + "type": "object", + "required": ["name", "running"], + "properties": { + "logPath": { + "type": "string", + "nullable": true + }, + "name": { + "type": "string" + }, + "pid": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "running": { + "type": "boolean" + } + } + }, + "DesktopRecordingInfo": { + "type": "object", + "required": ["id", "status", "fileName", "bytes", "startedAt"], + "properties": { + "bytes": { + "type": "integer", + "format": "int64", + "minimum": 0 + }, + "endedAt": { + "type": "string", + "nullable": true + }, + "fileName": { + "type": "string" + }, + "id": { + "type": "string" + }, + "processId": { + "type": "string", + "nullable": true + 
}, + "startedAt": { + "type": "string" + }, + "status": { + "$ref": "#/components/schemas/DesktopRecordingStatus" + } + } + }, + "DesktopRecordingListResponse": { + "type": "object", + "required": ["recordings"], + "properties": { + "recordings": { + "type": "array", + "items": { + "$ref": "#/components/schemas/DesktopRecordingInfo" + } + } + } + }, + "DesktopRecordingStartRequest": { + "type": "object", + "properties": { + "fps": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + } + } + }, + "DesktopRecordingStatus": { + "type": "string", + "enum": ["recording", "completed", "failed"] + }, + "DesktopRegionScreenshotQuery": { + "type": "object", + "required": ["x", "y", "width", "height"], + "properties": { + "format": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopScreenshotFormat" + } + ], + "nullable": true + }, + "height": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "quality": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "scale": { + "type": "number", + "format": "float", + "nullable": true + }, + "showCursor": { + "type": "boolean", + "nullable": true + }, + "width": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopResolution": { + "type": "object", + "required": ["width", "height"], + "properties": { + "dpi": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "height": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "width": { + "type": "integer", + "format": "int32", + "minimum": 0 + } + } + }, + "DesktopScreenshotFormat": { + "type": "string", + "enum": ["png", "jpeg", "webp"] + }, + "DesktopScreenshotQuery": { + "type": "object", + "properties": { + "format": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopScreenshotFormat" + } + ], + "nullable": true 
+ }, + "quality": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "scale": { + "type": "number", + "format": "float", + "nullable": true + }, + "showCursor": { + "type": "boolean", + "nullable": true + } + } + }, + "DesktopStartRequest": { + "type": "object", + "properties": { + "displayNum": { + "type": "integer", + "format": "int32", + "nullable": true + }, + "dpi": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "height": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "recordingFps": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "stateDir": { + "type": "string", + "nullable": true + }, + "streamAudioCodec": { + "type": "string", + "nullable": true + }, + "streamFrameRate": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + }, + "streamVideoCodec": { + "type": "string", + "nullable": true + }, + "webrtcPortRange": { + "type": "string", + "nullable": true + }, + "width": { + "type": "integer", + "format": "int32", + "nullable": true, + "minimum": 0 + } + } + }, + "DesktopState": { + "type": "string", + "enum": ["inactive", "install_required", "starting", "active", "stopping", "failed"] + }, + "DesktopStatusResponse": { + "type": "object", + "required": ["state"], + "properties": { + "display": { + "type": "string", + "nullable": true + }, + "installCommand": { + "type": "string", + "nullable": true + }, + "lastError": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopErrorInfo" + } + ], + "nullable": true + }, + "missingDependencies": { + "type": "array", + "items": { + "type": "string" + } + }, + "processes": { + "type": "array", + "items": { + "$ref": "#/components/schemas/DesktopProcessInfo" + } + }, + "resolution": { + "allOf": [ + { + "$ref": "#/components/schemas/DesktopResolution" + } + ], + "nullable": true + }, + "runtimeLogPath": { + "type": "string", + "nullable": 
true + }, + "startedAt": { + "type": "string", + "nullable": true + }, + "state": { + "$ref": "#/components/schemas/DesktopState" + }, + "windows": { + "type": "array", + "items": { + "$ref": "#/components/schemas/DesktopWindowInfo" + }, + "description": "Current visible windows (included when the desktop is active)." + } + } + }, + "DesktopStreamStatusResponse": { + "type": "object", + "required": ["active"], + "properties": { + "active": { + "type": "boolean" + }, + "processId": { + "type": "string", + "nullable": true + }, + "windowId": { + "type": "string", + "nullable": true + } + } + }, + "DesktopWindowInfo": { + "type": "object", + "required": ["id", "title", "x", "y", "width", "height", "isActive"], + "properties": { + "height": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "id": { + "type": "string" + }, + "isActive": { + "type": "boolean" + }, + "title": { + "type": "string" + }, + "width": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopWindowListResponse": { + "type": "object", + "required": ["windows"], + "properties": { + "windows": { + "type": "array", + "items": { + "$ref": "#/components/schemas/DesktopWindowInfo" + } + } + } + }, + "DesktopWindowMoveRequest": { + "type": "object", + "required": ["x", "y"], + "properties": { + "x": { + "type": "integer", + "format": "int32" + }, + "y": { + "type": "integer", + "format": "int32" + } + } + }, + "DesktopWindowResizeRequest": { + "type": "object", + "required": ["width", "height"], + "properties": { + "height": { + "type": "integer", + "format": "int32", + "minimum": 0 + }, + "width": { + "type": "integer", + "format": "int32", + "minimum": 0 + } + } + }, "ErrorType": { "type": "string", "enum": [ @@ -2326,7 +4912,7 @@ }, "ProcessInfo": { "type": "object", - "required": ["id", "command", "args", "tty", "interactive", "status", "createdAtMs"], 
+ "required": ["id", "command", "args", "tty", "interactive", "owner", "status", "createdAtMs"], "properties": { "args": { "type": "array", @@ -2361,6 +4947,9 @@ "interactive": { "type": "boolean" }, + "owner": { + "$ref": "#/components/schemas/ProcessOwner" + }, "pid": { "type": "integer", "format": "int32", @@ -2398,6 +4987,19 @@ } } }, + "ProcessListQuery": { + "type": "object", + "properties": { + "owner": { + "allOf": [ + { + "$ref": "#/components/schemas/ProcessOwner" + } + ], + "nullable": true + } + } + }, "ProcessListResponse": { "type": "object", "required": ["processes"], @@ -2484,6 +5086,10 @@ "type": "string", "enum": ["stdout", "stderr", "combined", "pty"] }, + "ProcessOwner": { + "type": "string", + "enum": ["user", "desktop", "system"] + }, "ProcessRunRequest": { "type": "object", "required": ["command"], diff --git a/docs/pi-support-plan.md b/docs/pi-support-plan.md deleted file mode 100644 index 5e207a5..0000000 --- a/docs/pi-support-plan.md +++ /dev/null @@ -1,210 +0,0 @@ -# Pi Agent Support Plan (pi-mono) - -## Implementation Status Update - -- Runtime selection now supports two internal modes: - - `PerSession` (default for unknown/non-allowlisted Pi capabilities) - - `Shared` (allowlist-only compatibility path) -- Pi sessions now use per-session process isolation by default, enabling true concurrent Pi sessions in Inspector and API clients. -- Shared Pi server code remains available and is used only when capability checks allow multiplexing. -- Session termination for per-session Pi mode hard-kills the underlying Pi process and clears queued prompts/pending waiters. -- In-session concurrent sends are serialized with an unbounded daemon-side FIFO queue per session. - -## Investigation Summary - -### Pi CLI modes and RPC protocol -- Pi supports multiple modes including interactive, print/JSON output, RPC, and SDK usage. 
JSON mode outputs a stream of JSON events suitable for parsing, and RPC mode is intended for programmatic control over stdin/stdout.
-- RPC mode is started with `pi --mode rpc` and supports options like `--provider`, `--model`, `--no-session`, and `--session-dir`.
-- The RPC protocol is newline-delimited JSON over stdin/stdout:
-  - Commands are JSON objects written to stdin.
-  - Responses are JSON objects with `type: "response"` and optional `id`.
-  - Events are JSON objects without `id`.
-- `prompt` can include images using `ImageContent` (base64 or URL) alongside text.
-- JSON/print mode (`pi -p` or `pi --print --mode json`) produces JSONL for non-interactive parsing and can resume sessions with a token.
-
-### RPC commands
-RPC commands listed in `rpc.md` include:
-- `new_session`, `get_state`, `list_sessions`, `delete_session`, `rename_session`, `clear_session`
-- `prompt`, `queue_message`, `abort`, `get_queued_messages`
-
-### RPC event types
-RPC events listed in `rpc.md` include:
-- `agent_start`, `agent_end`
-- `turn_start`, `turn_end`
-- `message_start`, `message_update`, `message_end`
-- `tool_execution_start`, `tool_execution_update`, `tool_execution_end`
-- `auto_compaction`, `auto_retry`, `hook_error`
-
-`message_update` uses `assistantMessageEvent` deltas such as:
-- `start`, `text_start`, `text_delta`, `text_end`
-- `thinking_start`, `thinking_delta`, `thinking_end`
-- `toolcall_start`, `toolcall_delta`, `toolcall_end`
-- `toolcall_args_start`, `toolcall_args_delta`, `toolcall_args_end`
-- `done`, `error`
-
-`tool_execution_update` includes `partialResult`, which is described as accumulated output so far.
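The newline-delimited framing described above can be sketched as a few small helpers. This is a sketch only: the `type: "response"` and `id` conventions come from `rpc.md`, while the helper names and the `RpcMessage` type are ours.

```typescript
// Minimal framing helpers for a newline-delimited JSON stdio protocol,
// matching the convention that responses carry `type: "response"` and
// events carry no `id`. Helper names are illustrative, not from pi-mono.

type RpcMessage = { type?: string; id?: string; [key: string]: unknown };

// Serialize a command for stdin: exactly one JSON object per line.
function encodeCommand(cmd: RpcMessage): string {
  return JSON.stringify(cmd) + "\n";
}

// Split a stdout chunk into parsed messages, keeping any trailing
// partial line so it can be prepended to the next chunk.
function decodeLines(buffer: string): { messages: RpcMessage[]; rest: string } {
  const parts = buffer.split("\n");
  const rest = parts.pop() ?? "";
  return { messages: parts.filter(Boolean).map((line) => JSON.parse(line)), rest };
}

// Responses are distinguished by `type: "response"`; everything else
// on the stream is an event.
function isResponse(msg: RpcMessage): boolean {
  return msg.type === "response";
}
```

In practice these would sit between a spawned `pi --mode rpc` child process and the daemon's event converter: commands go through `encodeCommand` to stdin, and stdout chunks are fed through `decodeLines`, with responses matched to pending command `id`s and events forwarded.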
- -### Schema source locations (pi-mono) -RPC types are documented as living in: -- `packages/ai/src/types.ts` (Model types) -- `packages/agent/src/types.ts` (AgentResponse types) -- `packages/coding-agent/src/core/messages.ts` (message types) -- `packages/coding-agent/src/modes/rpc/rpc-types.ts` (RPC protocol types) - -### Distribution assets -Pi releases provide platform-specific binaries such as: -- `pi-darwin-arm64`, `pi-darwin-x64` -- `pi-linux-arm64`, `pi-linux-x64` -- `pi-win-x64.zip` - -## Integration Decisions -- Follow the OpenCode pattern: a shared long-running process (stdio RPC) with session multiplexing. -- Primary integration path is RPC streaming (`pi --mode rpc`). -- JSON/print mode is a fallback only (diagnostics or non-interactive runs). -- Create sessions via `new_session`; store the returned `sessionId` as `native_session_id`. -- Use `get_state` as a re-sync path after server restarts. -- Use `prompt` for send-message, with optional image content. -- Convert Pi events into universal events; emit daemon synthetic `session.started` on session creation and `session.ended` only on errors/termination. - -## Implementation Plan - -### 1) Agent Identity + Capabilities -Files: -- `server/packages/agent-management/src/agents.rs` -- `server/packages/sandbox-agent/src/router.rs` -- `docs/cli.mdx`, `docs/conversion.mdx`, `docs/session-transcript-schema.mdx` -- `README.md`, `frontend/packages/website/src/components/FAQ.tsx` - -Tasks: -- Add `AgentId::Pi` with string/binary name `"pi"` and parsing rules. -- Add Pi to `all_agents()` and agent lists. 
-- Define `AgentCapabilities` for Pi: - - `tool_calls=true`, `tool_results=true` - - `text_messages=true`, `streaming_deltas=true`, `item_started=true` - - `reasoning=true` (from `thinking_*` deltas) - - `images=true` (ImageContent in `prompt`) - - `permissions=false`, `questions=false`, `mcp_tools=false` - - `shared_process=true`, `session_lifecycle=false` (no native session events) - - `error_events=true` (hook_error) - - `command_execution=false`, `file_changes=false`, `file_attachments=false` - -### 2) Installer and Binary Resolution -Files: -- `server/packages/agent-management/src/agents.rs` - -Tasks: -- Add `install_pi()` that: - - Downloads the correct release asset per platform (`pi-`). - - Handles `.zip` on Windows and raw binaries elsewhere. - - Marks binary executable. -- Add Pi to `AgentManager::install`, `is_installed`, `version`. -- Version detection: try `--version`, `version`, `-V`. - -### 3) Schema Extraction for Pi -Files: -- `resources/agent-schemas/src/pi.ts` (new) -- `resources/agent-schemas/src/index.ts` -- `resources/agent-schemas/artifacts/json-schema/pi.json` -- `server/packages/extracted-agent-schemas/build.rs` -- `server/packages/extracted-agent-schemas/src/lib.rs` - -Tasks: -- Implement `extractPiSchema()`: - - Download pi-mono sources (zip/tarball) into a temp dir. - - Use `ts-json-schema-generator` against `packages/coding-agent/src/modes/rpc/rpc-types.ts`. - - Include dependent files per `rpc.md` (ai/types, agent/types, core/messages). - - Extract `RpcEvent`, `RpcResponse`, `RpcCommand` unions (exact type names from source). -- Add fallback schema if remote fetch fails (minimal union with event/response fields). -- Wire pi into extractor index and artifact generation. 
- -### 4) Universal Schema Conversion (Pi -> Universal) -Files: -- `server/packages/universal-agent-schema/src/agents/pi.rs` (new) -- `server/packages/universal-agent-schema/src/agents/mod.rs` -- `server/packages/universal-agent-schema/src/lib.rs` -- `server/packages/sandbox-agent/src/router.rs` - -Mapping rules: -- `message_start` -> `item.started` (kind=message, role=assistant, native_item_id=messageId) -- `message_update`: - - `text_*` -> `item.delta` (assistant text delta) - - `thinking_*` -> `item.delta` with `ContentPart::Reasoning` (visibility=Private) - - `toolcall_*` and `toolcall_args_*` -> ignore for now (tool_execution_* is authoritative) - - `error` -> `item.completed` with `ItemStatus::Failed` (if no later message_end) -- `message_end` -> `item.completed` (finalize assistant message) -- `tool_execution_start` -> `item.started` (kind=tool_call, ContentPart::ToolCall) -- `tool_execution_update` -> `item.delta` for a synthetic tool_result item: - - Maintain a per-toolCallId buffer to compute delta from accumulated `partialResult`. -- `tool_execution_end` -> `item.completed` (kind=tool_result, output from `result.content`) - - If `isError=true`, set item status to failed. -- `agent_start`, `turn_start`, `turn_end`, `agent_end`, `auto_compaction`, `auto_retry`, `hook_error`: - - Map to `ItemKind::Status` with a label like `pi.agent_start`, `pi.auto_retry`, etc. - - Do not emit `session.ended` for these events. -- If event parsing fails, emit `agent.unparsed` (source=daemon, synthetic=true) and fail tests. - -### 5) Shared RPC Server Integration -Files: -- `server/packages/sandbox-agent/src/router.rs` - -Tasks: -- Add a new managed stdio server type for Pi, similar to Codex: - - Create `PiServer` struct with: - - stdin sender - - pending request map keyed by request id - - per-session native session id mapping - - Extend `ManagedServerKind` to include Pi. - - Add `ensure_pi_server()` and `spawn_pi_server()` using `pi --mode rpc`. 
- - Add a `handle_pi_server_output()` loop to parse stdout lines into events/responses. -- Session creation: - - On `create_session`, ensure Pi server is running, send `new_session`, store sessionId. - - Register session with `server_manager.register_session` for native mapping. -- Sending messages: - - Use `prompt` command; include sessionId and optional images. - - Emit synthetic `item.started` only if Pi does not emit `message_start`. - -### 6) Router + Streaming Path Changes -Files: -- `server/packages/sandbox-agent/src/router.rs` - -Tasks: -- Add Pi handling to: - - `create_session` (new_session) - - `send_message` (prompt) - - `parse_agent_line` (Pi event conversion) - - `agent_modes` (default to `default` unless Pi exposes a mode list) - - `agent_supports_resume` (true if Pi supports session resume) - -### 7) Tests -Files: -- `server/packages/sandbox-agent/tests/...` -- `server/packages/universal-agent-schema/tests/...` (if present) - -Tasks: -- Unit tests for conversion: - - `message_start/update/end` -> item.started/delta/completed - - `tool_execution_*` -> tool call/result mapping with partialResult delta - - failure -> agent.unparsed -- Integration tests: - - Start Pi RPC server, create session, send prompt, stream events. - - Validate `native_session_id` mapping and event ordering. -- Update HTTP/SSE test coverage to include Pi agent if relevant. - -## Risk Areas / Edge Cases -- `tool_execution_update.partialResult` is cumulative; must compute deltas. -- `message_update` may emit `done`/`error` without `message_end`; handle both paths. -- No native session lifecycle events; rely on daemon synthetic events. -- Session recovery after RPC server restart requires `get_state` + re-register sessions. - -## Acceptance Criteria -- Pi appears in `/v1/agents`, CLI list, and docs. -- `create_session` returns `native_session_id` from Pi `new_session`. 
-- Streaming prompt yields universal events with proper ordering: - - message -> item.started/delta/completed - - tool execution -> tool call + tool result -- Tests pass and no synthetic data is used in test fixtures. - -## Sources -- https://upd.dev/badlogic/pi-mono/src/commit/d36e0ea07303d8a76d51b4a7bd5f0d6d3c490860/packages/coding-agent/docs/rpc.md -- https://buildwithpi.ai/pi-cli -- https://takopi.dev/docs/pi-cli/ -- https://upd.dev/badlogic/pi-mono/releases diff --git a/docs/quickstart.mdx b/docs/quickstart.mdx index 5c299c3..223a54d 100644 --- a/docs/quickstart.mdx +++ b/docs/quickstart.mdx @@ -1,370 +1,289 @@ --- title: "Quickstart" -description: "Get a coding agent running in a sandbox in under a minute." +description: "Start the server and send your first message." icon: "rocket" --- - + - + ```bash - npm install sandbox-agent@0.3.x + npx skills add rivet-dev/skills -s sandbox-agent ``` - + ```bash - bun add sandbox-agent@0.3.x - # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). - bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 + bunx skills add rivet-dev/skills -s sandbox-agent ``` - - `SandboxAgent.start()` provisions a sandbox, starts a lightweight [Sandbox Agent server](/architecture) inside it, and connects your SDK client. + + Each coding agent requires API keys to connect to their respective LLM providers. - + ```bash - npm install sandbox-agent@0.3.x + export ANTHROPIC_API_KEY="sk-ant-..." + export OPENAI_API_KEY="sk-..." ``` - - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { local } from "sandbox-agent/local"; - - // Runs on your machine. Inherits process.env automatically. 
- const client = await SandboxAgent.start({ - sandbox: local(), - }); - ``` - - See [Local deploy guide](/deploy/local) - ```bash - npm install sandbox-agent@0.3.x @e2b/code-interpreter - ``` - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { e2b } from "sandbox-agent/e2b"; + import { Sandbox } from "@e2b/code-interpreter"; - // Provisions a cloud sandbox on E2B, installs the server, and connects. - const client = await SandboxAgent.start({ - sandbox: e2b(), - }); + const envs: Record<string, string> = {}; + if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; + if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY; + + const sandbox = await Sandbox.create({ envs }); - ``` - - See [E2B deploy guide](/deploy/e2b) - ```bash - npm install sandbox-agent@0.3.x @daytonaio/sdk - ``` - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { daytona } from "sandbox-agent/daytona"; + import { Daytona } from "@daytonaio/sdk"; - // Provisions a Daytona workspace with the server pre-installed. - const client = await SandboxAgent.start({ - sandbox: daytona(), + const envVars: Record<string, string> = {}; + if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY; + if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY; + + const daytona = new Daytona(); + const sandbox = await daytona.create({ + snapshot: "sandbox-agent-ready", + envVars, }); - ``` - - See [Daytona deploy guide](/deploy/daytona) - - - - ```bash - npm install sandbox-agent@0.3.x @vercel/sandbox - ``` - - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { vercel } from "sandbox-agent/vercel"; - - // Provisions a Vercel sandbox with the server installed on boot. 
- const client = await SandboxAgent.start({ - sandbox: vercel(), - }); - ``` - - See [Vercel deploy guide](/deploy/vercel) - - - - ```bash - npm install sandbox-agent@0.3.x modal - ``` - - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { modal } from "sandbox-agent/modal"; - - // Builds a container image with agents pre-installed (cached after first run), - // starts a Modal sandbox from that image, and connects. - const client = await SandboxAgent.start({ - sandbox: modal(), - }); - ``` - - See [Modal deploy guide](/deploy/modal) - - - - ```bash - npm install sandbox-agent@0.3.x @cloudflare/sandbox - ``` - - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { cloudflare } from "sandbox-agent/cloudflare"; - import { SandboxClient } from "@cloudflare/sandbox"; - - // Uses the Cloudflare Sandbox SDK to provision and connect. - // The Cloudflare SDK handles server lifecycle internally. - const cfSandboxClient = new SandboxClient(); - const client = await SandboxAgent.start({ - sandbox: cloudflare({ sdk: cfSandboxClient }), - }); - ``` - - See [Cloudflare deploy guide](/deploy/cloudflare) ```bash - npm install sandbox-agent@0.3.x dockerode get-port + docker run -p 2468:2468 \ + -e ANTHROPIC_API_KEY="sk-ant-..." \ + -e OPENAI_API_KEY="sk-..." \ + rivetdev/sandbox-agent:0.4.2-full \ + server --no-token --host 0.0.0.0 --port 2468 ``` - - ```typescript - import { SandboxAgent } from "sandbox-agent"; - import { docker } from "sandbox-agent/docker"; - - // Runs a Docker container locally. Good for testing. - const client = await SandboxAgent.start({ - sandbox: docker(), - }); - ``` - - See [Docker deploy guide](/deploy/docker) -
- - **More info:** - - - Agents need API keys for their LLM provider. Each provider passes credentials differently: - - ```typescript - // Local — inherits process.env automatically - - // E2B - e2b({ create: { envs: { ANTHROPIC_API_KEY: "..." } } }) - - // Daytona - daytona({ create: { envVars: { ANTHROPIC_API_KEY: "..." } } }) - - // Vercel - vercel({ create: { env: { ANTHROPIC_API_KEY: "..." } } }) - - // Modal - modal({ create: { secrets: { ANTHROPIC_API_KEY: "..." } } }) - - // Docker - docker({ env: ["ANTHROPIC_API_KEY=..."] }) - ``` - - For multi-tenant billing, per-user keys, and gateway options, see [LLM Credentials](/llm-credentials). + + Use `sandbox-agent credentials extract-env --export` to extract your existing API keys (Anthropic, OpenAI, etc.) from local Claude Code or Codex config files. - - - Implement the `SandboxProvider` interface to use any sandbox platform: - - ```typescript - import { SandboxAgent, type SandboxProvider } from "sandbox-agent"; - - const myProvider: SandboxProvider = { - name: "my-provider", - async create() { - // Provision a sandbox, install & start the server, return an ID - return "sandbox-123"; - }, - async destroy(sandboxId) { - // Tear down the sandbox - }, - async getUrl(sandboxId) { - // Return the Sandbox Agent server URL - return `https://${sandboxId}.my-platform.dev:3000`; - }, - }; - - const client = await SandboxAgent.start({ - sandbox: myProvider, - }); - ``` + + Use the `mock` agent for SDK and integration testing without provider credentials. 
- - - If you already have a Sandbox Agent server running, connect directly: - - ```typescript - const client = await SandboxAgent.connect({ - baseUrl: "http://127.0.0.1:2468", - }); - ``` - - - - - - ```bash - curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh - sandbox-agent server --no-token --host 0.0.0.0 --port 2468 - ``` - - - ```bash - npx @sandbox-agent/cli@0.3.x server --no-token --host 0.0.0.0 --port 2468 - ``` - - - ```bash - docker run -p 2468:2468 \ - -e ANTHROPIC_API_KEY="sk-ant-..." \ - -e OPENAI_API_KEY="sk-..." \ - rivetdev/sandbox-agent:0.4.0-full \ - server --no-token --host 0.0.0.0 --port 2468 - ``` - - + + For per-tenant token tracking, budget enforcement, or usage-based billing, see [LLM Credentials](/llm-credentials) for gateway options like OpenRouter, LiteLLM, and Portkey. - - + + + + Install and run the binary directly. - ```typescript Claude - const session = await client.createSession({ - agent: "claude", - }); + ```bash + curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh + sandbox-agent server --no-token --host 0.0.0.0 --port 2468 + ``` + - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + + Run without installing globally. - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." }, - ]); + ```bash + npx @sandbox-agent/cli@0.4.x server --no-token --host 0.0.0.0 --port 2468 + ``` + - console.log(result.stopReason); - ``` + + Run without installing globally. - ```typescript Codex - const session = await client.createSession({ - agent: "codex", - }); + ```bash + bunx @sandbox-agent/cli@0.4.x server --no-token --host 0.0.0.0 --port 2468 + ``` + - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + + Install globally, then run. - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." 
}, - ]); + ```bash + npm install -g @sandbox-agent/cli@0.4.x + sandbox-agent server --no-token --host 0.0.0.0 --port 2468 + ``` + - console.log(result.stopReason); - ``` + + Install globally, then run. - ```typescript OpenCode - const session = await client.createSession({ - agent: "opencode", - }); + ```bash + bun add -g @sandbox-agent/cli@0.4.x + # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). + bun pm -g trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 + sandbox-agent server --no-token --host 0.0.0.0 --port 2468 + ``` + - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + + For local development, use `SandboxAgent.start()` to spawn and manage the server as a subprocess. - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." }, - ]); + ```bash + npm install sandbox-agent@0.4.x + ``` - console.log(result.stopReason); - ``` + ```typescript + import { SandboxAgent } from "sandbox-agent"; - ```typescript Cursor - const session = await client.createSession({ - agent: "cursor", - }); + const sdk = await SandboxAgent.start(); + ``` + - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + + For local development, use `SandboxAgent.start()` to spawn and manage the server as a subprocess. - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." }, - ]); + ```bash + bun add sandbox-agent@0.4.x + # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). 
+ bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 + ``` - console.log(result.stopReason); - ``` + ```typescript + import { SandboxAgent } from "sandbox-agent"; - ```typescript Amp - const session = await client.createSession({ - agent: "amp", - }); + const sdk = await SandboxAgent.start(); + ``` + - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + + If you're running from source instead of the installed CLI. - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." }, - ]); + ```bash + cargo run -p sandbox-agent -- server --no-token --host 0.0.0.0 --port 2468 + ``` + + - console.log(result.stopReason); - ``` + Binding to `0.0.0.0` allows the server to accept connections from any network interface, which is required when running inside a sandbox where clients connect remotely. - ```typescript Pi - const session = await client.createSession({ - agent: "pi", - }); + + + Tokens are usually not required. Most sandbox providers (E2B, Daytona, etc.) already secure networking at the infrastructure layer. - session.onEvent((event) => { - console.log(event.sender, event.payload); - }); + If you expose the server publicly, use `--token "$SANDBOX_TOKEN"` to require authentication: - const result = await session.prompt([ - { type: "text", text: "Summarize the repository and suggest next steps." }, - ]); + ```bash + sandbox-agent server --token "$SANDBOX_TOKEN" --host 0.0.0.0 --port 2468 + ``` - console.log(result.stopReason); - ``` + Then pass the token when connecting: - + + + ```typescript + import { SandboxAgent } from "sandbox-agent"; - See [Agent Sessions](/agent-sessions) for the full sessions API. 
+ const sdk = await SandboxAgent.connect({ + baseUrl: "http://your-server:2468", + token: process.env.SANDBOX_TOKEN, + }); + ``` + + + + ```bash + curl "http://your-server:2468/v1/health" \ + -H "Authorization: Bearer $SANDBOX_TOKEN" + ``` + + + + ```bash + sandbox-agent --token "$SANDBOX_TOKEN" api agents list \ + --endpoint http://your-server:2468 + ``` + + + + + If you're calling the server from a browser, see the [CORS configuration guide](/cors). + + - - ```typescript - await client.destroySandbox(); // tears down the sandbox and disconnects + + To preinstall agents: + + ```bash + sandbox-agent install-agent --all ``` - Use `client.dispose()` instead to disconnect without destroying the sandbox (for reconnecting later). + If agents are not installed up front, they are lazily installed when creating a session. - - Open the Inspector at `/ui/` on your server (e.g. `http://localhost:2468/ui/`) to view sessions and events in a GUI. + + If you want to use `/v1/desktop/*`, install the desktop runtime packages first: + + ```bash + sandbox-agent install desktop --yes + ``` + + Then use `GET /v1/desktop/status` or `sdk.getDesktopStatus()` to verify the runtime is ready before calling desktop screenshot or input APIs. + + + + ```typescript + import { SandboxAgent } from "sandbox-agent"; + + const sdk = await SandboxAgent.connect({ + baseUrl: "http://127.0.0.1:2468", + }); + + const session = await sdk.createSession({ + agent: "claude", + sessionInit: { + cwd: "/", + mcpServers: [], + }, + }); + + console.log(session.id); + ``` + + + + ```typescript + const result = await session.prompt([ + { type: "text", text: "Summarize the repository and suggest next steps." 
}, + ]); + + console.log(result.stopReason); + ``` + + + + ```typescript + const off = session.onEvent((event) => { + console.log(event.sender, event.payload); + }); + + const page = await sdk.getEvents({ + sessionId: session.id, + limit: 50, + }); + + console.log(page.items.length); + off(); + ``` + + + + Open the Inspector UI at `/ui/` on your server (for example, `http://localhost:2468/ui/`) to inspect sessions and events in a GUI. Sandbox Agent Inspector @@ -372,44 +291,16 @@ icon: "rocket" -## Full example - -```typescript -import { SandboxAgent } from "sandbox-agent"; -import { e2b } from "sandbox-agent/e2b"; - -const client = await SandboxAgent.start({ - sandbox: e2b({ - create: { - envs: { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY }, - }, - }), -}); - -try { - const session = await client.createSession({ agent: "claude" }); - - session.onEvent((event) => { - console.log(`[${event.sender}]`, JSON.stringify(event.payload)); - }); - - const result = await session.prompt([ - { type: "text", text: "Write a function that checks if a number is prime." }, - ]); - - console.log("Done:", result.stopReason); -} finally { - await client.destroySandbox(); -} -``` - ## Next steps - - - Full TypeScript SDK API surface. + + + Configure in-memory, Rivet Actor state, IndexedDB, SQLite, and Postgres persistence. - Deploy to E2B, Daytona, Docker, Vercel, or Cloudflare. + Deploy your agent to E2B, Daytona, Docker, Vercel, or Cloudflare. + + + Use the latest TypeScript SDK API. 
diff --git a/docs/react-components.mdx b/docs/react-components.mdx index 93183b2..71a76d2 100644 --- a/docs/react-components.mdx +++ b/docs/react-components.mdx @@ -17,7 +17,7 @@ Current exports: ## Install ```bash -npm install @sandbox-agent/react@0.3.x +npm install @sandbox-agent/react@0.4.x ``` ## Full example diff --git a/docs/sdk-overview.mdx b/docs/sdk-overview.mdx index a0f9b84..73e0d35 100644 --- a/docs/sdk-overview.mdx +++ b/docs/sdk-overview.mdx @@ -11,12 +11,12 @@ The TypeScript SDK is centered on `sandbox-agent` and its `SandboxAgent` class. ```bash - npm install sandbox-agent@0.3.x + npm install sandbox-agent@0.4.x ``` ```bash - bun add sandbox-agent@0.3.x + bun add sandbox-agent@0.4.x # Allow Bun to run postinstall scripts for native binaries (required for SandboxAgent.start()). bun pm trust @sandbox-agent/cli-linux-x64 @sandbox-agent/cli-linux-arm64 @sandbox-agent/cli-darwin-arm64 @sandbox-agent/cli-darwin-x64 @sandbox-agent/cli-win32-x64 ``` @@ -26,7 +26,7 @@ The TypeScript SDK is centered on `sandbox-agent` and its `SandboxAgent` class. ## Optional React components ```bash -npm install @sandbox-agent/react@0.3.x +npm install @sandbox-agent/react@0.4.x ``` ## Create a client @@ -87,7 +87,7 @@ const sdk = await SandboxAgent.start({ // sdk.sandboxId — prefixed provider ID (e.g. "local/127.0.0.1:2468") -await sdk.destroySandbox(); // tears down sandbox + disposes client +await sdk.destroySandbox(); // provider-defined cleanup + disposes client ``` `SandboxAgent.start(...)` requires a `sandbox` provider. Built-in providers: @@ -101,7 +101,7 @@ await sdk.destroySandbox(); // tears down sandbox + disposes client | `sandbox-agent/vercel` | Vercel Sandbox | | `sandbox-agent/cloudflare` | Cloudflare Sandbox | -Use `sdk.dispose()` to disconnect without destroying the sandbox, or `sdk.destroySandbox()` to tear down both. 
+Use `sdk.dispose()` to disconnect without changing sandbox state, `sdk.pauseSandbox()` for graceful suspension when supported, or `sdk.killSandbox()` for permanent deletion. ## Session flow @@ -196,6 +196,44 @@ const writeResult = await sdk.writeFsFile({ path: "./hello.txt" }, "hello"); console.log(health.status, agents.agents.length, entries.length, writeResult.path); ``` +## Desktop API + +The SDK also wraps the desktop host/runtime HTTP API. + +Install desktop dependencies first on Linux hosts: + +```bash +sandbox-agent install desktop --yes +``` + +Then query status, surface remediation if needed, and start the runtime: + +```ts +const status = await sdk.getDesktopStatus(); + +if (status.state === "install_required") { + console.log(status.installCommand); +} + +const started = await sdk.startDesktop({ + width: 1440, + height: 900, + dpi: 96, +}); + +const screenshot = await sdk.takeDesktopScreenshot(); +const displayInfo = await sdk.getDesktopDisplayInfo(); + +await sdk.moveDesktopMouse({ x: 400, y: 300 }); +await sdk.clickDesktop({ x: 400, y: 300, button: "left", clickCount: 1 }); +await sdk.typeDesktopText({ text: "hello world", delayMs: 10 }); +await sdk.pressDesktopKey({ key: "ctrl+l" }); + +await sdk.stopDesktop(); +``` + +Screenshot helpers return `Uint8Array` PNG bytes. The SDK does not attempt to install OS packages remotely; callers should surface `missingDependencies` and `installCommand` from `getDesktopStatus()`. + ## Error handling ```ts diff --git a/docs/session-transcript-schema.mdx b/docs/session-transcript-schema.mdx deleted file mode 100644 index c9c004a..0000000 --- a/docs/session-transcript-schema.mdx +++ /dev/null @@ -1,388 +0,0 @@ ---- -title: "Session Transcript Schema" -description: "Universal event schema for session transcripts across all agents." ---- - -Each coding agent outputs events in its own native format. 
The sandbox-agent converts these into a universal event schema, giving you a consistent session transcript regardless of which agent you use. - -The schema is defined in [OpenAPI format](https://github.com/rivet-dev/sandbox-agent/blob/main/docs/openapi.json). See the [HTTP API Reference](/api-reference) for endpoint documentation. - -## Coverage Matrix - -This table shows which agent feature coverage appears in the universal event stream. All agents retain their full native feature coverage—this only reflects what's normalized into the schema. - -| Feature | Claude | Codex | OpenCode | Amp | Pi (RPC) | -|--------------------|:------:|:-----:|:------------:|:------------:|:------------:| -| Stability | Stable | Stable| Experimental | Experimental | Experimental | -| Text Messages | ✓ | ✓ | ✓ | ✓ | ✓ | -| Tool Calls | ✓ | ✓ | ✓ | ✓ | ✓ | -| Tool Results | ✓ | ✓ | ✓ | ✓ | ✓ | -| Questions (HITL) | ✓ | | ✓ | | | -| Permissions (HITL) | ✓ | ✓ | ✓ | - | | -| Images | - | ✓ | ✓ | - | ✓ | -| File Attachments | - | ✓ | ✓ | - | | -| Session Lifecycle | - | ✓ | ✓ | - | | -| Error Events | - | ✓ | ✓ | ✓ | ✓ | -| Reasoning/Thinking | - | ✓ | - | - | ✓ | -| Command Execution | - | ✓ | - | - | | -| File Changes | - | ✓ | - | - | | -| MCP Tools | ✓ | ✓ | ✓ | ✓ | | -| Streaming Deltas | ✓ | ✓ | ✓ | - | ✓ | -| Variants | | ✓ | ✓ | ✓ | ✓ | - -Agents: [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview) · [Codex](https://github.com/openai/codex) · [OpenCode](https://github.com/opencode-ai/opencode) · [Amp](https://ampcode.com) · [Pi](https://buildwithpi.ai/pi-cli) - -- ✓ = Appears in session events -- \- = Agent supports natively, schema conversion coming soon -- (blank) = Not supported by agent -- Pi runtime model is router-managed per-session RPC (`pi --mode rpc`); it does not use generic subprocess streaming. - - - - Basic message exchange between user and assistant. - - - Visibility into tool invocations (file reads, command execution, etc.) 
and their results. When not natively supported, tool activity is embedded in message content. - - - Interactive questions the agent asks the user. Emits `question.requested` and `question.resolved` events. - - - Permission requests for sensitive operations. Emits `permission.requested` and `permission.resolved` events. - - - Support for image attachments in messages. - - - Support for file attachments in messages. - - - Native `session.started` and `session.ended` events. When not supported, the daemon emits synthetic lifecycle events. - - - Structured error events for runtime failures. - - - Extended thinking or reasoning content with visibility controls. - - - Detailed command execution events with stdout/stderr. - - - Structured file modification events with diffs. - - - Model Context Protocol tool support. - - - Native streaming of content deltas. When not supported, the daemon emits a single synthetic delta before `item.completed`. - - - Model variants such as reasoning effort or depth. Agents may expose different variant sets per model. - - - -Want support for another agent? [Open an issue](https://github.com/rivet-dev/sandbox-agent/issues/new) to request it. - -## UniversalEvent - -Every event from the API is wrapped in a `UniversalEvent` envelope. - -| Field | Type | Description | -|-------|------|-------------| -| `event_id` | string | Unique identifier for this event | -| `sequence` | integer | Monotonic sequence number within the session (starts at 1) | -| `time` | string | RFC3339 timestamp | -| `session_id` | string | Daemon-generated session identifier | -| `native_session_id` | string? 
| Provider-native session/thread identifier (e.g., Codex `threadId`, OpenCode `sessionID`) | -| `source` | string | Event origin: `agent` (native) or `daemon` (synthetic) | -| `synthetic` | boolean | Whether this event was generated by the daemon to fill gaps | -| `type` | string | Event type (see [Event Types](#event-types)) | -| `data` | object | Event-specific payload | -| `raw` | any? | Original provider payload (only when `include_raw=true`) | - -```json -{ - "event_id": "evt_abc123", - "sequence": 1, - "time": "2025-01-28T12:00:00Z", - "session_id": "my-session", - "native_session_id": "thread_xyz", - "source": "agent", - "synthetic": false, - "type": "item.completed", - "data": { ... } -} -``` - -## Event Types - -### Session Lifecycle - -| Type | Description | Data | -|------|-------------|------| -| `session.started` | Session has started | `{ metadata?: any }` | -| `session.ended` | Session has ended | `{ reason, terminated_by, message?, exit_code? }` | - -### Turn Lifecycle - -| Type | Description | Data | -|------|-------------|------| -| `turn.started` | Turn has started | `{ phase: "started", turn_id?, metadata? }` | -| `turn.ended` | Turn has ended | `{ phase: "ended", turn_id?, metadata? }` | - -**SessionEndedData** - -| Field | Type | Values | -|-------|------|--------| -| `reason` | string | `completed`, `error`, `terminated` | -| `terminated_by` | string | `agent`, `daemon` | -| `message` | string? | Error message (only present when reason is `error`) | -| `exit_code` | int? | Process exit code (only present when reason is `error`) | -| `stderr` | StderrOutput? | Structured stderr output (only present when reason is `error`) | - -**StderrOutput** - -| Field | Type | Description | -|-------|------|-------------| -| `head` | string? | First 20 lines of stderr (if truncated) or full stderr (if not truncated) | -| `tail` | string? 
| Last 50 lines of stderr (only present if truncated) | -| `truncated` | boolean | Whether the output was truncated | -| `total_lines` | int? | Total number of lines in stderr | - -### Item Lifecycle - -| Type | Description | Data | -|------|-------------|------| -| `item.started` | Item creation | `{ item }` | -| `item.delta` | Streaming content delta | `{ item_id, native_item_id?, delta }` | -| `item.completed` | Item finalized | `{ item }` | - -Items follow a consistent lifecycle: `item.started` → `item.delta` (0 or more) → `item.completed`. - -### HITL (Human-in-the-Loop) - -| Type | Description | Data | -|------|-------------|------| -| `permission.requested` | Permission request pending | `{ permission_id, action, status, metadata? }` | -| `permission.resolved` | Permission decision recorded | `{ permission_id, action, status, metadata? }` | -| `question.requested` | Question pending user input | `{ question_id, prompt, options, status }` | -| `question.resolved` | Question answered or rejected | `{ question_id, prompt, options, status, response? }` | - -**PermissionEventData** - -| Field | Type | Description | -|-------|------|-------------| -| `permission_id` | string | Identifier for the permission request | -| `action` | string | What the agent wants to do | -| `status` | string | `requested`, `accept`, `accept_for_session`, `reject` | -| `metadata` | any? | Additional context | - -**QuestionEventData** - -| Field | Type | Description | -|-------|------|-------------| -| `question_id` | string | Identifier for the question | -| `prompt` | string | Question text | -| `options` | string[] | Available answer options | -| `status` | string | `requested`, `answered`, `rejected` | -| `response` | string? | Selected answer (when resolved) | - -### Errors - -| Type | Description | Data | -|------|-------------|------| -| `error` | Runtime error | `{ message, code?, details? }` | -| `agent.unparsed` | Parse failure | `{ error, location, raw_hash? 
}` | - -The `agent.unparsed` event indicates the daemon failed to parse an agent payload. This should be treated as a bug. - -## UniversalItem - -Items represent discrete units of content within a session. - -| Field | Type | Description | -|-------|------|-------------| -| `item_id` | string | Daemon-generated identifier | -| `native_item_id` | string? | Provider-native item/message identifier | -| `parent_id` | string? | Parent item ID (e.g., tool call/result parented to a message) | -| `kind` | string | Item category (see below) | -| `role` | string? | Actor role for message items | -| `status` | string | Lifecycle status | -| `content` | ContentPart[] | Ordered list of content parts | - -### ItemKind - -| Value | Description | -|-------|-------------| -| `message` | User or assistant message | -| `tool_call` | Tool invocation | -| `tool_result` | Tool execution result | -| `system` | System message | -| `status` | Status update | -| `unknown` | Unrecognized item type | - -### ItemRole - -| Value | Description | -|-------|-------------| -| `user` | User message | -| `assistant` | Assistant response | -| `system` | System prompt | -| `tool` | Tool-related message | - -### ItemStatus - -| Value | Description | -|-------|-------------| -| `in_progress` | Item is streaming or pending | -| `completed` | Item is finalized | -| `failed` | Item execution failed | - -## Content Parts - -The `content` array contains typed parts that make up an item's payload. - -### text - -Plain text content. - -```json -{ "type": "text", "text": "Hello, world!" } -``` - -### json - -Structured JSON content. - -```json -{ "type": "json", "json": { "key": "value" } } -``` - -### tool_call - -Tool invocation. 
- -| Field | Type | Description | -|-------|------|-------------| -| `name` | string | Tool name | -| `arguments` | string | JSON-encoded arguments | -| `call_id` | string | Unique call identifier | - -```json -{ - "type": "tool_call", - "name": "read_file", - "arguments": "{\"path\": \"/src/main.ts\"}", - "call_id": "call_abc123" -} -``` - -### tool_result - -Tool execution result. - -| Field | Type | Description | -|-------|------|-------------| -| `call_id` | string | Matching call identifier | -| `output` | string | Tool output | - -```json -{ - "type": "tool_result", - "call_id": "call_abc123", - "output": "File contents here..." -} -``` - -### file_ref - -File reference with optional diff. - -| Field | Type | Description | -|-------|------|-------------| -| `path` | string | File path | -| `action` | string | `read`, `write`, `patch` | -| `diff` | string? | Unified diff (for patches) | - -```json -{ - "type": "file_ref", - "path": "/src/main.ts", - "action": "write", - "diff": "@@ -1,3 +1,4 @@\n+import { foo } from 'bar';" -} -``` - -### image - -Image reference. - -| Field | Type | Description | -|-------|------|-------------| -| `path` | string | Image file path | -| `mime` | string? | MIME type | - -```json -{ "type": "image", "path": "/tmp/screenshot.png", "mime": "image/png" } -``` - -### reasoning - -Model reasoning/thinking content. - -| Field | Type | Description | -|-------|------|-------------| -| `text` | string | Reasoning text | -| `visibility` | string | `public` or `private` | - -```json -{ "type": "reasoning", "text": "Let me think about this...", "visibility": "public" } -``` - -### status - -Status indicator. - -| Field | Type | Description | -|-------|------|-------------| -| `label` | string | Status label | -| `detail` | string? 
| Additional detail | - -```json -{ "type": "status", "label": "Running tests", "detail": "3 of 10 passed" } -``` - -## Source & Synthetics - -### EventSource - -The `source` field indicates who emitted the event: - -| Value | Description | -|-------|-------------| -| `agent` | Native event from the agent | -| `daemon` | Synthetic event generated by the daemon | - -### Synthetic Events - -The daemon emits synthetic events (`synthetic: true`, `source: "daemon"`) to provide a consistent event stream across all agents. Common synthetics: - -| Synthetic | When | -|-----------|------| -| `session.started` | Agent doesn't emit explicit session start | -| `session.ended` | Agent doesn't emit explicit session end | -| `turn.started` | Agent doesn't emit explicit turn start | -| `turn.ended` | Agent doesn't emit explicit turn end | -| `item.started` | Agent doesn't emit item start events | -| `item.delta` | Agent doesn't stream deltas natively | -| `question.*` | Claude Code plan mode (from ExitPlanMode tool) | - -### Raw Payloads - -Pass `include_raw=true` to event endpoints to receive the original agent payload in the `raw` field. Useful for debugging or accessing agent-specific data not in the universal schema. 
- -```typescript -const events = await client.getEvents("my-session", { includeRaw: true }); -// events[0].raw contains the original agent payload -``` diff --git a/docs/theme.css b/docs/theme.css index daeb719..4286d2c 100644 --- a/docs/theme.css +++ b/docs/theme.css @@ -20,7 +20,6 @@ body { color: var(--sa-text); } -/* a { color: var(--sa-primary); } @@ -41,6 +40,13 @@ select { color: var(--sa-text); } +code, +pre { + background-color: var(--sa-card); + border: 1px solid var(--sa-border); + color: var(--sa-text); +} + .card, .mintlify-card, .docs-card { @@ -64,4 +70,3 @@ select { .alert-danger { border-color: var(--sa-danger); } -*/ diff --git a/docs/troubleshooting.mdx b/docs/troubleshooting.mdx index 838cc28..18186d6 100644 --- a/docs/troubleshooting.mdx +++ b/docs/troubleshooting.mdx @@ -29,25 +29,6 @@ Verify the agent is installed: ls -la ~/.local/share/sandbox-agent/bin/ ``` -### 4. Binary libc mismatch (musl vs glibc) - -Claude Code binaries are available in both musl and glibc variants. If you see errors like: - -``` -cannot execute: required file not found -Error loading shared library libstdc++.so.6: No such file or directory -``` - -This means the wrong binary variant was downloaded. - -**For sandbox-agent 0.2.0+**: Platform detection is automatic. The correct binary (musl or glibc) is downloaded based on the runtime environment. 
- -**For sandbox-agent 0.1.x**: Use Alpine Linux which has native musl support: - -```dockerfile -FROM alpine:latest -RUN apk add --no-cache curl ca-certificates libstdc++ libgcc bash -``` ## Daytona Network Restrictions diff --git a/examples/boxlite-python/Dockerfile b/examples/boxlite-python/Dockerfile index 3630511..8aba774 100644 --- a/examples/boxlite-python/Dockerfile +++ b/examples/boxlite-python/Dockerfile @@ -1,5 +1,5 @@ FROM node:22-bookworm-slim RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/* -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh RUN sandbox-agent install-agent claude RUN sandbox-agent install-agent codex diff --git a/examples/boxlite/Dockerfile b/examples/boxlite/Dockerfile index 3630511..8aba774 100644 --- a/examples/boxlite/Dockerfile +++ b/examples/boxlite/Dockerfile @@ -1,5 +1,5 @@ FROM node:22-bookworm-slim RUN apt-get update && apt-get install -y curl ca-certificates && rm -rf /var/lib/apt/lists/* -RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh +RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh RUN sandbox-agent install-agent claude RUN sandbox-agent install-agent codex diff --git a/examples/boxlite/tsconfig.json b/examples/boxlite/tsconfig.json index 96ba2fd..ad591c3 100644 --- a/examples/boxlite/tsconfig.json +++ b/examples/boxlite/tsconfig.json @@ -9,7 +9,8 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true + "resolveJsonModule": true, + "types": ["node"] }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/cloudflare/Dockerfile b/examples/cloudflare/Dockerfile index d0796cb..738f8a2 100644 --- a/examples/cloudflare/Dockerfile +++ b/examples/cloudflare/Dockerfile @@ -1,7 +1,7 @@ FROM cloudflare/sandbox:0.7.0 # Install sandbox-agent -RUN curl 
-fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh
+RUN curl -fsSL https://releases.rivet.dev/sandbox-agent/0.4.x/install.sh | sh
 
 # Pre-install agents
 RUN sandbox-agent install-agent claude && \
diff --git a/examples/computesdk/tsconfig.json b/examples/computesdk/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/computesdk/tsconfig.json
+++ b/examples/computesdk/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/daytona/src/daytona.ts b/examples/daytona/src/daytona.ts
new file mode 100644
index 0000000..ccffc94
--- /dev/null
+++ b/examples/daytona/src/daytona.ts
@@ -0,0 +1,33 @@
+import { SandboxAgent } from "sandbox-agent";
+import { daytona } from "sandbox-agent/daytona";
+
+function collectEnvVars(): Record<string, string> {
+  const envVars: Record<string, string> = {};
+  if (process.env.ANTHROPIC_API_KEY) envVars.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+  if (process.env.OPENAI_API_KEY) envVars.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+  return envVars;
+}
+
+function inspectorUrlToBaseUrl(inspectorUrl: string): string {
+  return inspectorUrl.replace(/\/ui\/$/, "");
+}
+
+export async function setupDaytonaSandboxAgent(): Promise<{
+  baseUrl: string;
+  token?: string;
+  extraHeaders?: Record<string, string>;
+  cleanup: () => Promise<void>;
+}> {
+  const client = await SandboxAgent.start({
+    sandbox: daytona({
+      create: { envVars: collectEnvVars() },
+    }),
+  });
+
+  return {
+    baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl),
+    cleanup: async () => {
+      await client.killSandbox();
+    },
+  };
+}
diff --git a/examples/daytona/src/index.ts b/examples/daytona/src/index.ts
index b881113..9c4cf85 100644
--- a/examples/daytona/src/index.ts
+++ b/examples/daytona/src/index.ts
@@ -16,7 +16,6 @@
 console.log(`UI: ${client.inspectorUrl}`);
 
 const session = await
client.createSession({
   agent: detectAgent(),
-  cwd: "/home/daytona",
 });
 
 session.onEvent((event) => {
diff --git a/examples/daytona/tsconfig.json b/examples/daytona/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/daytona/tsconfig.json
+++ b/examples/daytona/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/docker/tsconfig.json b/examples/docker/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/docker/tsconfig.json
+++ b/examples/docker/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/e2b/src/e2b.ts b/examples/e2b/src/e2b.ts
new file mode 100644
index 0000000..17762a2
--- /dev/null
+++ b/examples/e2b/src/e2b.ts
@@ -0,0 +1,34 @@
+import { SandboxAgent } from "sandbox-agent";
+import { e2b } from "sandbox-agent/e2b";
+
+function collectEnvVars(): Record<string, string> {
+  const envs: Record<string, string> = {};
+  if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+  if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+  return envs;
+}
+
+function inspectorUrlToBaseUrl(inspectorUrl: string): string {
+  return inspectorUrl.replace(/\/ui\/$/, "");
+}
+
+export async function setupE2BSandboxAgent(): Promise<{
+  baseUrl: string;
+  token?: string;
+  cleanup: () => Promise<void>;
+}> {
+  const template = process.env.E2B_TEMPLATE;
+  const client = await SandboxAgent.start({
+    sandbox: e2b({
+      template,
+      create: { envs: collectEnvVars() },
+    }),
+  });
+
+  return {
+    baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl),
+    cleanup: async () => {
+      await client.killSandbox();
+    },
+  };
+}
diff --git
a/examples/e2b/src/index.ts b/examples/e2b/src/index.ts
index c20ebaa..67b74dc 100644
--- a/examples/e2b/src/index.ts
+++ b/examples/e2b/src/index.ts
@@ -5,15 +5,15 @@
 import { detectAgent } from "@sandbox-agent/example-shared";
 
 const envs: Record<string, string> = {};
 if (process.env.ANTHROPIC_API_KEY) envs.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
 if (process.env.OPENAI_API_KEY) envs.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+const template = process.env.E2B_TEMPLATE;
 
 const client = await SandboxAgent.start({
   // ✨ NEW ✨
-  sandbox: e2b({ create: { envs } }),
+  sandbox: e2b({ template, create: { envs } }),
 });
 
 const session = await client.createSession({
   agent: detectAgent(),
-  cwd: "/home/user",
 });
 
 session.onEvent((event) => {
diff --git a/examples/e2b/tsconfig.json b/examples/e2b/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/e2b/tsconfig.json
+++ b/examples/e2b/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/file-system/tsconfig.json b/examples/file-system/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/file-system/tsconfig.json
+++ b/examples/file-system/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/mcp-custom-tool/tsconfig.json b/examples/mcp-custom-tool/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/mcp-custom-tool/tsconfig.json
+++ b/examples/mcp-custom-tool/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff
--git a/examples/mcp/tsconfig.json b/examples/mcp/tsconfig.json index 96ba2fd..ad591c3 100644 --- a/examples/mcp/tsconfig.json +++ b/examples/mcp/tsconfig.json @@ -9,7 +9,8 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true + "resolveJsonModule": true, + "types": ["node"] }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/modal/tsconfig.json b/examples/modal/tsconfig.json index 96ba2fd..ad591c3 100644 --- a/examples/modal/tsconfig.json +++ b/examples/modal/tsconfig.json @@ -9,7 +9,8 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true + "resolveJsonModule": true, + "types": ["node"] }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/examples/permissions/tsconfig.json b/examples/permissions/tsconfig.json index 9c9fe06..4eec283 100644 --- a/examples/permissions/tsconfig.json +++ b/examples/permissions/tsconfig.json @@ -1,7 +1,8 @@ { "compilerOptions": { "target": "ES2022", - "lib": ["ES2022"], + "lib": ["ES2022", "DOM"], + "types": ["node"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, diff --git a/examples/persist-memory/tsconfig.json b/examples/persist-memory/tsconfig.json index d1c0065..ec2723c 100644 --- a/examples/persist-memory/tsconfig.json +++ b/examples/persist-memory/tsconfig.json @@ -1,13 +1,15 @@ { "compilerOptions": { "target": "ES2022", + "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true + "skipLibCheck": true, + "types": ["node"] }, "include": ["src"] } diff --git a/examples/persist-postgres/tsconfig.json b/examples/persist-postgres/tsconfig.json index d1c0065..ec2723c 100644 --- a/examples/persist-postgres/tsconfig.json +++ b/examples/persist-postgres/tsconfig.json @@ -1,13 +1,15 @@ { "compilerOptions": { 
"target": "ES2022", + "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true + "skipLibCheck": true, + "types": ["node"] }, "include": ["src"] } diff --git a/examples/persist-sqlite/tsconfig.json b/examples/persist-sqlite/tsconfig.json index d1c0065..ec2723c 100644 --- a/examples/persist-sqlite/tsconfig.json +++ b/examples/persist-sqlite/tsconfig.json @@ -1,13 +1,15 @@ { "compilerOptions": { "target": "ES2022", + "lib": ["ES2022", "DOM"], "module": "ESNext", "moduleResolution": "Bundler", "allowImportingTsExtensions": true, "noEmit": true, "esModuleInterop": true, "strict": true, - "skipLibCheck": true + "skipLibCheck": true, + "types": ["node"] }, "include": ["src"] } diff --git a/examples/shared/src/docker.ts b/examples/shared/src/docker.ts index 6a8f40a..f4161fb 100644 --- a/examples/shared/src/docker.ts +++ b/examples/shared/src/docker.ts @@ -9,7 +9,7 @@ const __dirname = path.dirname(fileURLToPath(import.meta.url)); const REPO_ROOT = path.resolve(__dirname, "..", "..", ".."); /** Pre-built Docker image with all agents installed. */ -export const FULL_IMAGE = "rivetdev/sandbox-agent:0.4.0-full"; +export const FULL_IMAGE = "rivetdev/sandbox-agent:0.4.2-full"; export interface DockerSandboxOptions { /** Container port used by sandbox-agent inside Docker. 
*/
diff --git a/examples/skills-custom-tool/tsconfig.json b/examples/skills-custom-tool/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/skills-custom-tool/tsconfig.json
+++ b/examples/skills-custom-tool/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/skills/tsconfig.json b/examples/skills/tsconfig.json
index 96ba2fd..ad591c3 100644
--- a/examples/skills/tsconfig.json
+++ b/examples/skills/tsconfig.json
@@ -9,7 +9,8 @@
     "esModuleInterop": true,
     "strict": true,
     "skipLibCheck": true,
-    "resolveJsonModule": true
+    "resolveJsonModule": true,
+    "types": ["node"]
   },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "**/*.test.ts"]
diff --git a/examples/sprites/package.json b/examples/sprites/package.json
new file mode 100644
index 0000000..df808e8
--- /dev/null
+++ b/examples/sprites/package.json
@@ -0,0 +1,20 @@
+{
+  "name": "@sandbox-agent/example-sprites",
+  "private": true,
+  "type": "module",
+  "scripts": {
+    "start": "tsx src/index.ts",
+    "typecheck": "tsc --noEmit"
+  },
+  "dependencies": {
+    "@fly/sprites": "latest",
+    "@sandbox-agent/example-shared": "workspace:*",
+    "sandbox-agent": "workspace:*"
+  },
+  "devDependencies": {
+    "@types/node": "latest",
+    "tsx": "latest",
+    "typescript": "latest",
+    "vitest": "^3.0.0"
+  }
+}
diff --git a/examples/sprites/src/index.ts b/examples/sprites/src/index.ts
new file mode 100644
index 0000000..bf95e5d
--- /dev/null
+++ b/examples/sprites/src/index.ts
@@ -0,0 +1,21 @@
+import { SandboxAgent } from "sandbox-agent";
+import { sprites } from "sandbox-agent/sprites";
+
+const env: Record<string, string> = {};
+if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+
+const client = await
SandboxAgent.start({
+  sandbox: sprites({
+    token: process.env.SPRITES_API_KEY ?? process.env.SPRITE_TOKEN ?? process.env.SPRITES_TOKEN,
+    env,
+  }),
+});
+
+console.log(`UI: ${client.inspectorUrl}`);
+console.log(await client.getHealth());
+
+process.once("SIGINT", async () => {
+  await client.destroySandbox();
+  process.exit(0);
+});
diff --git a/examples/sprites/tests/sprites.test.ts b/examples/sprites/tests/sprites.test.ts
new file mode 100644
index 0000000..dfd1594
--- /dev/null
+++ b/examples/sprites/tests/sprites.test.ts
@@ -0,0 +1,34 @@
+import { describe, it, expect } from "vitest";
+import { SandboxAgent } from "sandbox-agent";
+import { sprites } from "sandbox-agent/sprites";
+
+const shouldRun = Boolean(process.env.SPRITES_API_KEY || process.env.SPRITE_TOKEN || process.env.SPRITES_TOKEN);
+const timeoutMs = Number.parseInt(process.env.SANDBOX_TEST_TIMEOUT_MS || "", 10) || 300_000;
+
+const testFn = shouldRun ? it : it.skip;
+
+describe("sprites provider", () => {
+  testFn(
+    "starts sandbox-agent and responds to /v1/health",
+    async () => {
+      const env: Record<string, string> = {};
+      if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+      if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+
+      const sdk = await SandboxAgent.start({
+        sandbox: sprites({
+          token: process.env.SPRITES_API_KEY ?? process.env.SPRITE_TOKEN ??
process.env.SPRITES_TOKEN,
+          env,
+        }),
+      });
+
+      try {
+        const health = await sdk.getHealth();
+        expect(health.status).toBe("ok");
+      } finally {
+        await sdk.destroySandbox();
+      }
+    },
+    timeoutMs,
+  );
+});
diff --git a/examples/sprites/tsconfig.json b/examples/sprites/tsconfig.json
new file mode 100644
index 0000000..ad591c3
--- /dev/null
+++ b/examples/sprites/tsconfig.json
@@ -0,0 +1,17 @@
+{
+  "compilerOptions": {
+    "target": "ES2022",
+    "lib": ["ES2022", "DOM"],
+    "module": "ESNext",
+    "moduleResolution": "Bundler",
+    "allowImportingTsExtensions": true,
+    "noEmit": true,
+    "esModuleInterop": true,
+    "strict": true,
+    "skipLibCheck": true,
+    "resolveJsonModule": true,
+    "types": ["node"]
+  },
+  "include": ["src/**/*"],
+  "exclude": ["node_modules", "**/*.test.ts"]
+}
diff --git a/examples/vercel/src/index.ts b/examples/vercel/src/index.ts
index 9839893..5a83e0c 100644
--- a/examples/vercel/src/index.ts
+++ b/examples/vercel/src/index.ts
@@ -19,7 +19,6 @@
 console.log(`UI: ${client.inspectorUrl}`);
 
 const session = await client.createSession({
   agent: detectAgent(),
-  cwd: "/home/vercel-sandbox",
 });
 
 session.onEvent((event) => {
diff --git a/examples/vercel/src/vercel.ts b/examples/vercel/src/vercel.ts
new file mode 100644
index 0000000..742cd5a
--- /dev/null
+++ b/examples/vercel/src/vercel.ts
@@ -0,0 +1,35 @@
+import { SandboxAgent } from "sandbox-agent";
+import { vercel } from "sandbox-agent/vercel";
+
+function collectEnvVars(): Record<string, string> {
+  const env: Record<string, string> = {};
+  if (process.env.ANTHROPIC_API_KEY) env.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;
+  if (process.env.OPENAI_API_KEY) env.OPENAI_API_KEY = process.env.OPENAI_API_KEY;
+  return env;
+}
+
+function inspectorUrlToBaseUrl(inspectorUrl: string): string {
+  return inspectorUrl.replace(/\/ui\/$/, "");
+}
+
+export async function setupVercelSandboxAgent(): Promise<{
+  baseUrl: string;
+  token?: string;
+  cleanup: () => Promise<void>;
+}> {
+  const client = await SandboxAgent.start({
+    sandbox: vercel({
+
create: { + runtime: "node24", + env: collectEnvVars(), + }, + }), + }); + + return { + baseUrl: inspectorUrlToBaseUrl(client.inspectorUrl), + cleanup: async () => { + await client.killSandbox(); + }, + }; +} diff --git a/examples/vercel/tsconfig.json b/examples/vercel/tsconfig.json index 96ba2fd..ad591c3 100644 --- a/examples/vercel/tsconfig.json +++ b/examples/vercel/tsconfig.json @@ -9,7 +9,8 @@ "esModuleInterop": true, "strict": true, "skipLibCheck": true, - "resolveJsonModule": true + "resolveJsonModule": true, + "types": ["node"] }, "include": ["src/**/*"], "exclude": ["node_modules", "**/*.test.ts"] diff --git a/foundry/AGENT-HANDOFF.md b/foundry/AGENT-HANDOFF.md new file mode 100644 index 0000000..20bade7 --- /dev/null +++ b/foundry/AGENT-HANDOFF.md @@ -0,0 +1,179 @@ +# Foundry Agent Handoff + +## Baseline + +- Repo: `rivet-dev/sandbox-agent` +- Branch: `columbus-v2` +- Last pushed commit: `3174fe73` (`feat(foundry): checkpoint actor and workspace refactor`) +- Progress/spec tracker: [FOUNDRY-CHANGES.md](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/FOUNDRY-CHANGES.md) + +## What is already landed + +These spec slices are already implemented and pushed: + +- Item `1`: backend actor rename `auth-user` -> `user` +- Item `2`: Better Auth mapping comments +- Item `5`: task raw SQL cleanup into migrations +- Item `6`: `history` -> `audit-log` +- Item `7`: default model moved to user-scoped app state +- Item `20`: admin action prefixing +- Item `23`: dead `getTaskEnriched` / `enrichTaskRecord` removal +- Item `25`: `Workbench` -> `Workspace` rename across backend/shared/client/frontend +- Item `26`: branch rename deleted +- Organization realtime was already collapsed to full-snapshot `organizationUpdated` +- Task realtime was already aligned to `taskUpdated` + +## Known blocker + +Spec item `3` is only partially done. The singleton constraint for the Better Auth `user` table is still blocked. 
+ +- File: [foundry/packages/backend/src/actors/user/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/db/schema.ts) +- Reason: Better Auth still depends on external string `user.id`, so a literal singleton `CHECK (id = 1)` on that table is not a safe mechanical change. + +## Important current state + +There are uncommitted edits on top of the pushed checkpoint. Another agent should start from the current worktree, not just `origin/columbus-v2`. + +Current dirty files: + +- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) +- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) +- [foundry/packages/backend/src/actors/repository/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/actions.ts) +- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) +- [foundry/packages/client/src/mock/backend-client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/mock/backend-client.ts) + +These files are the current hot path for the unfinished structural work. 
+ +## What is partially in place but not finished + +### User-owned task UI state + +The user actor already has the schema and CRUD surface for per-user task/session UI state: + +- [foundry/packages/backend/src/actors/user/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/db/schema.ts) + `user_task_state` +- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) + `getTaskState`, `upsertTaskState`, `deleteTaskState` + +But the task actor and UI are still reading/writing the old task-global fields: + +- [foundry/packages/backend/src/actors/task/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/db/schema.ts) + still contains `task_runtime.active_session_id` and session `unread` / `draft_*` +- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) + still derives unread/draft/active-session from task-local rows +- [foundry/packages/frontend/src/components/mock-layout.tsx](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout.tsx) + still treats `activeSessionId` as frontend-local and uses task-level unread/draft state + +So items `21`, `22`, `24`, and part of `19` are only half-done. 
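The intended direction for the half-done items can be sketched as a pure overlay: per-user UI state is stored per `(userId, taskId)` in the user actor and merged onto the shared task record at read time, replacing the task-global `unread`/`draft_*`/`active_session_id` fields. All type and function names below (`TaskRecord`, `UserTaskState`, `viewForUser`) are hypothetical illustrations for this handoff, not the actual `user_task_state` schema:

```typescript
// Hypothetical shapes; the real columns live in
// foundry/packages/backend/src/actors/user/db/schema.ts (`user_task_state`).
interface TaskRecord {
  taskId: string;
  title: string;
}

interface UserTaskState {
  activeSessionId?: string;
  unread: boolean;
  draft?: string;
}

// Overlay one user's private UI state on the shared task record,
// defaulting to "no unread, no draft" when the user never touched the task.
function viewForUser(
  task: TaskRecord,
  stateByTask: Map<string, UserTaskState>,
): TaskRecord & UserTaskState {
  const state = stateByTask.get(task.taskId) ?? { unread: false };
  return { ...task, ...state };
}

const view = viewForUser(
  { taskId: "t1", title: "Fix login" },
  new Map([["t1", { activeSessionId: "s9", unread: true }]]),
);
console.log(view);
```

The key property is that the task actor never stores this state; it only serves the shared `TaskRecord`, and each user actor owns its own overlay.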
+ +### Coordinator ownership + +The current architecture still violates the intended coordinator pattern: + +- Organization still owns `taskLookup` and `taskSummaries` + - [foundry/packages/backend/src/actors/organization/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/db/schema.ts) +- Organization still resolves `taskId -> repoId` + - [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) +- Task still pushes summary updates to organization instead of repository + - [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) +- Repository still does not own a `tasks` projection table yet + - [foundry/packages/backend/src/actors/repository/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/db/schema.ts) + +So items `9`, `13`, and `15` are still open. + +### Queue-only mutations + +Task actor workspace commands already go through queue sends. 
Other actors still do not fully follow the queue-only mutation rule: + +- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) +- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) +- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) +- [foundry/packages/backend/src/actors/organization/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/app-shell.ts) + +So items `4`, `10`, and `11` are still open. + +### Dynamic model/agent data + +The frontend/client still hardcode model groups: + +- [foundry/packages/frontend/src/components/mock-layout/view-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout/view-model.ts) +- [foundry/packages/client/src/workspace-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/workspace-model.ts) +- [foundry/packages/shared/src/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/workspace.ts) + `WorkspaceModelId` is still a hardcoded union + +The repo already has the API source of truth available through the TypeScript SDK: + +- [sdks/typescript/src/client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/sdks/typescript/src/client.ts) + `SandboxAgent.listAgents({ config: true })` +- [server/packages/sandbox-agent/src/router.rs](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/server/packages/sandbox-agent/src/router.rs) + `/v1/agents` +- 
[server/packages/sandbox-agent/src/router/support.rs](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/server/packages/sandbox-agent/src/router/support.rs) + `fallback_config_options` + +So item `8` is still open. + +### GitHub sync chunking/progress + +GitHub data sync is still a delete-and-replace flow: + +- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) + `replaceRepositories`, `replaceBranches`, `replaceMembers`, `replacePullRequests`, and full-sync flow +- [foundry/packages/backend/src/actors/github-data/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/db/schema.ts) + no generation/progress columns yet +- [foundry/packages/shared/src/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/app-shell.ts) + no structured sync progress field yet + +So item `16` is still open. + +## Recommended next order + +If another agent picks this up, this is the safest order: + +1. Finish items `21`, `22`, `24`, `19` together. + Reason: user-owned task UI state is already half-wired, and task schema cleanup depends on the same files. + +2. Finish items `9`, `13`, `15` together. + Reason: coordinator ownership, repo-owned task projections, and PR/task unification are the same refactor seam. + +3. Finish item `16`. + Reason: GitHub sync chunking is mostly isolated to `github-data` plus app-shell/shared snapshot wiring. + +4. Finish item `8`. + Reason: dynamic model/agent data is largely independent once user default model is already user-scoped. + +5. Finish items `4`, `10`, `11`, `12`, `18`, final event audit. + +6. Do item `17` last. 
+ +## Concrete file hotspots for the next agent + +Backend: + +- [foundry/packages/backend/src/actors/task/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workspace.ts) +- [foundry/packages/backend/src/actors/task/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/db/schema.ts) +- [foundry/packages/backend/src/actors/task/workflow/common.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/common.ts) +- [foundry/packages/backend/src/actors/task/workflow/commands.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/commands.ts) +- [foundry/packages/backend/src/actors/task/workflow/init.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/task/workflow/init.ts) +- [foundry/packages/backend/src/actors/repository/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/actions.ts) +- [foundry/packages/backend/src/actors/repository/db/schema.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/repository/db/schema.ts) +- [foundry/packages/backend/src/actors/organization/actions.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/organization/actions.ts) +- [foundry/packages/backend/src/actors/github-data/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/github-data/index.ts) +- [foundry/packages/backend/src/actors/user/index.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/backend/src/actors/user/index.ts) + +Shared/client/frontend: + +- 
[foundry/packages/shared/src/workspace.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/workspace.ts) +- [foundry/packages/shared/src/contracts.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/contracts.ts) +- [foundry/packages/shared/src/app-shell.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/shared/src/app-shell.ts) +- [foundry/packages/client/src/backend-client.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/backend-client.ts) +- [foundry/packages/client/src/workspace-model.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/client/src/workspace-model.ts) +- [foundry/packages/frontend/src/components/mock-layout.tsx](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout.tsx) +- [foundry/packages/frontend/src/components/mock-layout/model-picker.tsx](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx) +- [foundry/packages/frontend/src/features/tasks/status.ts](/Users/nathan/conductor/workspaces/sandbox-agent/columbus-v1/foundry/packages/frontend/src/features/tasks/status.ts) + +## Notes that matter + +- The pushed checkpoint is useful, but it is not the full current state. There are uncommitted edits in the hot-path backend files listed above. +- The current tree already contains a partially added `user_task_state` path. Do not duplicate that work; finish the migration by removing the old task-owned fields and rewiring readers/writers. +- The current task actor still reads mutable fields from `c.state` such as `repoRemote`, `branchName`, `title`, `task`, `sandboxProviderId`, and `agentType`. That is part of item `19`. +- The current frontend still synthesizes PR-only rows into fake tasks. 
That should go away as part of repo-owned task projection / PR unification. diff --git a/foundry/CLAUDE.md b/foundry/CLAUDE.md index e347a60..2d9bcbb 100644 --- a/foundry/CLAUDE.md +++ b/foundry/CLAUDE.md @@ -56,6 +56,41 @@ Use `pnpm` workspaces and Turborepo. - mock frontend changes: `just foundry-mock` or restart with `just foundry-mock-down && just foundry-mock` - local frontend-only work outside Docker: restart `pnpm --filter @sandbox-agent/foundry-frontend dev` or `just foundry-dev-mock` as appropriate - The backend does **not** hot reload. Bun's `--hot` flag causes the server to re-bind on a different port (e.g. 6421 instead of 6420), breaking all client connections while the container still exposes the original port. After backend code changes, restart the backend container: `just foundry-dev-down && just foundry-dev`. +- The dev server has debug logging enabled by default (`RIVET_LOG_LEVEL=debug`, `FOUNDRY_LOG_LEVEL=debug`) via `compose.dev.yaml`. Error stacks and timestamps are also enabled. +- The frontend client uses JSON encoding for RivetKit in development (`import.meta.env.DEV`) for easier debugging. Production uses the default encoding. + +## Foundry Base Sandbox Image + +Local Docker sandboxes use the `rivetdev/sandbox-agent:foundry-base-latest` image by default. This image extends the sandbox-agent runtime with sudo, git, neovim, gh, node, bun, chromium, and agent-browser. + +- **Dockerfile:** `docker/foundry-base.Dockerfile` (builds sandbox-agent from source, x86_64 only) +- **Publish script:** `scripts/publish-foundry-base.sh` (builds and pushes to Docker Hub `rivetdev/sandbox-agent`) +- **Tags:** `foundry-base-TZ` (timestamped) + `foundry-base-latest` (rolling) +- **Build from repo root:** `./foundry/scripts/publish-foundry-base.sh` (or `--dry-run` to skip push) +- **Override image in dev:** set `HF_LOCAL_SANDBOX_IMAGE` in `foundry/.env` or environment. The env var is passed through `compose.dev.yaml` to the backend. 
+- **Resolution order:** `config.sandboxProviders.local.image` (config.toml) > `HF_LOCAL_SANDBOX_IMAGE` (env var) > `DEFAULT_LOCAL_SANDBOX_IMAGE` constant in `packages/backend/src/actors/sandbox/index.ts`. +- The image must be built with `--platform linux/amd64`. The Rust build is memory-intensive; Docker Desktop needs at least 8GB RAM allocated. +- When updating the base image contents (new system packages, agent versions), rebuild and push with the publish script, then update the `foundry-base-latest` tag. + +## Production GitHub App + OAuth App + +Foundry uses two separate GitHub entities in production: + +- **OAuth App** (`GITHUB_CLIENT_ID` / `GITHUB_CLIENT_SECRET`) — handles "Sign in with GitHub" via Better Auth. This is a standard OAuth App. +- **GitHub App** (`GITHUB_APP_ID` / `GITHUB_APP_CLIENT_ID` / `GITHUB_APP_CLIENT_SECRET` / `GITHUB_APP_PRIVATE_KEY`) — handles webhooks, installation tokens for repo access, and GitHub API sync (repos, PRs). Must be manually installed on each org. + +Key env vars and where they connect: + +- `GITHUB_REDIRECT_URI` — OAuth callback, must point to `https://api.sandboxagent.dev/v1/auth/callback/github` +- `GITHUB_WEBHOOK_SECRET` — must match the secret configured on the GitHub App's Webhook settings page exactly. Mismatches cause silent 500s on webhook delivery (signature verification fails inside the actor, surfaced as a generic RivetKit `internal_error`). +- `BETTER_AUTH_URL` — must be the **API** URL (`https://api.sandboxagent.dev`), not the frontend URL. Better Auth uses this internally for sign-out and session management calls. +- `APP_URL` — the **frontend** URL (`https://foundry.sandboxagent.dev`). + +Troubleshooting: + +- **"GitHub App not installed"** — The GitHub App must be manually installed on each org. Sign-in does not auto-install it. Go to the GitHub App settings → Install App tab. The sign-in flow can only detect existing installations, not create them. 
+- **Webhooks not arriving** — Check the GitHub App → Advanced tab for delivery history. If deliveries show 500, the webhook secret likely doesn't match `GITHUB_WEBHOOK_SECRET`. Test with: `echo -n '{"test":true}' | openssl dgst -sha256 -hmac "$SECRET"` and curl the endpoint with the computed signature. +- **Deleting all actors wipes GitHub App installation state.** After a full actor reset, you must trigger a webhook (e.g. redeliver from GitHub App Advanced tab, or re-install the app) to repopulate installation records. ## Railway Logs @@ -73,13 +108,14 @@ Use `pnpm` workspaces and Turborepo. - All backend interaction (actor calls, metadata/health checks, backend HTTP endpoint access) must go through the dedicated client library in `packages/client`. - Outside `packages/client`, do not call backend endpoints directly (for example `fetch(.../v1/rivet...)`), except in black-box E2E tests that intentionally exercise raw transport behavior. - GUI state should update in realtime (no manual refresh buttons). Prefer RivetKit push reactivity and actor-driven events; do not add polling/refetch for normal product flows. -- Keep the mock workbench types and mock client in `packages/shared` + `packages/client` up to date with the frontend contract. The mock is the UI testing reference implementation while backend functionality catches up. +- Keep the mock workspace types and mock client in `packages/shared` + `packages/client` up to date with the frontend contract. The mock is the UI testing reference implementation while backend functionality catches up. - Keep frontend route/state coverage current in code and tests; there is no separate page-inventory doc to maintain. - If Foundry uses a shared component from `@sandbox-agent/react`, make changes in `sdks/react` instead of copying or forking that component into Foundry. - When changing shared React components in `sdks/react` for Foundry, verify they still work in the Sandbox Agent Inspector before finishing. 
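The webhook-secret check from the GitHub App troubleshooting notes above can also be done programmatically. A minimal sketch of how GitHub computes `X-Hub-Signature-256` (HMAC-SHA256 over the raw body, hex-encoded, prefixed with `sha256=`); the helper names are illustrative, not the actual actor code:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Compute the signature the way GitHub does for X-Hub-Signature-256.
function signPayload(secret: string, body: string): string {
  return "sha256=" + createHmac("sha256", secret).update(body).digest("hex");
}

// Constant-time comparison, as a webhook receiver should do it.
function verifySignature(secret: string, body: string, header: string): boolean {
  const expected = Buffer.from(signPayload(secret, body));
  const received = Buffer.from(header);
  return expected.length === received.length && timingSafeEqual(expected, received);
}

const sig = signPayload("test-secret", '{"test":true}');
// "sha256=" (7 chars) plus a 64-char hex digest.
console.log(sig.startsWith("sha256=") && sig.length === 71); // true
```

Comparing this computed header against what the backend derives from `GITHUB_WEBHOOK_SECRET` distinguishes a secret mismatch from a delivery problem.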
-- When making UI changes, verify the live flow with `agent-browser`, take screenshots of the updated UI, and offer to open those screenshots in Preview when you finish. +- When making UI changes, verify the live flow with the Chrome DevTools MCP or `agent-browser`, take screenshots of the updated UI, and offer to open those screenshots in Preview when you finish. - When asked for screenshots, capture all relevant affected screens and modal states, not just a single viewport. Include empty, populated, success, and blocked/error states when they are part of the changed flow. - If a screenshot catches a transition frame, blank modal, or otherwise misleading state, retake it before reporting it. +- When verifying UI in the browser, attempt to sign in by navigating to `/signin` and clicking "Continue with GitHub". If the browser lands on the GitHub login page (github.com/login) and you don't have credentials, stop and ask the user to complete the sign-in. Do not assume the session is invalid just because you see the Foundry sign-in page — always attempt the OAuth flow first. ## Realtime Data Architecture @@ -99,7 +135,7 @@ Do not use polling (`refetchInterval`), empty "go re-fetch" broadcast events, or - **Organization actor** materializes sidebar-level data in its own SQLite: repo catalog, task summaries (title, status, branch, PR, updatedAt), repo summaries (overview/branch state), and session summaries (id, name, status, unread, model — no transcript). Task actors push summary changes to the organization actor when they mutate. The organization actor broadcasts the updated entity to connected clients. `getOrganizationSummary` reads from local tables only — no fan-out to child actors. - **Task actor** materializes its own detail state (session summaries, sandbox info, diffs, file tree). `getTaskDetail` reads from the task actor's own SQLite. The task actor broadcasts updates directly to clients connected to it. 
- **Session data** lives on the task actor but is a separate subscription topic. The task topic includes `sessions_summary` (list without content). The `session` topic provides full transcript and draft state. Clients subscribe to the `session` topic for whichever session is active, and filter `sessionUpdated` events by session ID (ignoring events for other sessions on the same actor). -- The expensive fan-out (querying every repository/task actor) only exists as a background reconciliation/rebuild path, never on the hot read path. +- There is no fan-out on the read path. The organization actor owns all task summaries locally. ### Subscription manager @@ -133,6 +169,14 @@ The client subscribes to `app` always, `organization` when entering an organizat - Backend mutations that affect sidebar data (task title, status, branch, PR state) must push the updated summary to the parent organization actor, which broadcasts to organization subscribers. - Comment architecture-related code: add doc comments explaining the materialized state pattern, why deltas flow the way they do, and the relationship between parent/child actor broadcasts. New contributors should understand the data flow from comments alone. +## Sandbox Architecture + +- Structurally, the system supports multiple sandboxes per task, but in practice there is exactly one active sandbox per task. Design features assuming one sandbox per task. If multi-sandbox is needed in the future, extend at that time. +- Each task has a **primary user** (owner) whose GitHub OAuth credentials are injected into the sandbox for git operations. The owner swaps when a different user sends a message. See `.context/proposal-task-owner-git-auth.md` for the full design. +- **Security: OAuth token scope.** The user's GitHub OAuth token has `repo` scope, granting full control of all private repositories the user has access to. When the user is the active task owner, their token is injected into the sandbox. 
This means the agent can read/write ANY repo the user has access to, not just the task's target repo. This is the standard trade-off for OAuth-based git integrations (same as GitHub Codespaces, Gitpod). The user consents to `repo` scope at sign-in time. Credential files in the sandbox are `chmod 600` and overwritten on owner swap. +- All git operations in the sandbox must be auto-authenticated. Never configure git to prompt for credentials (no interactive `GIT_ASKPASS` prompts). Use a credential store file that is pre-populated with the active owner's token. +- All git operation errors (push 401, clone failure, branch protection rejection) must surface in the UI with actionable context. Never silently swallow git errors. + ## Git State Policy - The backend stores zero git state. No local clones, no refs, no working trees, and no git-spice. @@ -141,6 +185,15 @@ The client subscribes to `app` always, `organization` when entering an organizat - Do not add backend git clone paths, `git fetch`, `git for-each-ref`, or direct backend git CLI calls. If you need git data, either read stored GitHub metadata or run the command inside a sandbox. - The `BackendDriver` has no `GitDriver` or `StackDriver`. Only `GithubDriver` and `TmuxDriver` remain. +## React Hook Dependency Safety + +- **Never use unstable references as `useEffect`/`useMemo`/`useCallback` dependencies.** React compares dependencies by reference, not value. Expressions like `?? []`, `?? {}`, `.map(...)`, `.filter(...)`, or object/array literals create new references every render, causing infinite re-render loops when used as dependencies. +- If the upstream value may be `undefined`/`null` and you need a fallback, either: + - Use the raw upstream value as the dependency and apply the fallback inside the effect body: `useEffect(() => { doThing(value ?? []); }, [value]);` + - Derive a stable primitive key: `const key = JSON.stringify(value ?? 
[]);` then depend on `key` + - Memoize: `const stable = useMemo(() => value ?? [], [value]);` +- When reviewing code, treat any `?? []`, `?? {}`, or inline `.map()/.filter()` in a dependency array as a bug. + ## UI System - Foundry's base UI system is `BaseUI` with `Styletron`, plus Foundry-specific theme/tokens on top. Treat that as the default UI foundation. @@ -165,6 +218,7 @@ The client subscribes to `app` always, `organization` when entering an organizat - If the system reaches an unexpected state, raise an explicit error with actionable context. - Do not fail silently, swallow errors, or auto-ignore inconsistent data. - Prefer fail-fast behavior over hidden degradation when correctness is uncertain. +- **Never use bare `catch {}` or `catch { }` blocks.** Every catch must at minimum log the error with `logActorWarning` or `console.warn`. Silent catches hide bugs and make debugging impossible. If a catch is intentionally degrading (e.g. returning empty data when a sandbox is expired), it must still log so operators can see what happened. Use `catch (error) { logActorWarning(..., { error: resolveErrorMessage(error) }); }` or equivalent. ## RivetKit Dependency Policy @@ -178,16 +232,6 @@ For all Rivet/RivetKit implementation: - Example: the `task` actor instance already represents `(organizationId, repoId, taskId)`, so its SQLite tables should not need those columns for primary keys. 3. Do not use backend-global SQLite singletons; database access must go through actor `db` providers (`c.db`). 4. The default dependency source for RivetKit is the published `rivetkit` package so monorepo installs and CI remain self-contained. -5. When working on coordinated RivetKit changes, you may temporarily relink to a local checkout instead of the published package. 
- - Dedicated local checkout for this repo: `/Users/nathan/conductor/workspaces/task/rivet-checkout` - - Preferred local link target: `../rivet-checkout/rivetkit-typescript/packages/rivetkit` - - Sub-packages (`@rivetkit/sqlite-vfs`, etc.) resolve transitively from the RivetKit monorepo when using the local checkout. -6. Before using a local checkout, build RivetKit in the rivet repo: - ```bash - cd ../rivet-checkout/rivetkit-typescript - pnpm install - pnpm build -F rivetkit - ``` ## Rivet Routing @@ -205,8 +249,9 @@ For all Rivet/RivetKit implementation: - Do not add custom backend REST endpoints (no `/v1/*` shim layer). - We own the sandbox-agent project; treat sandbox-agent defects as first-party bugs and fix them instead of working around them. - Keep strict single-writer ownership: each table/row has exactly one actor writer. -- Parent actors (`organization`, `repository`, `task`, `history`, `sandbox-instance`) use command-only loops with no timeout. +- Parent actors (`organization`, `task`, `sandbox-instance`) use command-only loops with no timeout. - Periodic syncing lives in dedicated child actors with one timeout cadence each. +- **Task actors must be created lazily** — never during sync or bulk operations. PR sync writes virtual entries to the org's local `taskIndex`/`taskSummaries` tables. The task actor is created on first user interaction via `getOrCreate`. See `packages/backend/CLAUDE.md` "Lazy Task Actor Creation" for details. - Do not build blocking flows that wait on external systems to become ready or complete. Prefer push-based progression driven by actor messages, events, webhooks, or queue/workflow state changes. - Use workflows/background commands for any repo sync, sandbox provisioning, agent install, branch restack/rebase, or other multi-step external work. Do not keep user-facing actions/requests open while that work runs. 
- `send` policy: always `await` the `send(...)` call itself so enqueue failures surface immediately, but default to `wait: false`. @@ -227,8 +272,8 @@ Action handlers must return fast. The pattern: Examples: - `createTask` → `wait: true` (returns `{ taskId }`), then enqueue provisioning with `wait: false`. Client sees task appear immediately with pending status, observes `ready` via organization events. -- `sendWorkbenchMessage` → validate session is `ready` (throw if not), enqueue with `wait: false`. Client observes session transition to `running` → `idle` via session events. -- `createWorkbenchSession` → `wait: true` (returns `{ tabId }`), enqueue sandbox provisioning with `wait: false`. Client observes `pending_provision` → `ready` via task events. +- `sendWorkspaceMessage` → validate session is `ready` (throw if not), enqueue with `wait: false`. Client observes session transition to `running` → `idle` via session events. +- `createWorkspaceSession` → `wait: true` (returns `{ sessionId }`), enqueue sandbox provisioning with `wait: false`. Client observes `pending_provision` → `ready` via task events. Never use `wait: true` for operations that depend on external readiness, sandbox I/O, agent responses, git network operations, polling loops, or long-running queue drains. Never hold an action open while waiting for an external system to become ready — that is a polling/retry loop in disguise. @@ -240,11 +285,11 @@ All `wait: true` sends must have an explicit `timeout`. Maximum timeout for any ### Task creation: resolve metadata before creating the actor -When creating a task, all deterministic metadata (title, branch name) must be resolved synchronously in the parent actor (repository) *before* the task actor is created. The task actor must never be created with null `branchName` or `title`. +When creating a task, all deterministic metadata (title, branch name) must be resolved synchronously in the organization actor *before* the task actor is created. 
The task actor must never be created with null `branchName` or `title`. - Title is derived from the task description via `deriveFallbackTitle()` — pure string manipulation, no external I/O. - Branch name is derived from the title via `sanitizeBranchName()` + conflict checking against the repository's task index. -- The repository actor already has the task index and GitHub-backed default branch metadata. Resolve the branch name there without local git fetches. +- The organization actor owns the task index and reads GitHub-backed default branch metadata from the github-data actor. Resolve the branch name there without local git fetches. - Do not defer naming to a background provision workflow. Do not poll for names to become available. - The `onBranch` path (attaching to an existing branch) and the new-task path should both produce a fully-named task record on return. - Actor handle policy: @@ -320,9 +365,9 @@ Each entry must include: - Friction/issue - Attempted fix/workaround and outcome -## History Events +## Audit Log Events -Log notable workflow changes to `events` so `hf history` remains complete: +Log notable workflow changes to `events` so the audit log remains complete: - create - attach @@ -331,6 +376,8 @@ Log notable workflow changes to `events` so `hf history` remains complete: - status transitions - PR state transitions +When adding new task/workspace commands, always add a corresponding audit log event. + ## Validation After Changes Always run and fix failures: diff --git a/foundry/FOUNDRY-CHANGES.md b/foundry/FOUNDRY-CHANGES.md new file mode 100644 index 0000000..2bd76d2 --- /dev/null +++ b/foundry/FOUNDRY-CHANGES.md @@ -0,0 +1,1456 @@ +# Foundry Planned Changes + +## How to use this document + +Work through items checking boxes as you go. Some items have dependencies — do not start an item until its dependencies are checked off. After each item, run `pnpm -w typecheck && pnpm -w build && pnpm -w test` to validate. 
If an item includes a "CLAUDE.md update" section, apply it in the same change. Commit after each item passes validation. + +## Progress Log + +- 2026-03-14 10: Initial architecture mapping complete. + - Confirmed the current hot spots match the spec: `auth-user` is still mutation-by-action, `history` is still a separate actor with an `append` action wrapper, organization still owns `taskLookup`/`taskSummaries`, and the `Workbench*` surface is still shared across backend/client/frontend. + - Started foundational rename and migration planning for items `1`, `6`, and `25` because they drive most of the later fallout. +- 2026-03-14 11: Audit-log rename slice landed. + - Renamed the backend actor from `history` to `audit-log`, switched the queue name to `auditLog.command.append`, and removed the `append` action wrapper. + - Updated task/repository/organization call sites to send directly to the audit-log queue or read through the renamed audit-log handle. +- 2026-03-14 12: Foundational naming and dead-surface cleanup landed. + - Renamed the backend auth actor surface from `authUser` to `user`, including actor registration, key helpers, handles, and Better Auth service routing. + - Deleted the dead `getTaskEnriched` / `enrichTaskRecord` fan-out path and changed organization task reads to go straight to the task actor. + - Renamed admin-only GitHub rebuild/reload actions with the `admin*` prefix across backend, client, and frontend. + - Collapsed organization realtime to full-snapshot `organizationUpdated` events and aligned task events to `type: "taskUpdated"`. +- 2026-03-14 13: Task schema migration cleanup landed. + - Removed the task actor's runtime `CREATE TABLE IF NOT EXISTS` / `ALTER TABLE` helpers from `task/workbench.ts` and `task/workflow/init.ts`. + - Updated the checked-in task migration artifacts so the schema-defined task/session/runtime columns are created directly by migrations. +- 2026-03-14 14: Item 3 blocker documented. 
+ - The spec's requested literal singleton `CHECK (id = 1)` on the Better Auth `user` table conflicts with the existing Better Auth adapter contract, which relies on external string `user.id`. + - Proceeding safely will require a design adjustment for that table rather than a straight mechanical migration. +- 2026-03-14 15: Better Auth mapping comments landed. + - Added Better Auth vs custom Foundry table/action comments in the user and organization actor schema/action surfaces so the adapter-constrained paths are explicit. +- 2026-03-15 09: Branch rename surface deleted and stale organization subscription fixed. + - Removed the remaining branch-rename surface from the client, mock backend, frontend UI, and repository action layer. There are no remaining `renameBranch` / `renameWorkbenchBranch` references in Foundry. + - Fixed the remote backend client to listen for `organizationUpdated` on the organization connection instead of the dead `workspaceUpdated` event name. +- 2026-03-15 10: Backend workspace rename landed. + - Renamed the backend task UI/workflow surface from `workbench` to `workspace`, including the task actor file, queue topic family, organization proxy actions, and the task session table name (`task_workspace_sessions`). + - Backend actor code no longer contains `Workbench` / `workbench` references, so the remaining shared/client/frontend rename can align to a stable backend target. +- 2026-03-15 11: Default model moved to user-scoped app state. + - Removed `defaultModel` from the organization schema/snapshot and stored it on the user profile instead, exposed through the app snapshot as a user preference. + - Wired `setAppDefaultModel` through the backend/app clients and changed the model picker to persist the starred/default model instead of resetting local React state on reload. +- 2026-03-15 11: Workspace surface completed across Foundry packages. 
+ - Renamed the shared/client/frontend surface from `Workbench` to `Workspace`, including `workspace.ts`, workspace client/model files, DTO/type names, backend-client method names, frontend view-model imports, and the affected e2e/test files. + - Verified that Foundry backend/shared/client/frontend packages no longer contain `Workbench` / `workbench` references. +- 2026-03-15 11: Singleton constraints tightened where safe. + - Added `CHECK (id = 1)` enforcement for `github_meta`, `repo_meta`, `organization_profile`, and `user_profiles`, and updated the affected code paths/migrations to use row id `1`. + - The Better Auth `user` table remains blocked by the adapter contract, so item `3` is still open overall. +- 2026-03-15 12: Confirmed blocker for later user-table singleton work. + - Item `3` conflicts with the current Better Auth adapter contract for the `user` table: the adapter depends on the external string `user.id`, while the spec also asks for a literal singleton `CHECK (id = 1)` on that same table. + - That cannot be applied mechanically without redesigning the Better Auth adapter contract or introducing a separate surrogate identity column. I have not forced that change yet. +- 2026-03-15 13: Task/repository durable-state cleanup and auth-scoped workspace reads landed. + - Removed the remaining task/repository actor durable-state duplication: task `createState` now holds only `(organizationId, repoId, taskId)`, repository `createState` now holds only `(organizationId, repoId)`, task initialization seeds SQLite from the initialize queue payload, and task record reads fetch `repoRemote` through repository metadata instead of stale actor state. + - Removed the repository creation-time `remoteUrl` dependency from actor handles/callers and changed repository metadata to backfill/persist `remoteUrl` from GitHub data when needed. 
+ - Wired Better Auth session ids through the remote client workspace/task-detail reads and through the task workflow queue handlers so user-scoped workspace state is no longer dropped on the floor by the organization/task proxy path. +- 2026-03-15 14: Coordinator routing boundary tightened. + - Removed the organization actor's fallback `taskId -> repoId` scan across repositories; task proxy actions now require `repoId` and route directly to the repository/task coordinator path the client already uses. + - Updated backend architecture notes to reflect the live repo-owned task projection (`tasks`) and the removal of the old organization-owned `taskLookup` / `taskSummaries` indexes. +- 2026-03-15 15: Workspace session-selection and dead task-status cleanup landed. + - Surfaced viewer-scoped `activeSessionId` through workspace task summary/detail DTOs, threaded it through the backend/client/mock surfaces, and added a dedicated workspace `select_session` mutation so session-tab selection now persists in `user_task_state` instead of living only in frontend local state. + - Removed dead task `diffStat` and sandbox `statusMessage` fields from the live workspace/task contracts and backend writes, and updated stale frontend/mock/e2e consumers to stop reading them. +- 2026-03-15 16: GitHub sync progress is now live on the organization topic. + - Added persisted GitHub sync phase/generation/progress fields to the github-data actor meta row and the organization profile projection, and exposed them through `organizationUpdated` snapshots so workspace consumers no longer wait on stale app-topic state during repo imports. + - Chunked branch and pull-request fetches by repository batches, added generation markers to imported GitHub rows, switched sync refreshes to upsert+sweep instead of delete-then-replace, and updated the workspace shell/dev panel to show live sync phase progress from the organization subscription. 
+- 2026-03-15 17: Foundry-local model lists now route through shared Sandbox Agent config resources.
+ - Removed the remaining duplicated hardcoded model tables from the frontend/client workspace view-model layer and switched backend default-model / agent-inference fallbacks to the shared catalog helpers in `shared/src/models.ts`.
+ - Updated mock/default app state to stop seeding deleted `claude-sonnet-4` / `claude-opus-4` ids, and aligned the user-profile default-model migration fallback with the shared catalog default.
+- 2026-03-15 17: Shared model catalog moved off the old fixed union.
+ - Replaced the shared `WorkspaceModelId` closed union with string ids, introduced a shared model catalog derived from the sandbox-agent agent-config resources, and switched the client/frontend picker label helpers to consume that catalog instead of maintaining separate hardcoded `MODEL_GROUPS` arrays.
+ - Updated backend default-model and model→agent fallback logic to use the shared catalog/default id, and relaxed e2e env parsing so new sandbox-agent model ids can flow through without patching Foundry first.
+- 2026-03-15 18: Workspace task status collapsed to a single live field.
+ - Removed the duplicate `runtimeStatus` field from workspace task/detail DTOs and all current backend/client/frontend consumers, so workspace task `status` is now the only task-state field on that surface.
+ - Removed the remaining synthetic `"new"` task status from the live workspace path; mock task creation now starts in the first concrete init state instead of exposing a frontend-only status.
+- 2026-03-15 19: GitHub sync now persists branch and PR batches as they are fetched.
+ - The branch and pull-request phases now upsert each fetched repository batch immediately and only sweep stale rows after the phase completes, instead of buffering the full dataset in memory until the end of the sync.
+ - This aligns chunked progress reporting with chunked persistence and tightens recovery behavior for large repository imports.
+- 2026-03-15 20: Repository-owned task projection artifacts are now aligned with runtime.
+ - Removed the last stale `task_lookup` Drizzle artifacts from the organization actor so the checked-in schema snapshots match the live repository-owned `tasks` projection.
+ - There are no remaining org/repo runtime references to the old org-side task lookup table.
+- 2026-03-15 21: Legacy task/runtime fields are fully gone from the live Foundry surface.
+ - Confirmed the old task-table/runtime fields from item `21` are removed across backend/shared/client/frontend, and renamed the last leftover `agentTypeForModel()` helper to the neutral `sandboxAgentIdForModel()`.
+ - Deleted the final dead frontend diff-stat formatter/test that only referenced already-removed task diff state.
+- 2026-03-15 22: Task status tracking is now fully collapsed to the canonical task status enum.
+ - With the earlier backend `statusMessage` removal plus this turn's workspace contract cleanup, the workspace/task surface now derives all task status UI from the canonical backend `status` enum.
+ - There are no remaining live workspace `runtimeStatus` or synthetic `"new"` task-state branches.
+- 2026-03-15 23: Per-user workspace UI state is fully sourced from the user actor overlay.
+ - Confirmed the shared task actor no longer stores per-user `activeSessionId`, unread, or draft columns; those values are persisted in `user_task_state` and only projected back into workspace DTOs for the current viewer.
+ - The remaining active-session/unread/draft references in client/frontend code are consumer fields of that user-scoped overlay, not shared task-actor storage.
+- 2026-03-15 24: Subscription topics are now fully normalized to single-snapshot events.
+ - Confirmed the shared realtime contracts now expose one full replacement event per topic (`appUpdated`, `organizationUpdated`, `taskUpdated`, `sessionUpdated`, `processesUpdated`) with matching wire event names and type fields.
+ - The client subscription manager already treats organization/task topics as full-snapshot refreshes, so there are no remaining multi-variant organization events or `taskDetailUpdated` name mismatches in live code.
+- 2026-03-15 25: Sidebar PR/task split dead branches trimmed further.
+ - Removed the remaining dead `pr:`-id sidebar branch and switched the workspace sidebar to the real `pullRequest.isDraft` field instead of stale `pullRequest.status` reads.
+ - This does not finish item `15`, but it reduces the remaining synthetic PR/task split surface in the frontend.
+- 2026-03-15 26: User-actor mutations now flow through a dedicated workflow queue.
+ - Added `foundry/packages/backend/src/actors/user/workflow.ts` plus shared query helpers, wired the user actor up with explicit queue names, and moved auth/profile/session/task-state mutations behind workflow handlers instead of direct action bodies.
+- 2026-03-15 27: Organization GitHub/shell/billing mutations now route through workflow queues.
+ - Added shared organization queue definitions in `organization/queues.ts`, taught the organization workflow to handle the remaining GitHub projection, org-profile, and billing mutation commands, and switched the app-shell, Better Auth, GitHub-data actor, and org-isolation test to send queue messages instead of calling direct org mutation actions.
+ - Deleted the dead organization shell mutation actions that no longer had callers (`applyOrganizationSyncCompleted`, `markOrganizationSyncFailed`, `applyGithubInstallationCreated`, `applyGithubInstallationRemoved`, `applyGithubRepositoryChanges`), which moves items `4`, `10`, and `12` forward even though the broader org action split is still open.
+- 2026-03-15 28: Organization action split trimmed more of the monolith and removed dead event types.
+ - Moved `starSandboxAgentRepo` into `organization/actions/onboarding.ts` and the admin GitHub reload actions into `organization/actions/github.ts`, so `organization/actions.ts` is carrying fewer unrelated app-shell responsibilities.
+ - Deleted the dead backend-only `actors/events.ts` type file after confirming nothing in Foundry still imports those old task/PR event interfaces.
+- 2026-03-15 29: Repo overview branch rows now carry a single PR object.
+ - Replaced the repo-overview branch DTO's scalar PR fields (`prNumber`, `prState`, `prUrl`, `reviewStatus`, `reviewer`) with `pullRequest: WorkspacePullRequestSummary | null`, and updated repository overview assembly plus the organization dashboard to consume that unified PR shape.
+ - This does not finish item `15`, but it removes another synthetic PR-only read surface and makes the repo overview align better with the task summary PR model.
+- 2026-03-15 30: Repo overview stopped falling back to raw GitHub PR rows.
+ - Changed repository overview assembly to read PR metadata only from the repo-owned task projection instead of rejoining live GitHub PR rows on read, so the dashboard is one step closer to treating PRs as task data rather than a separate UI entity.
+- 2026-03-15 31: GitHub organization-shell repair now uses the org workflow queue.
+ - Converted `syncOrganizationShellFromGithub` from a direct org action into a workflow-backed mutation command and updated the GitHub org sync path to send `organization.command.github.organization_shell.sync_from_github` instead of calling the action directly.
+ - Updated Better Auth adapter writes and task user-overlay writes to send directly to the user workflow queue, which partially lands item `4` and sets up item `11` for the user actor.
+- 2026-03-15 27: Workflow layout standardized and queue-only write paths expanded.
+ - Split the remaining inline actor workflows into dedicated files for `audit-log`, `repository`, `github-data`, and `organization`, and moved user read actions into `user/actions/*` with Better Auth-prefixed action names.
+ - Removed the task actor's public mutation action wrappers entirely, moved organization/repository/github-data/task coordination onto direct queue sends, and made repository metadata reads stop mutating `repo_meta` on cache misses.
+- 2026-03-15 28: PR-only admin/UI seams trimmed and PR branches now claim real tasks.
+ - Removed the remaining dedicated "reload pull requests" / "reload pull request" admin hooks from the backend/client/frontend surfaces and deleted the sidebar PR-only context action.
+ - Repository PR refresh now lazily creates a branch-owned task when a pull request arrives for an unclaimed branch, so PR-only branches stop living purely as a side table in GitHub sync flows.
+- 2026-03-15 29: Organization Better Auth writes now use workflow queues.
+ - Split the organization actor's Better Auth routing and verification reads into `organization/actions/better-auth.ts`, moved `APP_SHELL_ORGANIZATION_ID` to `organization/constants.ts`, and renamed the org Better Auth read surface to the `betterAuth*` form.
+ - Added dedicated organization workflow queue handlers for session/email/account index writes plus verification CRUD, and updated `services/better-auth.ts` to send those mutations directly to organization queues instead of calling mutation actions.
+- 2026-03-15 30: Shared model routing metadata is now centralized.
+ - Extended the shared model catalog with explicit `agentKind` and `sandboxAgentId` metadata, changed `WorkspaceAgentKind` to a dynamic string, and switched backend task session creation to resolve sandbox agent ids through the shared catalog instead of hardcoded `Codex` vs `Claude` branching.
+ - Updated the mock app/workspace and frontend model picker/new-task flows to consume the shared catalog/default model instead of forcing stale `Claude`/`Codex` fallbacks or a baked-in `gpt-5.3-codex` create-task default.
+- 2026-03-15 31: Dead GitHub-data PR reload surface removed and fixture PR shapes aligned.
+ - Deleted the unused GitHub-data `reloadPullRequest` workflow command plus the dead `listOpenPullRequests` / `getPullRequestForBranch` action surface that no longer has live Foundry callers.
+ - Fixed the stale client `workspace-model.ts` pull-request fixtures to use the live `WorkspacePullRequestSummary` shape, which removes the last targeted client type errors in the touched slice.
+- 2026-03-15 32: Organization action splitting continued past Better Auth.
+ - Moved the app snapshot/default-model/org-profile actions into `organization/actions/organization.ts`, onboarding actions into `organization/actions/onboarding.ts`, and app-level GitHub token/import actions into `organization/actions/github.ts`, then composed those files at the actor boundary.
+ - `organization/app-shell.ts` now exports shared helpers for those domains and no longer directly defines the moved action handlers, shrinking the remaining monolith and advancing item `10`.
+- 2026-03-15 33: Task PR detail now reads the repository-owned task projection.
+ - Removed duplicate scalar PR fields from `TaskRecord` and `WorkspaceTaskDetail`, switched the remaining frontend/client consumers to the canonical `pullRequest` object, and trimmed stale mock/test scaffolding that still populated those dead fields.
+ - Replaced the task actor's PR lookup path with a repository projection read (`getProjectedTaskSummary`) so task detail/summary no longer ask the repo actor to re-query GitHub PR rows by branch.
+- 2026-03-15 34: Workspace model catalogs now come from the live sandbox-agent API.
+ - Added a shared normalizer for `/v1/agents?config=true` payloads, exposed sandbox-scoped `listWorkspaceModelGroups()` from the task sandbox actor, and switched backend workspace session creation to resolve sandbox agent ids from the live sandbox catalog instead of only the checked-in default tables.
+ - Updated the frontend workspace model picker to query the active sandbox for model groups and use that live catalog for labels/options, while keeping the shared default catalog only as a fallback when no sandbox is available yet or the sandbox-agent connection is unavailable.
+- 2026-03-15 35: Backend-only organization snapshot refresh is now queue-backed.
+ - Added `organization.command.snapshot.broadcast` to the organization workflow, switched repository and app-import callers to send that queue message instead of calling the organization actor's `refreshOrganizationSnapshot` action directly, and removed the direct action wrapper.
+ - Deleted the dead `adminReconcileWorkspaceState` organization action/interface entry after confirming nothing in Foundry still calls it.
+- 2026-03-15 36: Dead backend actor export cleanup continued.
+ - Removed the stale `export * from "./events.js"` line from `backend/src/actors/index.ts`, which was left behind after deleting the dead backend event type file.
+ - This keeps the backend actor barrel aligned with the live file set and advances the final dead-code/event audit.
+- 2026-03-15 34: Item 17 removed from this checklist; do not leave started items half-finished.
+ - By request, item `17` (`Type all actor context parameters — remove c: any`) is deferred out of this Foundry task and should not block completion here.
+ - Process note for the remaining checklist work: once an item is started, finish that item to completion before opening a different partial seam. Item `15` is the current priority under that rule.
+- 2026-03-15 35: Task/PR unification now routes live PR changes through repository-owned task summaries only.
+ - GitHub PR sync and webhook handling now send concrete PR summaries directly to the repository coordinator, which lazily creates a real branch-owned task when needed and persists PR metadata on the task projection instead of re-querying raw `github_pull_requests` rows from repository reads.
+ - Cleared the last stale scalar PR test references (`prUrl`, `reviewStatus`, `reviewer`) so the remaining Foundry surfaces consistently use the canonical `pullRequest` object.
+- 2026-03-15 36: Organization action entrypoints are now fully organized under `actions/`, and the public mutation surface is queue-only.
+ - Moved organization task/workspace proxy actions plus `createTaskMutation` into `organization/actions/tasks.ts`, added `organization/actions/app.ts` so every composed org action bundle now lives under `organization/actions/*`, and removed dead `app-shell` exports that no longer had external callers.
+ - Audited the remaining public organization actor actions and confirmed the write paths go through organization/repository/task/github-data workflow queues instead of direct mutation actions, which closes item `4` and item `10`.
+- 2026-03-15 37: Organization dead-code audit completed.
+ - Removed the leftover exported-only Better Auth predicate helper from `organization/actions/better-auth.ts`; it is now module-private because nothing outside that file uses it.
+ - Audited the remaining organization actor surface and confirmed the live public reads/writes still in use are the composed `actions/*` bundles plus workflow mutation helpers. There are no remaining dead org action exports from the pre-refactor monolith.
+- 2026-03-15 38: Final dead-event and dead-surface audit completed for the in-scope Foundry refactor.
+ - Confirmed the live Foundry realtime topics each have a single event type (`appUpdated`, `organizationUpdated`, `taskUpdated`, `sessionUpdated`), and the deleted legacy event names (`workspaceUpdated`, `taskSummaryUpdated`, `taskDetailUpdated`, `pullRequestUpdated`, `pullRequestRemoved`) no longer exist in live Foundry code.
+ - Re-audited the major removed compatibility seams (`Workbench`, branch rename, PR-only sidebar ids, duplicate runtime task status, `getTaskEnriched`, organization-owned task lookup tables) and found no remaining live references beyond expected domain strings like GitHub webhook event names or CLI `pr` labels.
+- 2026-03-15 39: Item 15 was finished for real by moving PR ownership into the task actor.
+ - Added task-local `pull_request_json` storage, switched task detail/summary reads to the task DB, and added `task.command.pull_request.sync` so GitHub/repository flows update PR metadata through the task coordinator instead of overlaying it in the repository projection.
+ - The mock right sidebar now trusts the canonical `task.pullRequest.url` field instead of rebuilding a PR URL from repo name + PR number.
+- 2026-03-15 40: Better Auth user singleton constraint is now enforced without breaking the adapter contract.
+ - The user actor's `user` table now uses an integer singleton primary key with `CHECK (id = 1)` plus a separate `auth_user_id` column for Better Auth's external string identity.
+ - Updated the user actor query/join/mutation helpers so Better Auth still reads and writes logical `user.id` as the external string id while SQLite enforces the singleton row invariant locally.
+
+No backwards compatibility — delete old code, don't deprecate. If something is removed, remove it everywhere (backend, client, shared types, frontend, tests, mocks).
+
+### Suggested execution order (respects dependencies)
+
+**Wave 1 — no dependencies, can be done in any order:**
+1, 2, 3, 4, 5, 6, 13, 16, 20, 21, 23, 25
+
+**Wave 2 — depends on wave 1:**
+7 (after 1), 9 (after 13), 10 (after 1+6), 11 (after 4), 22 (after 1), 24 (after 21), 26 (after 25)
+
+**Wave 3 — depends on wave 2:**
+8 (after 7+25), 12 (after 10), 15 (after 9+13), 19 (after 21+24)
+
+**Wave 4 — depends on wave 3:**
+14 (after 15)
+
+**Final:**
+18 (after everything), final audit pass (after everything)
+
+### Index
+
+- [x] 1. Rename Auth User actor → User actor
+- [x] 2. Add Better Auth mapping comments to user/org actor tables
+- [x] 3. Enforce `id = 1` CHECK constraint on single-row tables
+- [x] 4. Move all mutation actions to queue messages
+- [x] 5. Migrate task actor raw SQL to Drizzle migrations
+- [x] 6. Rename History actor → Audit Log actor
+- [x] 7. Move starred/default model to user actor settings *(depends on: 1)*
+- [x] 8. Replace hardcoded model/agent lists with sandbox-agent API data *(depends on: 7, 25)*
+- [x] 9. Flatten `taskLookup` + `taskSummaries` into single `tasks` table *(depends on: 13)*
+- [x] 10. Reorganize user and org actor actions into `actions/` folders *(depends on: 1, 6)*
+- [x] 11. Standardize workflow file structure across all actors *(depends on: 4)*
+- [x] 12. Audit and remove dead code in organization actor *(depends on: 10)*
+- [x] 13. Enforce coordinator pattern and fix ownership violations
+- [x] 14. Standardize one event per subscription topic *(depends on: 15)*
+- [x] 15. Unify tasks and pull requests — PRs are just task data *(depends on: 9, 13)*
+- [x] 16. Chunk GitHub data sync and publish progress
+- [x] 18. Final pass: remove all dead code *(depends on: all other items)*
+- [x] 19. Remove duplicate data between `c.state` and SQLite *(depends on: 21, 24)*
+- [x] 20. Prefix admin/recovery actions with `admin`
+- [x] 21. Remove legacy/session-scoped fields from task table
+- [x] 22. Move per-user UI state from task actor to user actor *(depends on: 1)*
+- [x] 23. Delete `getTaskEnriched` and `enrichTaskRecord` (dead code)
+- [x] 24. Clean up task status tracking *(depends on: 21)*
+- [x] 25. Remove "Workbench" prefix from all types, functions, files, tables
+- [x] 26. Delete branch rename (branches immutable after creation) *(depends on: 25)*
+- [x] Final audit pass: dead events scan *(depends on: all other items)*
+
+Deferred follow-up outside this checklist:
+
+- 17. Type all actor context parameters — remove `c: any` *(removed from this task's scope by request)*
+
+---
+
+## [x] 1. Rename Auth User actor → User actor
+
+**Rationale:** The actor is already a single per-user actor storing all user data. The "Auth" prefix is unnecessary.
+
+### Files to change
+
+- **`foundry/packages/backend/src/actors/auth-user/`** → rename directory to `user/`
+ - `index.ts` — rename export `authUser` → `user`, display name `"Auth User"` → `"User"`
+ - `db/schema.ts`, `db/db.ts`, `db/migrations.ts`, `db/drizzle.config.ts` — update any auth-prefixed references
+- **`foundry/packages/backend/src/actors/keys.ts`** — `authUserKey()` → `userKey()`
+- **`foundry/packages/backend/src/actors/handles.ts`** — `getOrCreateAuthUser` → `getOrCreateUser`, `getAuthUser` → `getUser`, `selfAuthUser` → `selfUser`
+- **`foundry/packages/backend/src/actors/index.ts`** — update import path and registration
+- **`foundry/packages/backend/src/services/better-auth.ts`** — update all `authUser` references
+- **Action names** — consider dropping "Auth" prefix from `createAuthRecord`, `findOneAuthRecord`, `updateAuthRecord`, `deleteAuthRecord`, `countAuthRecords`, etc.
+
+---
+
+## [x] 2. Add Better Auth mapping comments to user/org actor tables, actions, and queues
+
+**Rationale:** The user and organization actors contain a mix of Better Auth-driven and custom Foundry code. Tables, actions, and queues that exist to serve Better Auth's adapter need comments so developers know which pieces are constrained by Better Auth's schema/contract and which are ours to change freely.
+
+### Table mapping
+
+| Actor | Table | Better Auth? | Notes |
+|---|---|---|---|
+| user | `user` | Yes — 1:1 `user` model | All fields from Better Auth |
+| user | `session` | Yes — 1:1 `session` model | All fields from Better Auth |
+| user | `account` | Yes — 1:1 `account` model | All fields from Better Auth |
+| user | `user_profiles` | No — custom Foundry | GitHub login, role, eligible orgs, starter repo status |
+| user | `session_state` | No — custom Foundry | Active organization per session |
+| org | `auth_verification` | Yes — Better Auth `verification` model | Lives on org actor because verification happens before user exists |
+| org | `auth_session_index` | No — custom routing index | Maps session tokens → user actor IDs for Better Auth adapter routing |
+| org | `auth_email_index` | No — custom routing index | Maps emails → user actor IDs for Better Auth adapter routing |
+| org | `auth_account_index` | No — custom routing index | Maps OAuth accounts → user actor IDs for Better Auth adapter routing |
+
+### Action/queue mapping (user actor)
+
+| Action/Queue | Better Auth? | Notes |
+|---|---|---|
+| `createAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to create user/session/account records |
+| `findOneAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter for single-record lookups with joins |
+| `findManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for multi-record queries |
+| `updateAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to update records |
+| `updateManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for bulk updates |
+| `deleteAuthRecord` | Yes — Better Auth adapter | Called by Better Auth adapter to delete records |
+| `deleteManyAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for bulk deletes |
+| `countAuthRecords` | Yes — Better Auth adapter | Called by Better Auth adapter for count queries |
+| `getAppAuthState` | No — custom Foundry | Aggregates auth state for frontend consumption |
+| `upsertUserProfile` | No — custom Foundry | Manages Foundry-specific user profile data |
+| `upsertSessionState` | No — custom Foundry | Manages Foundry-specific session state |
+
+### Action/queue mapping (organization actor app-shell)
+
+| Action/Queue | Better Auth? | Notes |
+|---|---|---|
+| App-shell auth index CRUD actions | Yes — Better Auth adapter routing | Maintain lookup indexes so the adapter can route by session/email/account to the correct user actor |
+| `auth_verification` CRUD | Yes — Better Auth `verification` model | Used for email verification and password resets |
+
+### Files to change
+
+- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — add doc comments to each table:
+ - `user`, `session`, `account`: "Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database"
+ - `user_profiles`, `session_state`: "Custom Foundry table — not part of Better Auth"
+- **`foundry/packages/backend/src/actors/auth-user/index.ts`** — add doc comments to each action/queue:
+ - Better Auth adapter actions: "Better Auth adapter — called by the Better Auth adapter in better-auth.ts. Schema constrained by Better Auth."
+ - Custom actions: "Custom Foundry action — not part of Better Auth"
+- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — add doc comments to `auth_verification` (Better Auth core), and the three index tables (Better Auth adapter routing)
+- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — add doc comments to auth index actions marking them as Better Auth adapter routing infrastructure
+
+---
+
+## [x] 3. Enforce `id = 1` CHECK constraint on all single-row actor tables
+
+**Rationale:** When an actor instance represents a single entity, tables that hold exactly one row should enforce this at the DB level with a `CHECK (id = 1)` constraint. The task actor already does this correctly; other actors don't.
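A minimal SQLite sketch of the singleton pattern; the table and column names below are illustrative, loosely modeled on the `user_profiles` case, not the actual Foundry schema:

```sql
-- Hypothetical singleton table: the CHECK makes a second row impossible.
CREATE TABLE user_profiles (
  id INTEGER PRIMARY KEY CHECK (id = 1),
  github_login TEXT,
  role TEXT
);

-- All writes target the fixed id; the upsert keeps the row unique.
INSERT INTO user_profiles (id, github_login, role)
VALUES (1, 'octocat', 'admin')
ON CONFLICT (id) DO UPDATE
  SET github_login = excluded.github_login,
      role = excluded.role;
```

An `INSERT` with any other id fails with a CHECK constraint violation, which is exactly the database-level enforcement this item asks for, rather than relying on every code path remembering to hardcode `id = 1`.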
+
+### Tables needing the constraint
+
+| Actor | Table | Current enforcement | Fix needed |
+|---|---|---|---|
+| auth-user (→ user) | `user` | None | Add `CHECK (id = 1)`, use integer PK |
+| auth-user (→ user) | `user_profiles` | None | Add `CHECK (id = 1)`, use integer PK |
+| github-data | `github_meta` | Hardcoded `id=1` in code only | Add `CHECK (id = 1)` in schema |
+| organization | `organization_profile` | None | Add `CHECK (id = 1)`, use integer PK |
+| repository | `repo_meta` | Hardcoded `id=1` in code only | Add `CHECK (id = 1)` in schema |
+| task | `task` | CHECK constraint | Already correct |
+| task | `task_runtime` | CHECK constraint | Already correct |
+
+### Files to change
+
+- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — change `user` and `user_profiles` tables to integer PK with CHECK constraint
+- **`foundry/packages/backend/src/actors/auth-user/index.ts`** — update queries to use `id = 1` pattern
+- **`foundry/packages/backend/src/services/better-auth.ts`** — update adapter to use fixed `id = 1`
+- **`foundry/packages/backend/src/actors/github-data/db/schema.ts`** — add CHECK constraint to `github_meta` (already uses `id=1` in code)
+- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — change `organization_profile` to integer PK with CHECK constraint
+- **`foundry/packages/backend/src/actors/organization/actions.ts`** — update queries to use `id = 1`
+- **`foundry/packages/backend/src/actors/repository/db/schema.ts`** — add CHECK constraint to `repo_meta` (already uses `id=1` in code)
+- All affected actors — regenerate `db/migrations.ts`
+
+### CLAUDE.md update
+
+- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Single-row tables (tables that hold exactly one record per actor instance, e.g. metadata or profile tables) must use an integer primary key with a `CHECK (id = 1)` constraint to enforce the singleton invariant at the database level. Follow the pattern established in the task actor's `task` and `task_runtime` tables."
+
+---
+
+## [x] 4. Move all mutation actions to queue messages
+
+**Rationale:** Actions should be read-only (queries). All mutations (INSERT/UPDATE/DELETE) should go through queue messages processed by workflow handlers. This ensures single-writer consistency and aligns with the actor model. No actor currently does this correctly — the history actor has the mutation in the workflow handler, but the `append` action wraps a `wait: true` queue send, which is the same anti-pattern (callers should send to the queue directly).
+
+### Violations by actor
+
+**User actor (auth-user)** — `auth-user/index.ts` — 7 mutation actions:
+- `createAuthRecord` (INSERT, line 164)
+- `updateAuthRecord` (UPDATE, line 205)
+- `updateManyAuthRecords` (UPDATE, line 219)
+- `deleteAuthRecord` (DELETE, line 234)
+- `deleteManyAuthRecords` (DELETE, line 243)
+- `upsertUserProfile` (UPSERT, line 283)
+- `upsertSessionState` (UPSERT, line 331)
+
+**GitHub Data actor** — `github-data/index.ts` — 7 mutation actions:
+- `fullSync` (batch INSERT/DELETE/UPDATE, line 686)
+- `reloadOrganization` (batch, line 690)
+- `reloadAllPullRequests` (batch, line 694)
+- `reloadRepository` (INSERT/UPDATE, line 698)
+- `reloadPullRequest` (INSERT/DELETE/UPDATE, line 763)
+- `clearState` (batch DELETE, line 851)
+- `handlePullRequestWebhook` (INSERT/UPDATE/DELETE, line 879)
+
+**Organization actor — `actions.ts`** — 5 mutation actions:
+- `applyTaskSummaryUpdate` (UPSERT, line 464)
+- `removeTaskSummary` (DELETE, line 476)
+- `applyGithubRepositoryProjection` (UPSERT, line 521)
+- `applyGithubDataProjection` (INSERT/UPDATE/DELETE, line 547)
+- `recordGithubWebhookReceipt` (UPDATE, line 620)
+
+**Organization actor — `app-shell.ts`** — 37 mutation actions:
+
+Better Auth index mutations (11):
+- `authUpsertSessionIndex` (UPSERT)
+- `authDeleteSessionIndex` (DELETE)
+- `authUpsertEmailIndex` (UPSERT)
+- `authDeleteEmailIndex` (DELETE)
+- `authUpsertAccountIndex` (UPSERT)
+- `authDeleteAccountIndex` (DELETE)
+- `authCreateVerification` (INSERT)
+- `authUpdateVerification` (UPDATE)
+- `authUpdateManyVerification` (UPDATE)
+- `authDeleteVerification` (DELETE)
+- `authDeleteManyVerification` (DELETE)
+
+Organization profile/state mutations (13):
+- `updateOrganizationShellProfile` (UPDATE on organizationProfile)
+- `markOrganizationSyncStarted` (UPDATE on organizationProfile)
+- `applyOrganizationSyncCompleted` (UPDATE on organizationProfile)
+- `markOrganizationSyncFailed` (UPDATE on organizationProfile)
+- `applyOrganizationStripeCustomer` (UPDATE on organizationProfile)
+- `applyOrganizationStripeSubscription` (UPSERT on organizationProfile)
+- `applyOrganizationFreePlan` (UPDATE on organizationProfile)
+- `setOrganizationBillingPaymentMethod` (UPDATE on organizationProfile)
+- `setOrganizationBillingStatus` (UPDATE on organizationProfile)
+- `upsertOrganizationInvoice` (UPSERT on invoices)
+- `recordOrganizationSeatUsage` (UPSERT on seatAssignments)
+- `applyGithubInstallationCreated` (UPDATE on organizationProfile)
+- `applyGithubInstallationRemoved` (UPDATE on organizationProfile)
+
+App-level mutations that delegate + mutate (13):
+- `skipAppStarterRepo` (calls upsertUserProfile)
+- `starAppStarterRepo` (calls upsertUserProfile + child mutation)
+- `selectAppOrganization` (calls setActiveOrganization)
+- `triggerAppRepoImport` (calls markOrganizationSyncStarted)
+- `createAppCheckoutSession` (calls applyOrganizationFreePlan + applyOrganizationStripeCustomer)
+- `finalizeAppCheckoutSession` (calls applyOrganizationStripeCustomer)
+- `cancelAppScheduledRenewal` (calls setOrganizationBillingStatus)
+- `resumeAppSubscription` (calls setOrganizationBillingStatus)
+- `recordAppSeatUsage` (calls recordOrganizationSeatUsage)
+- `handleAppStripeWebhook` (calls multiple org mutations)
+- `handleAppGithubWebhook` (calls org mutations + github-data mutations)
+- `syncOrganizationShellFromGithub` (multiple DB operations)
+- `applyGithubRepositoryChanges` (calls applyGithubRepositoryProjection)
+
+**Task actor workbench** — `task/workbench.ts` — 14 mutation actions:
+- `renameWorkbenchTask` (UPDATE, line 970)
+- `renameWorkbenchBranch` (UPDATE, line 988)
+- `createWorkbenchSession` (INSERT, line 1039)
+- `renameWorkbenchSession` (UPDATE, line 1125)
+- `setWorkbenchSessionUnread` (UPDATE, line 1136)
+- `updateWorkbenchDraft` (UPDATE, line 1143)
+- `changeWorkbenchModel` (UPDATE, line 1152)
+- `sendWorkbenchMessage` (UPDATE, line 1205)
+- `stopWorkbenchSession` (UPDATE, line 1255)
+- `syncWorkbenchSessionStatus` (UPDATE, line 1265)
+- `closeWorkbenchSession` (UPDATE, line 1331)
+- `markWorkbenchUnread` (UPDATE, line 1363)
+- `publishWorkbenchPr` (UPDATE, line 1375)
+- `revertWorkbenchFile` (UPDATE, line 1403)
+
+**Repository actor** — `repository/actions.ts` — 5 mutation actions/helpers:
+- `createTask` → calls `createTaskMutation()` (INSERT on taskIndex + creates task actor)
+- `registerTaskBranch` → calls `registerTaskBranchMutation()` (INSERT/UPDATE on taskIndex)
+- `reinsertTaskIndexRow()` (INSERT/UPDATE, called from `getTaskEnriched`)
+- `deleteStaleTaskIndexRow()` (DELETE)
+- `persistRemoteUrl()` (INSERT/UPDATE on repoMeta, called from `getRepoOverview`)
+
+### History (audit log) actor — `append` action must also be removed
+
+The history actor's workflow handler is correct (mutation in queue handler), but the `append` action (line 77) is a `wait: true` wrapper around the queue send — same anti-pattern. Delete the `append` action. Callers (the `appendHistory()` helper in `task/workflow/common.ts`) should send directly to the `auditLog.command.append` queue with `wait: false` (audit log writes are fire-and-forget, no need to block the caller).
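The target pattern can be sketched framework-free. The message kinds, handler map, and state shape below are illustrative assumptions, not the real Foundry actor API:

```typescript
// Illustrative sketch: mutations are queue messages dispatched to a single-writer
// handler map (the "workflow handler" role); actions stay read-only.
type QueueMessage =
  | { kind: "user.command.profile.upsert"; userId: string; githubLogin: string }
  | { kind: "user.command.session_state.upsert"; sessionId: string; activeOrgId: string };

// State owned by the actor; only the queue handlers below write to it.
const state = {
  profiles: new Map<string, string>(),
  sessions: new Map<string, string>(),
};

// One handler per queue message kind.
const handlers: {
  [K in QueueMessage["kind"]]: (msg: Extract<QueueMessage, { kind: K }>) => void;
} = {
  "user.command.profile.upsert": (msg) => state.profiles.set(msg.userId, msg.githubLogin),
  "user.command.session_state.upsert": (msg) =>
    state.sessions.set(msg.sessionId, msg.activeOrgId),
};

// Callers send to the queue directly; there is no action wrapper around the send.
function send(msg: QueueMessage): void {
  (handlers[msg.kind] as (m: QueueMessage) => void)(msg);
}

// Actions are read-only queries over the actor's state.
function getProfile(userId: string): string | undefined {
  return state.profiles.get(userId);
}

send({ kind: "user.command.profile.upsert", userId: "u1", githubLogin: "octocat" });
send({ kind: "user.command.session_state.upsert", sessionId: "s1", activeOrgId: "org_1" });
```

The key property is that callers only ever see queue message kinds and read-only queries, so deleting a mutation action cannot break them, which is what makes step 3 ("delete the mutation actions, do not wrap them") safe.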
+
+### Reference patterns (queue handlers only, no action wrappers)
+- **Task actor core** — initialize, attach, push, sync, merge, archive, kill all use queue messages directly
+
+### Migration approach
+
+This is NOT about wrapping queue sends inside actions. The mutation actions must be **removed entirely** and replaced with queue messages that callers (including `packages/client`) send directly.
+
+Each actor needs:
+1. Define queue message types for each mutation
+2. Move mutation logic from action handlers into workflow/queue handlers
+3. **Delete the mutation actions** — do not wrap them
+4. Update `packages/client` to send queue messages directly to the actor instead of calling the old action
+5. Update any inter-actor callers (e.g. `better-auth.ts`, `app-shell.ts`, other actors) to send queue messages instead of calling actions
+
+### CLAUDE.md update
+
+- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Actions must be read-only. All database mutations (INSERT, UPDATE, DELETE, UPSERT) must be queue messages processed by workflow handlers. Callers (client, other actors, services) send messages directly to the queue — do not wrap queue sends inside actions. Follow the pattern established in the task workflow actor's queue handlers."
+
+---
+
+## [x] 5. Migrate task actor raw SQL to Drizzle migrations
+
+**Rationale:** The task actor uses raw `db.execute()` with `ALTER TABLE ... ADD COLUMN` in `workbench.ts` and `workflow/init.ts` instead of proper Drizzle migrations. All actor DBs should use the standard Drizzle migration pattern.
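As a sketch of the target shape (a schema-definition fragment, not runnable on its own; column names are illustrative, with `pull_request_json` borrowed from the changelog above, and the exact Drizzle API surface may differ by version):

```typescript
// Hypothetical schema.ts fragment: columns live in the Drizzle schema and a
// generated migration adds them, instead of runtime db.execute("ALTER TABLE ...").
import { sqliteTable, integer, text } from "drizzle-orm/sqlite-core";

export const task = sqliteTable("task", {
  // Singleton-row pattern already used by the task actor.
  id: integer("id").primaryKey(),
  title: text("title"),
  // Formerly added ad hoc via raw ALTER TABLE in workbench.ts / workflow/init.ts.
  pullRequestJson: text("pull_request_json"),
});
```

After editing `schema.ts`, regenerate the checked-in migrations with drizzle-kit rather than hand-writing SQL, so the schema file and migration history stay in sync.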
+ +### Files to change + +- **`foundry/packages/backend/src/actors/task/workbench.ts`** (lines 24-56) — remove `ALTER TABLE` raw SQL, add columns to `db/schema.ts` and generate a proper migration +- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** (lines 12-15) — same treatment +- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — add the missing columns that are currently added via `ALTER TABLE` +- **`foundry/packages/backend/src/actors/task/db/migrations.ts`** — regenerate with new migration + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "All actor databases must use Drizzle ORM with proper schema definitions and generated migrations. No raw SQL (`db.execute()`, `ALTER TABLE`, etc.). Schema changes must go through `schema.ts` + migration generation." + +--- + +## [ ] 6. Rename History actor → Audit Log actor + +**Rationale:** The actor functions as a comprehensive audit log tracking task lifecycle events. "Audit Log" better describes its purpose. 
+ +### Files to change + +- **`foundry/packages/backend/src/actors/history/`** → rename directory to `audit-log/` + - `index.ts` — rename export `history` → `auditLog`, display name `"History"` → `"Audit Log"`, queue `history.command.append` → `auditLog.command.append` + - Internal types: `HistoryInput` → `AuditLogInput`, `AppendHistoryCommand` → `AppendAuditLogCommand`, `ListHistoryParams` → `ListAuditLogParams` +- **`foundry/packages/backend/src/actors/keys.ts`** — `historyKey()` → `auditLogKey()` +- **`foundry/packages/backend/src/actors/handles.ts`** — `getOrCreateHistory` → `getOrCreateAuditLog`, `selfHistory` → `selfAuditLog` +- **`foundry/packages/backend/src/actors/index.ts`** — update import path and registration +- **`foundry/packages/shared/src/contracts.ts`** — `HistoryEvent` → `AuditLogEvent` +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — `history()` action → `auditLog()`, update imports +- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update `getOrCreateHistory` calls +- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — `appendHistory()` → `appendAuditLog()` +- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — update imports and calls +- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — update imports and calls +- **`foundry/packages/backend/src/actors/task/workflow/push.ts`** — update imports and calls + +### Coverage gaps to fix + +The audit log only covers 9 of ~24 significant events (37.5%). The entire `task/workbench.ts` file has zero logging. 
Add audit log calls for: + +**High priority (missing lifecycle events):** +- `task.switch` — in `task/workflow/index.ts` handleSwitchActivity +- `task.session.created` — in `task/workbench.ts` createWorkbenchSession +- `task.session.closed` — in `task/workbench.ts` closeWorkbenchSession +- `task.session.stopped` — in `task/workbench.ts` stopWorkbenchSession + +**Medium priority (missing user actions):** +- `task.session.renamed` — renameWorkbenchSession +- `task.message.sent` — sendWorkbenchMessage +- `task.model.changed` — changeWorkbenchModel +- `task.title.changed` — renameWorkbenchTask +- `task.branch.renamed` — renameWorkbenchBranch +- `task.pr.published` — publishWorkbenchPr +- `task.file.reverted` — revertWorkbenchFile + +**Low priority / debatable:** +- `task.draft.updated`, `task.session.unread`, `task.derived.refreshed`, `task.transcript.refreshed` + +### CLAUDE.md updates needed + +- **`foundry/packages/backend/CLAUDE.md`** — rename `HistoryActor` → `AuditLogActor` in actor hierarchy, add maintenance rule: "Every new action or command handler that represents a user-visible or workflow-significant event must append to the audit log actor. The audit log must remain a comprehensive record of all significant operations." +- **`foundry/CLAUDE.md`** — rename "History Events" section → "Audit Log Events", update the list to include all events above, add note: "When adding new task/workbench commands, always add a corresponding audit log event." + +--- + +## [ ] 7. Move starred/default model to user actor settings + +**Dependencies:** item 1 + +**Rationale:** The starred/default model preference is currently broken — the frontend stores it in local React state that resets on reload. The org actor's `organizationProfile` table has a `defaultModel` column but there's no action to update it and it's the wrong scope anyway. This is a per-user preference, not an org setting. 
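The target flow can be sketched as a runnable simulation: the star click sends a queue message to the user actor, which persists the preference and pushes a fresh snapshot to subscribers. The names here (`SetDefaultModelCommand`, the `UserSettings` shape) are illustrative, not the real implementation.

```typescript
// Sketch of "preference lives in the user actor, frontend renders the
// pushed snapshot" — replacing the useState copy that resets on reload.

type UserSettings = { defaultModel: string | null };
type SetDefaultModelCommand = { userId: string; modelId: string };

class UserActor {
  private settings: UserSettings = { defaultModel: null };
  private subscribers: Array<(s: UserSettings) => void> = [];

  subscribe(fn: (s: UserSettings) => void) {
    this.subscribers.push(fn);
    fn(this.settings); // replay current snapshot, like the `app` subscription
  }

  // Queue handler: the only place the preference is mutated.
  handleSetDefaultModel(cmd: SetDefaultModelCommand) {
    this.settings = { defaultModel: cmd.modelId };
    this.subscribers.forEach((fn) => fn(this.settings));
  }
}

// Frontend side: no local state copy — render from the snapshot.
export const user = new UserActor();
export let rendered: string | null = null;
user.subscribe((s) => { rendered = s.defaultModel; });
user.handleSetDefaultModel({ userId: "u1", modelId: "claude-sonnet-4" });
```

A "reload" is just a fresh `subscribe()` call, which replays the persisted snapshot instead of falling back to a hardcoded default.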
+ +### Current state (broken) + +- **Frontend** (`mock-layout.tsx` line 313) — `useState("claude-sonnet-4")` — local state, lost on reload +- **Model picker UI** (`model-picker.tsx`) — has star icons + `onSetDefault` callback, but it only updates local state +- **Org actor** (`organization/db/schema.ts` line 43) — `defaultModel` column exists but nothing writes to it +- **No backend persistence** — starred model is not saved anywhere + +### Changes needed + +1. **Add `user_settings` table to user actor** (or add `defaultModel` column to `user_profiles`): + - `defaultModel` (text) — the user's starred/preferred model + - File: `foundry/packages/backend/src/actors/auth-user/db/schema.ts` + +2. **Add queue message to user actor** to update the default model: + - File: `foundry/packages/backend/src/actors/auth-user/index.ts` + +3. **Remove `defaultModel` from org actor** `organizationProfile` table (wrong scope): + - File: `foundry/packages/backend/src/actors/organization/db/schema.ts` + +4. **Update frontend** to read starred model from user settings (via `app` subscription) and send queue message on star click: + - File: `foundry/packages/frontend/src/components/mock-layout/model-picker.tsx` + - File: `foundry/packages/frontend/src/components/mock-layout.tsx` + +5. **Update shared types** — move `defaultModel` from `FoundryOrganizationSettings` to user settings type: + - File: `foundry/packages/shared/src/app-shell.ts` + +6. **Update client** to send the queue message to user actor: + - File: `foundry/packages/client/` + +--- + +## [ ] 8. Replace hardcoded model/agent lists with sandbox-agent API data + +**Dependencies:** items 7, 25 + +**Rationale:** The frontend hardcodes 8 models in a static list and ignores the sandbox-agent API's `GET /v1/agents` endpoint which already exposes the full agent config — models, modes, and reasoning/thought levels per agent. The frontend should consume this API 1:1 instead of maintaining its own stale copy. 
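A sketch of what consuming the endpoint could look like. The response shape below is inferred from the fields this plan lists (`models`, `modes`, `thought_level`, `installed`, `credentialsAvailable`) and is a guess at the contract, not the actual sandbox-agent schema; the sample agent entries are invented for illustration.

```typescript
// Assumed shape of one entry from GET /v1/agents — verify against the
// real router before relying on field names.
interface AgentConfig {
  name: string;
  models: Array<{ id: string; displayName: string }>;
  modes: string[];
  thought_level?: string[];
  installed: boolean;
  credentialsAvailable: boolean;
}

// Group models by agent, as the API does — the replacement for the
// hardcoded MODEL_GROUPS in view-model.ts / workbench-model.ts.
export function buildModelGroups(agents: AgentConfig[]) {
  return agents
    .filter((a) => a.installed && a.credentialsAvailable)
    .map((a) => ({
      agent: a.name,
      models: a.models.map((m) => m.id),
      supportsThoughtLevels: (a.thought_level?.length ?? 0) > 0,
    }));
}

export const groups = buildModelGroups([
  { name: "codex", models: [{ id: "codex-model", displayName: "Codex" }], modes: ["default"], thought_level: ["low", "medium", "high", "xhigh"], installed: true, credentialsAvailable: true },
  { name: "mock", models: [{ id: "mock-1", displayName: "Mock" }], modes: [], thought_level: ["low", "medium", "high"], installed: true, credentialsAvailable: false },
]);
```

Because model IDs arrive as plain strings from the API, the `WorkbenchModelId` union type naturally collapses to `string`.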
+ +### Current state (hardcoded) + +- **`foundry/packages/frontend/src/components/mock-layout/view-model.ts`** (lines 20-39) — hardcoded `MODEL_GROUPS` with 8 models +- **`foundry/packages/client/src/workbench-model.ts`** (lines 18-37) — identical hardcoded `MODEL_GROUPS` copy +- **`foundry/packages/shared/src/workbench.ts`** (lines 5-13) — `WorkbenchModelId` hardcoded union type +- No modes or thought/reasoning levels exposed in UI at all +- No API calls to discover available models + +### What the sandbox-agent API already provides (`GET /v1/agents`) + +Per agent, the API returns: +- **models** — full list with display names (Claude: 4, Codex: 6, Cursor: 35+, OpenCode: 239) +- **modes** — execution modes (Claude: 5, Codex: 3, OpenCode: 2) +- **thought_level** — reasoning levels (Codex: low/medium/high/xhigh, Mock: low/medium/high) +- **capabilities** — plan_mode, reasoning, status support +- **credentialsAvailable** / **installed** — agent availability + +### Changes needed + +1. **Remove hardcoded model lists** from: + - `foundry/packages/frontend/src/components/mock-layout/view-model.ts` — delete `MODEL_GROUPS` + - `foundry/packages/client/src/workbench-model.ts` — delete `MODEL_GROUPS` + - `foundry/packages/shared/src/workbench.ts` — replace `WorkbenchModelId` union type with `string` (dynamic from API) + +2. **Backend: fetch and cache agent config from sandbox-agent API** + - Add an action or startup flow that calls `GET /v1/agents?config=true` on the sandbox-agent API + - Cache the result (agent list + models + modes + thought levels) in the appropriate actor + - Expose it to the frontend via the existing subscription/event system + +3. 
**Frontend: consume API-driven config** + - Model picker reads available models from backend-provided agent config, not hardcoded list + - Expose modes selector per agent + - Expose thought/reasoning level selector for agents that support it (Codex, Mock) + - Group models by agent as the API does (not by arbitrary provider grouping) + +4. **Update shared types** — make model/mode/thought_level types dynamic strings rather than hardcoded unions: + - `foundry/packages/shared/src/workbench.ts` + +5. **No backwards compatibility needed** — we're cleaning up, not preserving old behavior + +--- + +## [ ] 9. Flatten `taskLookup` + `taskSummaries` into single `tasks` table on org actor + +**Dependencies:** item 13 + +**Rationale:** `taskLookup` (taskId → repoId) is a strict subset of `taskSummaries` (which also has repoId + title, status, branch, PR, sessions). There's no reason for two tables with the same primary key. Flatten into one `tasks` table. + +### Current state + +- **`taskLookup`** — `taskId` (PK), `repoId` — used only for taskId → repoId resolution +- **`taskSummaries`** — `taskId` (PK), `repoId`, `title`, `status`, `repoName`, `updatedAtMs`, `branch`, `pullRequestJson`, `sessionsSummaryJson` — materialized sidebar data + +### Changes needed + +1. **Merge into single `tasks` table** in `foundry/packages/backend/src/actors/organization/db/schema.ts`: + - Drop `taskLookup` table + - Rename `taskSummaries` → `tasks` + - Keep all columns from `taskSummaries` (already includes `repoId`) + +2. **Update all references**: + - `foundry/packages/backend/src/actors/organization/actions.ts` — replace `taskLookup` queries with `tasks` table lookups + - `foundry/packages/backend/src/actors/organization/app-shell.ts` — if it references either table + - Any imports of the old table names from schema + +3. **Regenerate migrations** — `foundry/packages/backend/src/actors/organization/db/migrations.ts` + +--- + +## [x] 10. 
Reorganize user and organization actor actions into `actions/` folders + +**Dependencies:** items 1, 6 + +**Rationale:** Both actors cram too many concerns into single files. The organization actor has `app-shell.ts` (1,947 lines) + `actions.ts` mixing Better Auth, Stripe, GitHub, onboarding, workbench proxying, and org state. The user actor mixes Better Auth adapter CRUD with custom Foundry actions. Split into `actions/` folders grouped by domain, with `betterAuth` prefix on all Better Auth actions. + +### User actor → `user/actions/` + +| File | Actions | Source | +|---|---|---| +| `actions/better-auth.ts` | `betterAuthCreateRecord`, `betterAuthFindOneRecord`, `betterAuthFindManyRecords`, `betterAuthUpdateRecord`, `betterAuthUpdateManyRecords`, `betterAuthDeleteRecord`, `betterAuthDeleteManyRecords`, `betterAuthCountRecords` + all helper functions (`tableFor`, `columnFor`, `normalizeValue`, `clauseToExpr`, `buildWhere`, `applyJoinToRow`, `applyJoinToRows`) | Currently in `index.ts` | +| `actions/user.ts` | `getAppAuthState`, `upsertUserProfile`, `upsertSessionState` | Currently in `index.ts` | + +### Organization actor → `organization/actions/` + +**Delete `app-shell.ts`** — split its ~50 actions + helpers across these files: + +| File | Actions | Source | +|---|---|---| +| `actions/better-auth.ts` | `betterAuthFindSessionIndex`, `betterAuthUpsertSessionIndex`, `betterAuthDeleteSessionIndex`, `betterAuthFindEmailIndex`, `betterAuthUpsertEmailIndex`, `betterAuthDeleteEmailIndex`, `betterAuthFindAccountIndex`, `betterAuthUpsertAccountIndex`, `betterAuthDeleteAccountIndex`, `betterAuthCreateVerification`, `betterAuthFindOneVerification`, `betterAuthFindManyVerification`, `betterAuthUpdateVerification`, `betterAuthUpdateManyVerification`, `betterAuthDeleteVerification`, `betterAuthDeleteManyVerification`, `betterAuthCountVerification` + auth clause builder helpers | Currently in `app-shell.ts` | +| `actions/stripe.ts` | `createAppCheckoutSession`, 
`finalizeAppCheckoutSession`, `createAppBillingPortalSession`, `cancelAppScheduledRenewal`, `resumeAppSubscription`, `recordAppSeatUsage`, `handleAppStripeWebhook`, `applyOrganizationStripeCustomer`, `applyOrganizationStripeSubscription`, `applyOrganizationFreePlan`, `setOrganizationBillingPaymentMethod`, `setOrganizationBillingStatus`, `upsertOrganizationInvoice`, `recordOrganizationSeatUsage` | Currently in `app-shell.ts` | +| `actions/github.ts` | `resolveAppGithubToken`, `beginAppGithubInstall`, `triggerAppRepoImport`, `handleAppGithubWebhook`, `syncOrganizationShellFromGithub`, `syncGithubOrganizations`, `applyGithubInstallationCreated`, `applyGithubInstallationRemoved`, `applyGithubRepositoryChanges`, `reloadGithubOrganization`, `reloadGithubPullRequests`, `reloadGithubRepository`, `reloadGithubPullRequest`, `applyGithubRepositoryProjection`, `applyGithubDataProjection`, `recordGithubWebhookReceipt`, `refreshTaskSummaryForGithubBranch` | Currently split across `app-shell.ts` and `actions.ts` | +| `actions/onboarding.ts` | `skipAppStarterRepo`, `starAppStarterRepo`, `starSandboxAgentRepo`, `selectAppOrganization` | Currently in `app-shell.ts` | +| `actions/organization.ts` | `getAppSnapshot`, `getOrganizationShellState`, `getOrganizationShellStateIfInitialized`, `updateOrganizationShellProfile`, `updateAppOrganizationProfile`, `markOrganizationSyncStarted`, `applyOrganizationSyncCompleted`, `markOrganizationSyncFailed`, `useOrganization`, `getOrganizationSummary`, `reconcileWorkbenchState` | Currently split across `app-shell.ts` and `actions.ts` | +| `actions/tasks.ts` | `createTask`, `createWorkbenchTask`, `listTasks`, `getTask`, `switchTask`, `applyTaskSummaryUpdate`, `removeTaskSummary`, `findTaskForGithubBranch`, `applyOpenPullRequestUpdate`, `removeOpenPullRequest`, `attachTask`, `pushTask`, `syncTask`, `mergeTask`, `archiveTask`, `killTask` | Currently in `actions.ts` | +| `actions/workbench.ts` | `markWorkbenchUnread`, `renameWorkbenchTask`, 
`renameWorkbenchBranch`, `createWorkbenchSession`, `renameWorkbenchSession`, `setWorkbenchSessionUnread`, `updateWorkbenchDraft`, `changeWorkbenchModel`, `sendWorkbenchMessage`, `stopWorkbenchSession`, `closeWorkbenchSession`, `publishWorkbenchPr`, `revertWorkbenchFile` | Currently in `actions.ts` (proxy calls to task actor) | +| `actions/repos.ts` | `listRepos`, `getRepoOverview` | Currently in `actions.ts` | +| `actions/history.ts` | `history` (→ `auditLog` after rename) | Currently in `actions.ts` | + +Also move: +- `APP_SHELL_ORGANIZATION_ID` constant → `organization/constants.ts` +- `runOrganizationWorkflow` → `organization/workflow.ts` +- Private helpers (`buildAppSnapshot`, `assertAppOrganization`, `collectAllTaskSummaries`, etc.) → colocate with the action file that uses them + +### Files to update + +- **`foundry/packages/backend/src/services/better-auth.ts`** — update all action name references to use `betterAuth` prefix +- **`foundry/packages/backend/src/actors/organization/index.ts`** — import and spread action objects from `actions/` files instead of `app-shell.ts` + `actions.ts` +- **`foundry/packages/backend/src/actors/auth-user/index.ts`** (or `user/index.ts`) — import actions from `actions/` files + +--- + +## [ ] 11. Standardize workflow file structure across all actors + +**Dependencies:** item 4 + +**Rationale:** Workflow logic is inconsistently placed — inline in `index.ts`, in `actions.ts`, or in a `workflow/` directory. Standardize: every actor with a workflow gets a `workflow.ts` file. If the workflow is large, use `workflow/{index,...}.ts`. 
+ +### Changes per actor + +| Actor | Current location | New location | Notes | +|---|---|---|---| +| user (auth-user) | None | `workflow.ts` (new) | Needs a workflow for mutations (item 4) | +| github-data | Inline in `index.ts` (~57 lines) | `workflow.ts` | Extract `runGithubDataWorkflow` + handler | +| history (→ audit-log) | Inline in `index.ts` (~18 lines) | `workflow.ts` | Extract `runHistoryWorkflow` + `appendHistoryRow` | +| organization | In `actions.ts` (~51 lines) | `workflow.ts` | Extract `runOrganizationWorkflow` + queue handlers | +| repository | In `actions.ts` (~42 lines) | `workflow.ts` | Extract `runRepositoryWorkflow` + queue handlers | +| task | `workflow/` directory (926 lines) | `workflow/` directory — already correct | Keep as-is: `workflow/index.ts`, `workflow/queue.ts`, `workflow/common.ts`, `workflow/init.ts`, `workflow/commands.ts`, `workflow/push.ts` | +| sandbox | None (wrapper) | N/A | No custom workflow needed | + +### Pattern + +- **Small workflows** (< ~200 lines): single `workflow.ts` file +- **Large workflows** (> ~200 lines): `workflow/index.ts` holds the main loop, other files hold step groups: + - `workflow/index.ts` — main loop + handler dispatch + - `workflow/queue.ts` — queue name definitions (if many) + - `workflow/{group}.ts` — step/activity functions grouped by domain + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Every actor with a message queue must have its workflow logic in a dedicated `workflow.ts` file (or `workflow/index.ts` for complex actors). Do not inline workflow logic in `index.ts` or `actions.ts`. Actions are read-only handlers; workflow handlers process queue messages and perform mutations." + +--- + +--- + +## [ ] 12. Audit and remove dead code in organization actor + +**Dependencies:** item 10 + +**Rationale:** The organization actor has ~50+ actions across `app-shell.ts` and `actions.ts`. Likely some are unused or vestigial. 
Audit all actions and queues for dead code and remove anything that has no callers. + +### Scope + +- All actions in `organization/actions.ts` and `organization/app-shell.ts` +- All queue message types and their handlers +- Helper functions that may no longer be called +- Shared types in `packages/shared` that only served removed actions + +### Approach + +- Trace each action/queue from caller → handler to confirm it's live +- Remove any action with no callers (client, other actors, services, HTTP endpoints) +- Remove any queue handler with no senders +- Remove associated types and helpers + +--- + +## [ ] 13. Enforce coordinator pattern and fix ownership violations + +**Rationale:** The actor hierarchy follows a coordinator pattern: org → repo → task → session. The coordinator owns the index/summary of its children, handles create/destroy, and children push updates up to their coordinator. Several violations exist where levels are skipped. + +### Coordinator hierarchy (add to CLAUDE.md) + +``` +Organization (coordinator for repos) +├── Repository (coordinator for tasks) +│ └── Task (coordinator for sessions) +│ └── Session +``` + +**Rules:** +- The coordinator owns the index/summary table for its direct children +- The coordinator handles create/destroy of its direct children +- Children push summary updates UP to their direct coordinator (not skipping levels) +- Read paths go through the coordinator, not direct cross-level access +- No backwards compatibility needed — we're cleaning up + +### Violations to fix + +#### V1: Task index tables on wrong actor (HIGH) + +`taskLookup` and `taskSummaries` (item 9 merges these into `tasks`) are on the **organization** actor but should be on the **repository** actor, since repo is the coordinator for tasks. 
+ +**Fix:** +- Move the merged `tasks` table (from item 9) to `repository/db/schema.ts` +- Repository owns task summaries, not organization +- Organization gets a `repoSummaries` table instead (repo count, latest activity, etc.) — the repo pushes its summary up to org + +#### V2: Tasks push summaries directly to org, skipping repo (HIGH) + +Task actors call `organization.applyTaskSummaryUpdate()` directly (line 464 in `actions.ts`), bypassing the repository coordinator. + +**Fix:** +- Task pushes summary to `repository.applyTaskSummaryUpdate()` instead +- Repository updates its `tasks` table, then pushes a repo summary up to organization +- Organization never receives task-level updates directly + +#### V3: Org resolves taskId → repoId from its own table (MEDIUM) + +`resolveRepoId(c, taskId)` in `organization/actions.ts` queries `taskLookup` directly. Used by `switchTask`, `attachTask`, `pushTask`, `syncTask`, `mergeTask`, `archiveTask`, `killTask` (7 actions). + +**Fix:** +- Remove `resolveRepoId()` from org actor +- Org must know the `repoId` from the caller (frontend already knows which repo a task belongs to) or query the repo actor +- Update all 7 proxy actions to require `repoId` in their input instead of looking it up + +#### V4: Duplicate task creation bookkeeping at org level (MEDIUM) + +`createTaskMutation` in org actor calls `repository.createTask()`, then independently inserts `taskLookup` and seeds `taskSummaries`. Repository already inserts its own `taskIndex` row. 
+ +**Fix:** +- Org calls `repository.createTask()` — that's it +- Repository handles all task index bookkeeping internally +- Repository pushes the new task summary back up to org as part of its repo summary update + +### Files to change + +- **`foundry/packages/backend/src/actors/organization/db/schema.ts`** — remove `taskLookup` and `taskSummaries`, add `repoSummaries` if needed +- **`foundry/packages/backend/src/actors/repository/db/schema.ts`** — add merged `tasks` table (task summaries) +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove `resolveRepoId()`, `applyTaskSummaryUpdate`, `removeTaskSummary`, `findTaskForGithubBranch`, `refreshTaskSummaryForGithubBranch`; update proxy actions to require `repoId` in input +- **`foundry/packages/backend/src/actors/repository/actions.ts`** — add `applyTaskSummaryUpdate` action (receives from task), push repo summary to org +- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — change summary push target from org → repo +- **`foundry/packages/shared/src/contracts.ts`** — update input types to include `repoId` where needed +- **`foundry/packages/client/`** — update calls to pass `repoId` + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add coordinator pattern rules: + ``` + ## Coordinator Pattern + + The actor hierarchy follows a strict coordinator pattern: + - Organization = coordinator for repositories + - Repository = coordinator for tasks + - Task = coordinator for sessions + + Rules: + - Each coordinator owns the index/summary table for its direct children. + - Only the coordinator handles create/destroy of its direct children. + - Children push summary updates to their direct coordinator only (never skip levels). + - Cross-level access (e.g. org directly querying task state) is not allowed — go through the coordinator. + - Proxy actions at higher levels (e.g. org.pushTask) must delegate to the correct coordinator, not bypass it. 
+ ``` + +--- + +--- + +## [ ] 14. Standardize one event per subscription topic across all actors + +**Dependencies:** item 15 + +**Rationale:** Each subscription topic should have exactly one event type carrying the full replacement snapshot. The organization topic currently violates this with 7 subtypes. Additionally, event naming is inconsistent across actors. Standardize all of them. + +### Current state + +| Topic | Wire event name | Event type field | Subtypes | Issue | +|---|---|---|---|---| +| `app` | `appUpdated` | `type: "appUpdated"` | 1 | Name is fine | +| `organization` | `organizationUpdated` | 7 variants | **7** | Needs consolidation | +| `task` | `taskUpdated` | `type: "taskDetailUpdated"` | 1 | Wire name ≠ type name | +| `session` | `sessionUpdated` | `type: "sessionUpdated"` | 1 | Fine | +| `sandboxProcesses` | `processesUpdated` | `type: "processesUpdated"` | 1 | Fine | + +### Target state + +Every topic gets exactly one event. Wire event name = type field = `{topic}Updated`. Each carries the full snapshot for that topic. + +| Topic | Event name | Payload | +|---|---|---| +| `app` | `appUpdated` | `FoundryAppSnapshot` | +| `organization` | `organizationUpdated` | `OrganizationSummarySnapshot` | +| `task` | `taskUpdated` | `WorkbenchTaskDetail` | +| `session` | `sessionUpdated` | `WorkbenchSessionDetail` | +| `sandboxProcesses` | `processesUpdated` | `SandboxProcessSnapshot[]` | + +### Organization — consolidate 7 subtypes into 1 + +Remove the discriminated union. Replace all 7 subtypes: +- `taskSummaryUpdated`, `taskRemoved`, `repoAdded`, `repoUpdated`, `repoRemoved`, `pullRequestUpdated`, `pullRequestRemoved` + +With a single `organizationUpdated` event carrying the full `OrganizationSummarySnapshot`. The client replaces its cached state — same pattern as every other topic. + +### Task — fix event type name mismatch + +Wire event is `taskUpdated` but the type field says `taskDetailUpdated`. Rename to `taskUpdated` everywhere for consistency. 
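The target shape for the organization topic can be sketched in a few lines. The `OrganizationSummarySnapshot` fields here are illustrative; the point is the structure: one event type, wire name equal to the `type` field, full snapshot as payload.

```typescript
// One event per topic — no 7-way discriminated union, no partial patches.
interface OrganizationSummarySnapshot {
  tasks: Array<{ taskId: string; title: string }>;
  repos: Array<{ repoId: string; name: string }>;
}

type OrganizationEvent = {
  type: "organizationUpdated"; // wire event name = type field = {topic}Updated
  snapshot: OrganizationSummarySnapshot;
};

// Client side: applyEvent degenerates to "replace cached state" —
// the same logic every other topic already uses.
export function applyEvent(
  _cached: OrganizationSummarySnapshot | null,
  event: OrganizationEvent,
): OrganizationSummarySnapshot {
  return event.snapshot;
}
```

The seven per-subtype merge branches in `topics.ts` disappear; stale-cache bugs become impossible because every event carries complete state.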
+ +### Files to change + +- **`foundry/packages/shared/src/realtime-events.ts`** — replace `OrganizationEvent` union with single event type; rename `TaskEvent.type` from `taskDetailUpdated` → `taskUpdated` +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — update all 7 `c.broadcast("organizationUpdated", { type: "taskSummaryUpdated", ... })` calls to emit single event with full snapshot +- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — same for any broadcasts here +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — rename `taskDetailUpdated` → `taskUpdated` in broadcast calls +- **`foundry/packages/client/src/subscription/topics.ts`** — simplify `applyEvent` for organization topic (no more discriminated union handling); update task event type name +- **`foundry/packages/client/src/subscription/mock-manager.ts`** — update mock event handling +- **`foundry/packages/frontend/`** — update any direct references to event type names + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Each subscription topic must have exactly one event type. The event carries the full replacement snapshot for that topic — no discriminated unions, no partial patches, no subtypes. Event name must match the pattern `{topic}Updated` (e.g. `organizationUpdated`, `taskUpdated`). When state changes, broadcast the full snapshot; the client replaces its cached state." + +--- + +## [x] 15. Unify tasks and pull requests — PRs are just task data + +**Dependencies:** items 9, 13 + +**Rationale:** From the client's perspective, tasks and PRs are the same thing — a branch with work on it. The frontend already merges them into one sorted list, converting PRs to synthetic task objects with `pr:{prId}` IDs. The distinction is artificial. A "task" should represent any branch, and the task actor lazily wraps it. PR metadata is just data the task holds. 
+ +### Current state (separate entities) + +- **Tasks**: stored in task actor SQLite, surfaced via `WorkbenchTaskSummary`, events via `taskSummaryUpdated` +- **PRs**: stored in GitHub data actor (`githubPullRequests` table), surfaced via `WorkbenchOpenPrSummary`, events via `pullRequestUpdated`/`pullRequestRemoved` +- **Frontend hack**: converts PRs to fake task objects with `pr:{prId}` IDs, merges into one list +- **Filtering logic**: org actor silently swallows `pullRequestUpdated` if a task claims the same branch — fragile coupling +- **Two separate types**: `WorkbenchTaskSummary` and `WorkbenchOpenPrSummary` with overlapping fields + +### Target state (unified) + +- **One entity**: a "task" represents a branch. Task actors are lazily created when needed (user creates one, or a PR arrives for an unclaimed branch). +- **PR data lives on the task**: the task actor stores PR metadata (number, title, state, url, isDraft, authorLogin, etc.) as part of its state, not as a separate entity +- **One type**: `WorkbenchTaskSummary` includes full PR fields (nullable). No separate `WorkbenchOpenPrSummary`. +- **One event**: `organizationUpdated` carries task summaries that include PR data. No separate PR events. +- **No synthetic IDs**: every item in the sidebar is a real task with a real taskId + +### Changes needed + +1. **Remove `WorkbenchOpenPrSummary` type** from `packages/shared/src/workbench.ts` — merge its fields into `WorkbenchTaskSummary` +2. **Expand task's `pullRequest` field** from `{ number, status }` to full PR metadata (number, title, state, url, headRefName, baseRefName, isDraft, authorLogin, updatedAtMs) +3. **Remove `openPullRequests` from `OrganizationSummarySnapshot`** — all items are tasks now +4. **Remove PR-specific events** from `realtime-events.ts`: `pullRequestUpdated`, `pullRequestRemoved` +5. **Remove PR-specific actions** from organization actor: `applyOpenPullRequestUpdate`, `removeOpenPullRequest` +6. 
**Remove branch-claiming filter logic** in org actor (the `if task claims branch, skip PR` check) +7. **GitHub data actor PR sync**: when PRs arrive (webhook or sync), create/update a task for that branch lazily via the repository coordinator +8. **Task actor**: store PR metadata in its DB (new columns or table), update when GitHub data pushes changes +9. **Frontend**: remove `toOpenPrTaskModel` conversion, remove `pr:` ID prefix hack, remove separate `openPullRequests` state — sidebar is just tasks +10. **Repository actor**: when a PR arrives for a branch with no task, lazily create a task actor for it (lightweight, no sandbox needed) + +### Implications for coordinator pattern (item 13) + +This reinforces: repo is the coordinator for tasks. When GitHub data detects a new PR for a branch, it tells the repo coordinator, which creates/updates the task. The task holds the PR data and pushes its summary to the repo coordinator. + +### Out of scope: Better Auth routing indexes + +The `authSessionIndex`, `authEmailIndex`, `authAccountIndex`, and `authVerification` tables stay on the org actor. They're routing indexes needed by the Better Auth adapter to resolve user identity before the user actor can be accessed (e.g. session token → userId lookup). Item 2 already covers adding comments that explain this. + +--- + +## [ ] 16. Chunk GitHub data sync and publish progress + +**Rationale:** `runFullSync` in the github-data actor fetches everything at once (all repos, branches, members, PRs), replaces all tables atomically, and has a 5-minute timeout. For large orgs this will time out or lose all data mid-sync (the replace pattern deletes everything first). The sync needs to be chunked, with incremental progress. + +### Current state (broken for large orgs) + +- `runFullSync()` (`github-data/index.ts` lines 486-538): + 1. Fetches ALL repos, branches, members, PRs in 4 sequential calls + 2. `replaceRepositories/Branches/Members/PullRequests` — deletes all rows then inserts all new rows + 3.
Single 5-minute timeout wraps the entire operation + 4. No progress reporting to the client — just "Syncing GitHub data..." → "Synced N repositories" + 5. If it fails mid-sync, data is partially deleted with no recovery + +### Changes needed + +1. **Chunk the sync by repository** — sync repos first (paginated from GitHub API), then for each repo chunk, sync its branches and PRs. Members can be a separate chunk. + +2. **Incremental upsert, not replace** — don't delete-then-insert. Use upsert per row so partial sync doesn't lose data. Mark rows with a sync generation ID; after full sync completes, delete rows from previous generations. + +3. **Run in a loop, not a single step** — each chunk is a separate workflow step with its own timeout. If one chunk fails, previous chunks are persisted. + +4. **Publish progress per chunk** — after each chunk completes: + - Update `github_meta` with progress (e.g. `syncedRepos: 15/42`) + - Push progress to the organization actor + - Organization broadcasts to clients so the UI shows progress (e.g. "Syncing repositories... 15/42") + +5. **Initial sync uses the same chunked approach** — `github-data-initial-sync` step should kick off the chunked loop, not call `runFullSync` directly + +### Files to change + +- **`foundry/packages/backend/src/actors/github-data/index.ts`**: + - Refactor `runFullSync` into chunked loop + - Replace `replaceRepositories/Branches/Members/PullRequests` with upsert + generation sweep + - Add progress metadata to `github_meta` table + - Publish progress to org actor after each chunk +- **`foundry/packages/backend/src/actors/github-data/db/schema.ts`** — add sync generation column to all tables, add progress fields to `github_meta` +- **`foundry/packages/backend/src/actors/organization/actions.ts`** (or `app-shell.ts`) — handle sync progress updates and broadcast to clients +- **`foundry/packages/shared/src/app-shell.ts`** — add sync progress fields to `FoundryGithubState` (e.g. 
`syncProgress: { current: number; total: number } | null`)
+- **`foundry/packages/frontend/`** — show sync progress in UI (e.g. "Syncing repositories... 15/42")
+
+---
+
+# Deferred follow-up outside this task
+
+## [ ] 17. Type all actor context parameters — remove `c: any`
+
+**Rationale:** 272+ instances of `c: any`, `ctx: any`, `loopCtx: any` across all actor code. This eliminates type safety for DB access, state access, broadcasts, and queue operations. All context parameters should use RivetKit's proper context types.
+
+### Scope (by file, approximate count)
+
+| File | `any` contexts |
+|---|---|
+| `organization/app-shell.ts` | ~108 |
+| `organization/actions.ts` | ~56 |
+| `task/workbench.ts` | ~53 |
+| `github-data/index.ts` | ~23 |
+| `repository/actions.ts` | ~22 |
+| `sandbox/index.ts` | ~21 |
+| `handles.ts` | ~19 |
+| `task/workflow/commands.ts` | ~10 |
+| `task/workflow/init.ts` | ~4 |
+| `auth-user/index.ts` | ~2 |
+| `history/index.ts` | ~2 |
+| `task/workflow/index.ts` | ~2 |
+| `task/workflow/common.ts` | ~2 |
+| `task/workflow/push.ts` | ~1 |
+| `polling.ts` | ~1 |
+
+### Changes needed
+
+1. **Determine correct RivetKit context types** — check RivetKit exports for `ActionContext`, `ActorContextOf`, `WorkflowContext`, `LoopContext`, or equivalent. Reference `polling.ts`, which already defines typed contexts (`PollingActorContext`, `WorkflowPollingActorContext`).
+
+2. **Define per-actor context types** — each actor has its own state shape and DB schema, so the context type should be specific (e.g. `ActionContext` or similar).
+
+3. **Replace all `c: any`** with the proper typed context across every file listed above.
+
+4. **Type workflow/loop contexts** — `ctx: any` in workflow functions and `loopCtx: any` in loop callbacks need proper types too.
+
+### CLAUDE.md update
+
+- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "All actor context parameters (`c`, `ctx`, `loopCtx`) must be properly typed using RivetKit's context types. 
Never use `any` for actor contexts. Each actor should define or derive its context type from the actor definition." + +--- + +## [ ] 18. Final pass: remove all dead code + +**Dependencies:** all other items (do this last, after 17) + +**Rationale:** After completing all changes above, many actions, queues, SQLite tables, workflow steps, shared types, and helper functions will be orphaned. Do a full scan to find and remove everything that's dead. + +### Scope + +Scan the entire foundry codebase for: +- **Dead actions** — actions with no callers (client, other actors, services, HTTP endpoints) +- **Dead queues** — queue message types with no senders +- **Dead SQLite tables** — tables with no reads or writes +- **Dead workflow steps** — step names that are no longer referenced +- **Dead shared types** — types in `packages/shared` that are no longer imported +- **Dead helper functions** — private functions with no callers +- **Dead imports** — unused imports across all files + +### When to do this + +After all items 1–17 are complete. Not before — removing code while other items are in progress will create conflicts. + +--- + +## [ ] 19. Remove duplicate data between `c.state` and SQLite + +**Dependencies:** items 21, 24 + +**Rationale:** Several actors store the same data in both `c.state` (RivetKit durable state) and their SQLite tables. Mutable fields that exist in both can silently diverge — `c.state` becomes stale when the SQLite copy is updated. Per the existing CLAUDE.md rule, `c.state` should hold only small scalars/identifiers; anything queryable or mutable belongs in SQLite. + +### Duplicates found + +**Task actor** — `c.state` (`createState` in `task/index.ts` lines 124-139) vs `task`/`taskRuntime` tables: + +| Field | In SQLite? | Mutable? 
| Verdict | +|---|---|---|---| +| `organizationId` | No | No | **KEEP** — identity field | +| `repoId` | No | No | **KEEP** — identity field | +| `taskId` | No | No | **KEEP** — identity field | +| `repoRemote` | No (but org `repos` table has it) | No | **DELETE** — not needed on task, read from repo/org | +| `branchName` | Yes (`task.branch_name`) | Yes | **REMOVE from c.state** — HIGH risk, goes stale on rename | +| `title` | Yes (`task.title`) | Yes | **REMOVE from c.state** — HIGH risk, goes stale on rename | +| `task` (description) | Yes (`task.task`) | No | **REMOVE from c.state** — redundant | +| `sandboxProviderId` | Yes (`task.sandbox_provider_id`) | No | **REMOVE from c.state** — redundant | +| `agentType` | Yes (`task.agent_type`) | Yes | **DELETE entirely** — session-specific (item 21) | +| `explicitTitle` | No | No | **MOVE to SQLite** — creation metadata | +| `explicitBranchName` | No | No | **MOVE to SQLite** — creation metadata | +| `initialPrompt` | No | No | **DELETE entirely** — dead code, session-specific (item 21) | +| `initialized` | No | Yes | **DELETE entirely** — dead code, `status` already tracks init progress | +| `previousStatus` | No | No | **DELETE entirely** — never set, never read | + +**Repository actor** — `c.state` (`createState` in `repository/index.ts`) vs `repoMeta` table: + +| Field | Mutable? | Risk | +|---|---|---| +| `remoteUrl` | No | Low — redundant but safe | + +### Fix + +Remove all duplicated fields from `c.state`. Keep only identity fields needed for actor key resolution (e.g. `organizationId`, `repoId`, `taskId`). Read mutable data from SQLite. + +**Task actor `c.state` should become:** +```typescript +createState: (_c, input) => ({ + organizationId: input.organizationId, + repoId: input.repoId, + taskId: input.taskId, +}) +``` + +Fields already in SQLite (`branchName`, `title`, `task`, `sandboxProviderId`) — remove from `c.state`, read from SQLite only. 
Fields not yet in SQLite (`explicitTitle`, `explicitBranchName`) — add to `task` table, remove from `c.state`. Dead code to delete entirely: `agentType`, `initialPrompt` (item 21), `initialized`, `previousStatus`, `repoRemote`. + +**Repository actor `c.state` should become:** +```typescript +createState: (_c, input) => ({ + organizationId: input.organizationId, + repoId: input.repoId, +}) +``` + +`remoteUrl` is removed from repo actor `c.state` entirely. The repo actor reads `remoteUrl` from its own `repoMeta` SQLite table when needed. The org actor already stores `remoteUrl` in its `repos` table (source of truth from GitHub data). The `getOrCreateRepository()` helper in `handles.ts` currently requires `remoteUrl` as a parameter and passes it as `createWithInput` — this parameter must be removed. Every call site in `organization/actions.ts` and `organization/app-shell.ts` currently does a DB lookup for `remoteUrl` just to pass it to `getOrCreateRepository()` — all of those lookups go away. On actor creation, the repo actor should populate its `repoMeta.remoteUrl` by querying the org actor or github-data actor, not by receiving it as a create input. 
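The read path after this change can be sketched as follows — a minimal stand-in where a `Map` plays the role of the actor's `repoMeta` SQLite table and `getRemoteUrl` is a hypothetical helper (real code would query Drizzle):

```typescript
// Sketch: the repo actor reads remoteUrl from its repoMeta table instead of
// c.state, so updates have a single home and cannot silently diverge.
type RepoMetaRow = { repoId: string; remoteUrl: string };

// Stand-in for the actor's SQLite table.
const repoMeta = new Map<string, RepoMetaRow>();

// c.state keeps only identity fields.
const state = { organizationId: "org_1", repoId: "repo_1" };

function getRemoteUrl(repoId: string): string {
  const row = repoMeta.get(repoId);
  if (!row) throw new Error(`repoMeta missing for ${repoId}`);
  return row.remoteUrl;
}

repoMeta.set("repo_1", { repoId: "repo_1", remoteUrl: "git@github.com:acme/app.git" });
console.log(getRemoteUrl(state.repoId)); // git@github.com:acme/app.git

// A later update touches only SQLite — there is no second copy to go stale.
repoMeta.set("repo_1", { repoId: "repo_1", remoteUrl: "git@github.com:acme/app2.git" });
console.log(getRemoteUrl(state.repoId)); // git@github.com:acme/app2.git
```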
+ +### Files to change + +- **`foundry/packages/backend/src/actors/task/index.ts`** — trim `createState`, update all `c.state.*` reads for removed fields to read from SQLite instead +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — update `c.state.*` reads +- **`foundry/packages/backend/src/actors/task/workflow/*.ts`** — update `c.state.*` reads +- **`foundry/packages/backend/src/actors/repository/index.ts`** — trim `createState`, remove `remoteUrl` from input type +- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update all `c.state.remoteUrl` reads to query `repoMeta` table; remove `persistRemoteUrl()` helper +- **`foundry/packages/backend/src/actors/handles.ts`** — remove `remoteUrl` parameter from `getOrCreateRepository()` +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove all `remoteUrl` lookups done solely to pass to `getOrCreateRepository()` (~10 call sites) +- **`foundry/packages/backend/src/actors/organization/app-shell.ts`** — same cleanup for app-shell call sites + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Never duplicate data between `c.state` and SQLite. `c.state` holds only immutable identity fields needed for actor key resolution (e.g. `organizationId`, `repoId`, `taskId`). All mutable data and anything queryable must live exclusively in SQLite. If a field can change after actor creation, it must not be in `c.state`." + +--- + +## [ ] 20. Prefix all admin/recovery actions with `admin` + +**Rationale:** Several actions are admin-only recovery/rebuild operations but their names don't distinguish them from normal product flows. Prefix with `admin` so it's immediately clear these are not part of regular user flows. 
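The renames are mechanical. A hedged sketch of the convention (the `adminName` helper is hypothetical; the action names are the real targets listed below):

```typescript
// Hypothetical rename map for the github-data actor's recovery actions.
const githubDataRenames: Record<string, string> = {
  fullSync: "adminFullSync",
  reloadOrganization: "adminReloadOrganization",
  reloadAllPullRequests: "adminReloadAllPullRequests",
  clearState: "adminClearState",
};

// Conventional prefixing: capitalize the first letter, prepend "admin".
function adminName(action: string): string {
  return `admin${action[0].toUpperCase()}${action.slice(1)}`;
}

for (const [oldName, newName] of Object.entries(githubDataRenames)) {
  console.log(`${oldName} -> ${newName}`, adminName(oldName) === newName); // all true
}
```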
+ +### Actions to rename + +**Organization actor:** + +| Current name | New name | Why it's admin | +|---|---|---| +| `reconcileWorkbenchState` | `adminReconcileWorkbenchState` | Full fan-out rebuild of task summary projection | +| `reloadGithubOrganization` | `adminReloadGithubOrganization` | Manual trigger to refetch all org GitHub data | +| `reloadGithubPullRequests` | `adminReloadGithubPullRequests` | Manual trigger to refetch all PR data | +| `reloadGithubRepository` | `adminReloadGithubRepository` | Manual trigger to refetch single repo | +| `reloadGithubPullRequest` | `adminReloadGithubPullRequest` | Manual trigger to refetch single PR | + +**GitHub Data actor:** + +| Current name | New name | Why it's admin | +|---|---|---| +| `fullSync` | `adminFullSync` | Full replace of all GitHub data — recovery operation | +| `reloadOrganization` | `adminReloadOrganization` | Triggers full sync manually | +| `reloadAllPullRequests` | `adminReloadAllPullRequests` | Triggers full sync manually | +| `clearState` | `adminClearState` | Deletes all GitHub data — recovery from lost access | + +**NOT renamed** (these are triggered by webhooks/normal flows, not manual admin actions): +- `reloadRepository` — called by push/create/delete webhooks (incremental, normal flow) +- `reloadPullRequest` — called by PR webhooks (incremental, normal flow) +- `handlePullRequestWebhook` — webhook handler (normal flow) +- `syncGithubOrganizations` — called during OAuth callback (normal flow, though also used for repair) + +### Files to change + +- **`foundry/packages/backend/src/actors/github-data/index.ts`** — rename actions +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — rename actions +- **`foundry/packages/client/src/backend-client.ts`** — update method names +- **`foundry/packages/frontend/`** — update any references to renamed actions + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Admin-only actions (recovery, rebuild, 
manual resync, state reset) must be prefixed with `admin` (e.g. `adminReconcileState`, `adminClearState`). This makes it clear they are not part of normal product flows and should not be called from regular client code paths." + +--- + +## [ ] 21. Remove legacy/session-scoped fields from task table + +**Rationale:** The `task` table has fields that either belong on the session, are redundant with data from other actors, or are dead code from the removed local git clone. These should be cleaned up. + +### Fields to remove from `task` table and `c.state` + +**`agentType`** — Legacy from when task = 1 session. Only used for `defaultModelForAgent(c.state.agentType)` to pick the default model when creating a new session. Sessions already have their own `model` column in `taskWorkbenchSessions`. The default model for new sessions should come from user settings (see item 16 — starred model stored in user actor). Remove `agentType` from task table, `c.state`, `createState`, `TaskRecord`, and all `defaultModelForAgent()` call sites. Replace with user settings lookup. + +**`initialPrompt`** — Stored on `c.state` at task creation but **never read anywhere**. Completely dead code. This is also session-specific, not task-specific — the initial prompt belongs on the first session, not the task. Remove from `c.state`, `createState` input type, and `CreateTaskCommand`/`CreateTaskInput` types. Remove from `repository/actions.ts` create flow. + +**`prSubmitted`** — Redundant boolean set when `submitPullRequest` runs. PR state already flows from GitHub webhooks → github-data actor → branch name lookup. This boolean can go stale (PR closed and reopened, PR deleted, etc.). Remove entirely — PR existence is derivable from github-data by branch name (already how `enrichTaskRecord` and `buildTaskSummary` work). + +### Dead fields on `taskRuntime` table + +**`provisionStage`** — Values: `"queued"`, `"ready"`, `"error"`. 
Redundant with `status` — `init_complete` implies ready, `error` implies error. Never read in business logic. Delete. + +**`provisionStageUpdatedAt`** — Timestamp for `provisionStage` changes. Never read anywhere. Delete. + +### Dead fields on `TaskRecord` (in `workflow/common.ts`) + +These are always hardcoded to `null` — remnants of the removed local git clone: + +- `diffStat` — was populated from `branches` table (deleted) +- `hasUnpushed` — was populated from `branches` table (deleted) +- `conflictsWithMain` — was populated from `branches` table (deleted) +- `parentBranch` — was populated from `branches` table (deleted) + +Remove from `TaskRecord` type, `getCurrentRecord()`, and all consumers (contracts, mock client, tests, frontend). + +### Files to change + +- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `agentType` and `prSubmitted` columns from `task` table; remove `provisionStage` and `provisionStageUpdatedAt` from `taskRuntime` table +- **`foundry/packages/backend/src/actors/task/index.ts`** — remove `agentType`, `initialPrompt`, `initialized`, `previousStatus`, `repoRemote` from `createState` and input type +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove `defaultModelForAgent()`, `agentTypeForModel()`, update session creation to use user settings for default model; remove `prSubmitted` set in `submitPullRequest` +- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `agentType`, `prSubmitted`, `diffStat`, `hasUnpushed`, `conflictsWithMain`, `parentBranch` from `getCurrentRecord()` and `TaskRecord` construction +- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `agentType` from task row inserts +- **`foundry/packages/shared/src/contracts.ts`** — remove `agentType`, `prSubmitted`, `diffStat`, `prUrl`, `hasUnpushed`, `conflictsWithMain`, `parentBranch` from `TaskRecord` schema (note: `prUrl` and `prAuthor` should stay if still populated by `enrichTaskRecord`, or 
move to the unified task/PR model from item 15) +- **`foundry/packages/client/src/mock/backend-client.ts`** — update mock to remove dead fields +- **`foundry/packages/client/test/view-model.test.ts`** — update test fixtures +- **`foundry/packages/frontend/src/features/tasks/model.test.ts`** — update test fixtures +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — remove any references to `agentType` in task creation input +- **`foundry/packages/backend/src/actors/repository/actions.ts`** — update `enrichTaskRecord()` to stop setting dead fields + +--- + +## [ ] 22. Move per-user UI state from task actor to user actor + +**Dependencies:** item 1 + +**Rationale:** The task actor stores UI-facing state that is user-specific, not task-global. With multiplayer (multiple users viewing the same task), this breaks — each user has their own active session, their own unread state, their own drafts. These must live on the user actor, keyed by `(taskId, sessionId)`, not on the shared task actor. + +### Per-user state currently on the task actor (wrong) + +**`taskRuntime.activeSessionId`** — Which session the user is "looking at." Used to: +- Determine which session's status drives the task-level status (running/idle) — this is wrong, the task status should reflect ALL sessions, not one user's active tab +- Return a "current" session in `attachTask` responses — this is per-user +- Migration path for legacy single-session tasks in `ensureWorkbenchSeeded` + +This should move to the user actor as `activeSessionId` per `(userId, taskId)`. + +**`taskWorkbenchSessions.unread`** — Per-user unread state stored globally on the session. If user A reads a session, user B's unread state is also cleared. Move to user actor keyed by `(userId, taskId, sessionId)`. + +**`taskWorkbenchSessions.draftText` / `draftAttachmentsJson` / `draftUpdatedAt`** — Per-user draft state stored globally. If user A starts typing a draft, it overwrites user B's draft. 
Move to user actor keyed by `(userId, taskId, sessionId)`.
+
+### What stays on the task actor (correct — task-global state)
+
+- `taskRuntime.activeSandboxId` — which sandbox is running (global to the task)
+- `taskRuntime.activeSwitchTarget` / `activeCwd` — sandbox connection state (global)
+- `taskRuntime.statusMessage` — provisioning/runtime status (global)
+- `taskWorkbenchSessions.model` — which model the session uses (global)
+- `taskWorkbenchSessions.status` — session runtime status (global)
+- `taskWorkbenchSessions.transcriptJson` — session transcript (global)
+
+### Fix
+
+Add per-task and per-session state tables to the user actor. The active session is keyed by task, while unread/draft state is keyed by `(taskId, sessionId)`, so they need separate tables:
+
+```typescript
+export const userTaskState = sqliteTable("user_task_state", {
+  taskId: text("task_id").notNull().primaryKey(),
+  activeSessionId: text("active_session_id"), // per-user active tab
+  updatedAt: integer("updated_at").notNull(),
+});
+
+export const userSessionState = sqliteTable("user_session_state", {
+  taskId: text("task_id").notNull(),
+  sessionId: text("session_id").notNull(),
+  unread: integer("unread").notNull().default(0),
+  draftText: text("draft_text").notNull().default(""),
+  draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"),
+  draftUpdatedAt: integer("draft_updated_at"),
+  updatedAt: integer("updated_at").notNull(),
+}, (table) => ({
+  pk: primaryKey(table.taskId, table.sessionId),
+}));
+```
+
+Remove `activeSessionId` from `taskRuntime`. Remove `unread`, `draftText`, `draftAttachmentsJson`, `draftUpdatedAt` from `taskWorkbenchSessions`.
+
+The task-level status should be derived from ALL sessions (e.g., the task is "running" if ANY session is running), not from one user's `activeSessionId`. 
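That derivation can be sketched as follows (the precedence order — running over error over idle — is an assumption to adjust to the real status enum):

```typescript
// Minimal sketch (hypothetical types): task-level status aggregates all
// session statuses and is independent of any user's active tab.
type SessionStatus = "running" | "idle" | "error";

function deriveTaskStatus(sessions: SessionStatus[]): SessionStatus {
  if (sessions.some((s) => s === "running")) return "running";
  if (sessions.some((s) => s === "error")) return "error";
  return "idle"; // no sessions, or all idle
}

console.log(deriveTaskStatus(["idle", "running", "idle"])); // running
console.log(deriveTaskStatus([])); // idle
```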
+ +### Files to change + +- **`foundry/packages/backend/src/actors/auth-user/db/schema.ts`** — add `userTaskState` table +- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `activeSessionId` from `taskRuntime`; remove `unread`, `draftText`, `draftAttachmentsJson`, `draftUpdatedAt` from `taskWorkbenchSessions` +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove all `activeSessionId` reads/writes; remove draft/unread mutation functions; task status derivation should check all sessions +- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `activeSessionId` from `getCurrentRecord()` +- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — remove `activeSessionId` references in `attachTask` +- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `activeSessionId` initialization +- **`foundry/packages/client/`** — draft/unread/activeSession operations route to user actor instead of task actor +- **`foundry/packages/frontend/`** — update subscription to fetch per-user state from user actor + +### CLAUDE.md update + +- **`foundry/packages/backend/CLAUDE.md`** — add constraint: "Per-user UI state (active session tab, unread counts, draft text, draft attachments) must live on the user actor, not on shared task/session actors. Task actors hold only task-global state visible to all users. This is critical for multiplayer correctness — multiple users may view the same task simultaneously with different active sessions, unread states, and in-progress drafts." + +--- + +## [ ] 23. Delete `getTaskEnriched` and `enrichTaskRecord` (dead code) + +**Rationale:** `getTaskEnriched` is dead code with zero callers from the client. It's also the worst fan-out pattern in the codebase: org → repo actor → task actor (`.get()`) → github-data actor (`listPullRequestsForRepository` fetches ALL PRs, then `.find()`s by branch name). 
This is exactly the pattern the coordinator model eliminates — task detail comes from `getTaskDetail` on the task actor, sidebar data comes from materialized `taskSummaries` on the org actor. + +### What to delete + +- **`enrichTaskRecord()`** — `repository/actions.ts:117-143`. Fetches all PRs for a repo to find one by branch name. Dead code. +- **`getTaskEnriched` action** — `repository/actions.ts:432-450`. Only caller of `enrichTaskRecord`. Dead code. +- **`getTaskEnriched` org proxy** — `organization/actions.ts:838-849`. Only caller of the repo action. Dead code. +- **`GetTaskEnrichedCommand` type** — wherever defined. + +### Files to change + +- **`foundry/packages/backend/src/actors/repository/actions.ts`** — delete `enrichTaskRecord()` and `getTaskEnriched` action +- **`foundry/packages/backend/src/actors/organization/actions.ts`** — delete `getTaskEnriched` proxy action + +--- + +## [ ] 24. Clean up task status tracking + +**Dependencies:** item 21 + +**Rationale:** Task status tracking is spread across `c.state`, the `task` SQLite table, and the `taskRuntime` table with redundant and dead fields. Consolidate to a single `status` enum on the `task` table. Remove `statusMessage` — human-readable status text should be derived on the client from the `status` enum, not stored on the backend. + +### Fields to delete + +| Field | Location | Why | +|---|---|---| +| `initialized` | `c.state` | Dead code — never read. `status` already tracks init progress. | +| `previousStatus` | `c.state` | Dead code — never set, never read. | +| `statusMessage` | `taskRuntime` table | Client concern — the client should derive display text from the `status` enum. The backend should not store UI copy. | +| `provisionStage` | `taskRuntime` table | Redundant — `status` already encodes provision progress (`init_bootstrap_db` → `init_enqueue_provision` → `init_complete`). | +| `provisionStageUpdatedAt` | `taskRuntime` table | Dead — never read. 
| + +### What remains + +- **`status`** on the `task` table — the single canonical state machine enum. Values: `init_bootstrap_db`, `init_enqueue_provision`, `init_complete`, `running`, `idle`, `error`, `archive_*`, `kill_*`, `archived`, `killed`. + +### Files to change + +- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — remove `statusMessage`, `provisionStage`, `provisionStageUpdatedAt` from `taskRuntime` table +- **`foundry/packages/backend/src/actors/task/index.ts`** — remove `initialized`, `previousStatus` from `createState` +- **`foundry/packages/backend/src/actors/task/workflow/common.ts`** — remove `statusMessage` parameter from `setTaskState()`, remove it from `getCurrentRecord()` query +- **`foundry/packages/backend/src/actors/task/workflow/init.ts`** — remove `statusMessage`, `provisionStage`, `provisionStageUpdatedAt` from taskRuntime inserts/updates; remove `ensureTaskRuntimeCacheColumns()` raw ALTER TABLE for these columns +- **`foundry/packages/backend/src/actors/task/workflow/commands.ts`** — remove `statusMessage` from handler updates +- **`foundry/packages/backend/src/actors/task/workflow/push.ts`** — remove `statusMessage` updates +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — remove `statusMessage` from `buildTaskDetail()`, remove `ensureTaskRuntimeCacheColumns()` for these columns +- **`foundry/packages/shared/src/workbench.ts`** — remove `statusMessage` from `WorkbenchTaskDetail` +- **`foundry/packages/frontend/`** — derive display text from `status` enum instead of reading `statusMessage` + +--- + +## [ ] 25. Remove "Workbench" prefix from all types, functions, files, and tables + +**Rationale:** "Workbench" is not a real concept in the system. It's a namespace prefix applied to every type, function, file, and table name. The actual entities are Task, Session, Repository, Sandbox, Transcript, Draft, etc. — "Workbench" adds zero information and obscures what things actually are. 
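Because short names are substrings of longer ones, the find-and-replace must apply the longest names first — e.g. `WorkbenchSessionSummary` must be rewritten before `WorkbenchSession`, or it would be mangled into `TaskSessionSummary`. A sketch (hypothetical codemod helper; the map shows only a few entries from the table below):

```typescript
// Apply renames longest-key-first so overlapping identifiers are not mangled.
const renames: Record<string, string> = {
  WorkbenchSessionSummary: "SessionSummary",
  WorkbenchSession: "TaskSession",
  WorkbenchTaskDetail: "TaskDetail",
};

function applyRenames(source: string): string {
  const keys = Object.keys(renames).sort((a, b) => b.length - a.length);
  let out = source;
  for (const key of keys) {
    out = out.replace(new RegExp(`\\b${key}\\b`, "g"), renames[key]);
  }
  return out;
}

console.log(applyRenames("const s: WorkbenchSessionSummary = toSummary(ws as WorkbenchSession);"));
// const s: SessionSummary = toSummary(ws as TaskSession);
```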
+ +### Rename strategy + +Drop "Workbench" everywhere. If the result collides with an existing name (e.g., auth `Session`), use the domain prefix (e.g., `TaskSession` vs auth `Session`). + +### Type renames (`shared/src/workbench.ts`) + +| Before | After | +|---|---| +| `WorkbenchTaskStatus` | `TaskStatus` (already exists as base, merge) | +| `WorkbenchAgentKind` | `AgentKind` | +| `WorkbenchModelId` | `ModelId` | +| `WorkbenchSessionStatus` | `SessionStatus` | +| `WorkbenchTranscriptEvent` | `TranscriptEvent` | +| `WorkbenchComposerDraft` | `ComposerDraft` | +| `WorkbenchSessionSummary` | `SessionSummary` | +| `WorkbenchSessionDetail` | `SessionDetail` | +| `WorkbenchFileChange` | `FileChange` | +| `WorkbenchFileTreeNode` | `FileTreeNode` | +| `WorkbenchLineAttachment` | `LineAttachment` | +| `WorkbenchHistoryEvent` | `HistoryEvent` | +| `WorkbenchDiffLineKind` | `DiffLineKind` | +| `WorkbenchParsedDiffLine` | `ParsedDiffLine` | +| `WorkbenchPullRequestSummary` | `PullRequestSummary` | +| `WorkbenchOpenPrSummary` | `OpenPrSummary` | +| `WorkbenchSandboxSummary` | `SandboxSummary` | +| `WorkbenchTaskSummary` | `TaskSummary` | +| `WorkbenchTaskDetail` | `TaskDetail` | +| `WorkbenchRepositorySummary` | `RepositorySummary` | +| `WorkbenchSession` | `TaskSession` (avoids auth `Session` collision) | +| `WorkbenchTask` | `TaskSnapshot` (avoids `task` table collision) | +| `WorkbenchRepo` | `RepoSnapshot` | +| `WorkbenchRepositorySection` | `RepositorySection` | +| `TaskWorkbenchSnapshot` | `DashboardSnapshot` | +| `WorkbenchModelOption` | `ModelOption` | +| `WorkbenchModelGroup` | `ModelGroup` | +| `TaskWorkbenchSelectInput` | `SelectTaskInput` | +| `TaskWorkbenchCreateTaskInput` | `CreateTaskInput` | +| `TaskWorkbenchRenameInput` | `RenameTaskInput` | +| `TaskWorkbenchSendMessageInput` | `SendMessageInput` | +| `TaskWorkbenchSessionInput` | `SessionInput` | +| `TaskWorkbenchRenameSessionInput` | `RenameSessionInput` | +| `TaskWorkbenchChangeModelInput` | 
`ChangeModelInput` | +| `TaskWorkbenchUpdateDraftInput` | `UpdateDraftInput` | +| `TaskWorkbenchSetSessionUnreadInput` | `SetSessionUnreadInput` | +| `TaskWorkbenchDiffInput` | `DiffInput` | +| `TaskWorkbenchCreateTaskResponse` | `CreateTaskResponse` | +| `TaskWorkbenchAddSessionResponse` | `AddSessionResponse` | + +### File renames + +| Before | After | +|---|---| +| `shared/src/workbench.ts` | `shared/src/types.ts` (or split into `task.ts`, `session.ts`, etc.) | +| `backend/src/actors/task/workbench.ts` | `backend/src/actors/task/sessions.ts` (already planned in item 7) | +| `client/src/workbench-client.ts` | `client/src/task-client.ts` | +| `client/src/workbench-model.ts` | `client/src/model.ts` | +| `client/src/remote/workbench-client.ts` | `client/src/remote/task-client.ts` | +| `client/src/mock/workbench-client.ts` | `client/src/mock/task-client.ts` | + +### Table rename + +| Before | After | +|---|---| +| `task_workbench_sessions` | `task_sessions` | + +### Function renames (backend — drop "Workbench" infix) + +All functions in `backend/src/actors/task/workbench.ts`: +- `createWorkbenchSession` → `createSession` +- `closeWorkbenchSession` → `closeSession` +- `changeWorkbenchModel` → `changeModel` +- `sendWorkbenchMessage` → `sendMessage` +- `stopWorkbenchSession` → `stopSession` +- `renameWorkbenchBranch` → deleted (see item 26) +- `renameWorkbenchTask` → `renameTask` +- `renameWorkbenchSession` → `renameSession` +- `revertWorkbenchFile` → `revertFile` +- `publishWorkbenchPr` → `publishPr` +- `updateWorkbenchDraft` → `updateDraft` +- `setWorkbenchSessionUnread` → `setSessionUnread` +- `markWorkbenchUnread` → `markUnread` +- `syncWorkbenchSessionStatus` → `syncSessionStatus` +- `ensureWorkbenchSeeded` → `ensureSessionSeeded` + +### Queue/command type renames (backend) + +- `TaskWorkbenchValueCommand` → `TaskValueCommand` +- `TaskWorkbenchSessionTitleCommand` → `SessionTitleCommand` +- `TaskWorkbenchSessionUnreadCommand` → `SessionUnreadCommand` + +### Scope + 
+~420 occurrences across shared (35+ types), backend (200+ refs), client (324 refs), frontend (96 refs). Mechanical find-and-replace once the rename map is settled. + +### Files to change + +- **`foundry/packages/shared/src/workbench.ts`** — rename file, rename all exported types +- **`foundry/packages/shared/src/index.ts`** — update re-export path +- **`foundry/packages/shared/src/app-shell.ts`** — update `WorkbenchModelId` → `ModelId` import +- **`foundry/packages/shared/src/realtime-events.ts`** — update all `Workbench*` type imports +- **`foundry/packages/backend/src/actors/task/workbench.ts`** — rename file + all functions +- **`foundry/packages/backend/src/actors/task/index.ts`** — update imports and action registrations +- **`foundry/packages/backend/src/actors/task/db/schema.ts`** — rename `taskWorkbenchSessions` → `taskSessions` +- **`foundry/packages/backend/src/actors/task/workflow/`** — update all workbench references +- **`foundry/packages/backend/src/actors/organization/`** — update type imports and action names +- **`foundry/packages/backend/src/actors/repository/`** — update type imports +- **`foundry/packages/client/src/`** — rename files + update all type/function references +- **`foundry/packages/frontend/src/`** — update all type imports + +### CLAUDE.md update + +Update `foundry/packages/backend/CLAUDE.md` coordinator hierarchy diagram: `taskWorkbenchSessions` → `taskSessions`. + +--- + +## [ ] 26. Delete branch rename (branches immutable after creation) + +**Dependencies:** item 25 + +**Rationale:** Branch name is assigned once at task creation and never changes. Branch rename is unused in the frontend UI and SDK, adds ~80 lines of code, and creates a transactional consistency risk (git rename succeeds but index update fails). 
+ +### Delete + +- **`task/workbench.ts`** — delete `renameWorkbenchBranch()` (~50 lines) +- **`task/index.ts`** — delete `renameWorkbenchBranch` action +- **`task/workflow/queue.ts`** — remove `"task.command.workbench.rename_branch"` queue type +- **`task/workflow/index.ts`** — remove `"task.command.workbench.rename_branch"` handler +- **`organization/actions.ts`** — delete `renameWorkbenchBranch` proxy action +- **`repository/actions.ts`** — delete `registerTaskBranch` action (only caller was rename flow) +- **`client/src/workbench-client.ts`** — remove `renameBranch` from interface +- **`client/src/remote/workbench-client.ts`** — delete `renameBranch()` method +- **`client/src/mock/workbench-client.ts`** — delete `renameBranch()` method +- **`client/src/backend-client.ts`** — delete `renameWorkbenchBranch` from interface + implementation +- **`client/src/mock/backend-client.ts`** — delete `renameWorkbenchBranch` implementation +- **`frontend/src/components/mock-layout.tsx`** — remove `renameBranch` from client interface, delete `onRenameBranch` callbacks and all `renameBranch` wiring (~8 refs) +- **`shared/src/workbench.ts`** — delete `TaskWorkbenchRenameInput` (if only used by branch rename; check if task title rename shares it) + +### Keep + +- `deriveFallbackTitle()` + `sanitizeBranchName()` + `resolveCreateFlowDecision()` — initial branch derivation at creation +- `registerTaskBranchMutation()` — used during task creation for `onBranch` path +- `renameWorkbenchTask()` — title rename is independent, stays +- `taskIndex` table — still the coordinator index for branch→task mapping + +--- + +## [ ] Final audit pass (run after all items above are complete) + +### Dead code scan + +Already tracked in item 18: once all changes are complete, do a full scan to find dead actions, queues, SQLite tables, and workflow steps that need to be removed. 
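A hedged sketch of the dead-action part of that scan (the declaration/call regexes are assumptions about code shape; a real scan should tolerate other invocation styles):

```typescript
// Collect declared action names and flag those never invoked as .name(...)
// anywhere in the caller sources.
function deadActions(declSrc: string, callerSrc: string): string[] {
  const declared = [...declSrc.matchAll(/^\s*(\w+):\s*\(/gm)].map((m) => m[1]);
  return declared.filter((name) => !callerSrc.includes(`.${name}(`));
}

// Hypothetical inputs standing in for actor and client source files.
const actions = `
  getTaskDetail: (c, input) => { /* ... */ },
  getTaskEnriched: (c, input) => { /* ... */ },
`;
const callers = `await taskHandle.getTaskDetail({ taskId });`;
console.log(deadActions(actions, callers)); // [ "getTaskEnriched" ]
```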
+ +### Dead events audit + +Scan all event types emitted by actors (in `packages/shared/src/realtime-events.ts` and anywhere actors call `c.broadcast()` or similar). Cross-reference against all client subscribers (in `packages/client/` and `packages/frontend/`). Remove any events that are emitted but never subscribed to by any client. This includes events that may have been superseded by the consolidated single-topic-per-actor pattern (item 14). diff --git a/foundry/compose.dev.yaml b/foundry/compose.dev.yaml index c57d971..7fa492d 100644 --- a/foundry/compose.dev.yaml +++ b/foundry/compose.dev.yaml @@ -44,6 +44,7 @@ services: STRIPE_WEBHOOK_SECRET: "${STRIPE_WEBHOOK_SECRET:-}" STRIPE_PRICE_TEAM: "${STRIPE_PRICE_TEAM:-}" FOUNDRY_SANDBOX_PROVIDER: "${FOUNDRY_SANDBOX_PROVIDER:-local}" + HF_LOCAL_SANDBOX_IMAGE: "${HF_LOCAL_SANDBOX_IMAGE:-rivetdev/sandbox-agent:foundry-base-latest}" E2B_API_KEY: "${E2B_API_KEY:-}" E2B_TEMPLATE: "${E2B_TEMPLATE:-}" HF_E2B_TEMPLATE: "${HF_E2B_TEMPLATE:-${E2B_TEMPLATE:-}}" @@ -56,8 +57,6 @@ services: - "7741:7741" volumes: - "..:/app" - # The linked RivetKit checkout resolves from Foundry packages to /task/rivet-checkout in-container. - - "../../../task/rivet-checkout:/task/rivet-checkout:ro" # Reuse the host Codex auth profile for local sandbox-agent Codex sessions in dev. - "${HOME}/.codex:/root/.codex" - "/var/run/docker.sock:/var/run/docker.sock" @@ -86,7 +85,6 @@ services: - "..:/app" # Ensure logs in .foundry/ persist on the host even if we change source mounts later. - "./.foundry:/app/foundry/.foundry" - - "../../../task/rivet-checkout:/task/rivet-checkout:ro" # Use Linux-native repo dependencies inside the container instead of host node_modules. 
- "foundry_node_modules:/app/node_modules" - "foundry_client_node_modules:/app/foundry/packages/client/node_modules" diff --git a/foundry/compose.mock.yaml b/foundry/compose.mock.yaml index c4a06ff..6c57875 100644 --- a/foundry/compose.mock.yaml +++ b/foundry/compose.mock.yaml @@ -15,7 +15,6 @@ services: volumes: - "..:/app" - "./.foundry:/app/foundry/.foundry" - - "../../../task/rivet-checkout:/task/rivet-checkout:ro" - "mock_node_modules:/app/node_modules" - "mock_client_node_modules:/app/foundry/packages/client/node_modules" - "mock_frontend_node_modules:/app/foundry/packages/frontend/node_modules" diff --git a/foundry/docker/backend.Dockerfile b/foundry/docker/backend.Dockerfile index 3dc1c7d..ae14ddf 100644 --- a/foundry/docker/backend.Dockerfile +++ b/foundry/docker/backend.Dockerfile @@ -19,6 +19,7 @@ RUN pnpm --filter @sandbox-agent/foundry-backend deploy --prod /out FROM oven/bun:1.2 AS runtime ENV NODE_ENV=production ENV HOME=/home/task +ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version WORKDIR /app RUN apt-get update \ && apt-get install -y --no-install-recommends \ @@ -31,6 +32,8 @@ RUN addgroup --system --gid 1001 task \ && adduser --system --uid 1001 --home /home/task --ingroup task task \ && mkdir -p /home/task \ && chown -R task:task /home/task /app +RUN mkdir -p /etc/foundry \ + && date +%s > /etc/foundry/rivet-runner-version COPY --from=build /out ./ USER task EXPOSE 7741 diff --git a/foundry/docker/backend.dev.Dockerfile b/foundry/docker/backend.dev.Dockerfile index 46177c3..c4b6c3a 100644 --- a/foundry/docker/backend.dev.Dockerfile +++ b/foundry/docker/backend.dev.Dockerfile @@ -21,6 +21,9 @@ RUN curl -fsSL "https://releases.rivet.dev/sandbox-agent/${SANDBOX_AGENT_VERSION ENV PATH="/root/.local/bin:${PATH}" ENV SANDBOX_AGENT_BIN="/root/.local/bin/sandbox-agent" +ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version +RUN mkdir -p /etc/foundry \ + && date +%s > /etc/foundry/rivet-runner-version WORKDIR /app diff --git 
a/foundry/docker/backend.preview.Dockerfile b/foundry/docker/backend.preview.Dockerfile index 00774f2..91cd7c7 100644 --- a/foundry/docker/backend.preview.Dockerfile +++ b/foundry/docker/backend.preview.Dockerfile @@ -20,11 +20,13 @@ RUN curl -fsSL "https://releases.rivet.dev/sandbox-agent/${SANDBOX_AGENT_VERSION ENV PATH="/root/.local/bin:${PATH}" ENV SANDBOX_AGENT_BIN="/root/.local/bin/sandbox-agent" +ENV RIVET_RUNNER_VERSION_FILE=/etc/foundry/rivet-runner-version +RUN mkdir -p /etc/foundry \ + && date +%s > /etc/foundry/rivet-runner-version WORKDIR /workspace/quebec COPY quebec /workspace/quebec -COPY rivet-checkout /workspace/rivet-checkout RUN pnpm install --frozen-lockfile RUN pnpm --filter @sandbox-agent/foundry-shared build diff --git a/foundry/docker/foundry-base.Dockerfile b/foundry/docker/foundry-base.Dockerfile new file mode 100644 index 0000000..b4b9e26 --- /dev/null +++ b/foundry/docker/foundry-base.Dockerfile @@ -0,0 +1,190 @@ +# syntax=docker/dockerfile:1.10.0 +# +# Foundry base sandbox image. +# +# Builds sandbox-agent from source (reusing the upstream Dockerfile.full build +# stages) and layers Foundry-specific tooling on top: sudo, git, neovim, gh, +# node, bun, chromium, and agent-browser. +# +# Build: +# docker build --platform linux/amd64 \ +# -f foundry/docker/foundry-base.Dockerfile \ +# -t rivetdev/sandbox-agent:foundry-base-<tag> . +# +# Must be invoked from the repository root so the `COPY . .` picks up the full +# source tree for the Rust + inspector build stages.
+ +# ============================================================================ +# Build inspector frontend +# ============================================================================ +FROM --platform=linux/amd64 node:22-alpine AS inspector-build +WORKDIR /app +RUN npm install -g pnpm + +COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./ +COPY frontend/packages/inspector/package.json ./frontend/packages/inspector/ +COPY sdks/cli-shared/package.json ./sdks/cli-shared/ +COPY sdks/acp-http-client/package.json ./sdks/acp-http-client/ +COPY sdks/react/package.json ./sdks/react/ +COPY sdks/typescript/package.json ./sdks/typescript/ + +RUN pnpm install --filter @sandbox-agent/inspector... + +COPY docs/openapi.json ./docs/ +COPY sdks/cli-shared ./sdks/cli-shared +COPY sdks/acp-http-client ./sdks/acp-http-client +COPY sdks/react ./sdks/react +COPY sdks/typescript ./sdks/typescript + +RUN cd sdks/cli-shared && pnpm exec tsup +RUN cd sdks/acp-http-client && pnpm exec tsup +RUN cd sdks/typescript && SKIP_OPENAPI_GEN=1 pnpm exec tsup +RUN cd sdks/react && pnpm exec tsup + +COPY frontend/packages/inspector ./frontend/packages/inspector +RUN cd frontend/packages/inspector && pnpm exec vite build + +# ============================================================================ +# AMD64 Builder - sandbox-agent static binary +# ============================================================================ +FROM --platform=linux/amd64 rust:1.88.0 AS builder + +ENV DEBIAN_FRONTEND=noninteractive + +RUN apt-get update && apt-get install -y \ + musl-tools \ + musl-dev \ + llvm-14-dev \ + libclang-14-dev \ + clang-14 \ + libssl-dev \ + pkg-config \ + ca-certificates \ + g++ \ + g++-multilib \ + git \ + curl \ + wget && \ + rm -rf /var/lib/apt/lists/* + +RUN wget -q https://github.com/cross-tools/musl-cross/releases/latest/download/x86_64-unknown-linux-musl.tar.xz && \ + tar -xf x86_64-unknown-linux-musl.tar.xz -C /opt/ && \ + rm x86_64-unknown-linux-musl.tar.xz && \ + rustup 
target add x86_64-unknown-linux-musl + +ENV PATH="/opt/x86_64-unknown-linux-musl/bin:$PATH" \ + LIBCLANG_PATH=/usr/lib/llvm-14/lib \ + CLANG_PATH=/usr/bin/clang-14 \ + CC_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-gcc \ + CXX_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-g++ \ + AR_x86_64_unknown_linux_musl=x86_64-unknown-linux-musl-ar \ + CARGO_TARGET_X86_64_UNKNOWN_LINUX_MUSL_LINKER=x86_64-unknown-linux-musl-gcc \ + CARGO_INCREMENTAL=0 \ + CARGO_NET_GIT_FETCH_WITH_CLI=true + +ENV SSL_VER=1.1.1w +RUN wget https://www.openssl.org/source/openssl-$SSL_VER.tar.gz && \ + tar -xzf openssl-$SSL_VER.tar.gz && \ + cd openssl-$SSL_VER && \ + ./Configure no-shared no-async --prefix=/musl --openssldir=/musl/ssl linux-x86_64 && \ + make -j$(nproc) && \ + make install_sw && \ + cd .. && \ + rm -rf openssl-$SSL_VER* + +ENV OPENSSL_DIR=/musl \ + OPENSSL_INCLUDE_DIR=/musl/include \ + OPENSSL_LIB_DIR=/musl/lib \ + PKG_CONFIG_ALLOW_CROSS=1 \ + RUSTFLAGS="-C target-feature=+crt-static -C link-arg=-static-libgcc" + +WORKDIR /build +COPY . . 
+ +COPY --from=inspector-build /app/frontend/packages/inspector/dist ./frontend/packages/inspector/dist + +RUN --mount=type=cache,target=/usr/local/cargo/registry \ + --mount=type=cache,target=/usr/local/cargo/git \ + --mount=type=cache,target=/build/target \ + cargo build -p sandbox-agent --release --target x86_64-unknown-linux-musl -j4 && \ + cp target/x86_64-unknown-linux-musl/release/sandbox-agent /sandbox-agent + +# ============================================================================ +# Runtime - Foundry base sandbox image +# ============================================================================ +FROM --platform=linux/amd64 node:22-bookworm-slim + +ENV DEBIAN_FRONTEND=noninteractive + +# --- System packages -------------------------------------------------------- +RUN apt-get update && apt-get install -y --no-install-recommends \ + bash \ + ca-certificates \ + curl \ + git \ + gnupg \ + neovim \ + sudo \ + unzip \ + wget \ + # Chromium and its runtime deps + chromium \ + fonts-liberation \ + libasound2 \ + libatk-bridge2.0-0 \ + libatk1.0-0 \ + libcups2 \ + libdbus-1-3 \ + libdrm2 \ + libgbm1 \ + libgtk-3-0 \ + libnspr4 \ + libnss3 \ + libx11-xcb1 \ + libxcomposite1 \ + libxdamage1 \ + libxrandr2 \ + xdg-utils \ + && rm -rf /var/lib/apt/lists/* + +# --- GitHub CLI (gh) ------------------------------------------------------- +RUN curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg \ + | dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg \ + && chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg \ + && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" \ + > /etc/apt/sources.list.d/github-cli.list \ + && apt-get update && apt-get install -y gh \ + && rm -rf /var/lib/apt/lists/* + +# --- Bun -------------------------------------------------------------------- +RUN curl -fsSL https://bun.sh/install | bash \ + && mv 
/root/.bun/bin/bun /usr/local/bin/bun \ + && ln -sf /usr/local/bin/bun /usr/local/bin/bunx \ + && rm -rf /root/.bun + +# --- sandbox-agent binary (from local build) -------------------------------- +COPY --from=builder /sandbox-agent /usr/local/bin/sandbox-agent +RUN chmod +x /usr/local/bin/sandbox-agent + +# --- sandbox user with passwordless sudo ------------------------------------ +RUN useradd -m -s /bin/bash sandbox \ + && echo "sandbox ALL=(ALL) NOPASSWD:ALL" > /etc/sudoers.d/sandbox \ + && chmod 0440 /etc/sudoers.d/sandbox + +USER sandbox +WORKDIR /home/sandbox + +# Point Chromium/Playwright at the system binary +ENV CHROME_PATH=/usr/bin/chromium +ENV CHROMIUM_PATH=/usr/bin/chromium +ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium +ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true + +# --- Install all sandbox-agent agents + agent-browser ----------------------- +RUN sandbox-agent install-agent --all +RUN sudo npm install -g agent-browser + +EXPOSE 2468 + +ENTRYPOINT ["sandbox-agent"] +CMD ["server", "--host", "0.0.0.0", "--port", "2468"] diff --git a/foundry/docker/frontend.dev.Dockerfile b/foundry/docker/frontend.dev.Dockerfile index 3b0d8e4..dd74dd0 100644 --- a/foundry/docker/frontend.dev.Dockerfile +++ b/foundry/docker/frontend.dev.Dockerfile @@ -8,4 +8,4 @@ RUN npm install -g pnpm@10.28.2 WORKDIR /app -CMD ["bash", "-lc", "pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-frontend... && cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"] +CMD ["bash", "-lc", "pnpm install --frozen-lockfile --filter @sandbox-agent/foundry-frontend... 
&& cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"] diff --git a/foundry/docker/frontend.preview.Dockerfile b/foundry/docker/frontend.preview.Dockerfile index 05cbba7..dd10422 100644 --- a/foundry/docker/frontend.preview.Dockerfile +++ b/foundry/docker/frontend.preview.Dockerfile @@ -7,7 +7,6 @@ RUN npm install -g pnpm@10.28.2 WORKDIR /workspace/quebec COPY quebec /workspace/quebec -COPY rivet-checkout /workspace/rivet-checkout RUN pnpm install --frozen-lockfile RUN pnpm --filter @sandbox-agent/foundry-shared build diff --git a/foundry/packages/backend/CLAUDE.md b/foundry/packages/backend/CLAUDE.md index 432bc85..f7e054d 100644 --- a/foundry/packages/backend/CLAUDE.md +++ b/foundry/packages/backend/CLAUDE.md @@ -5,14 +5,12 @@ Keep the backend actor tree aligned with this shape unless we explicitly decide to change it: ```text -OrganizationActor -├─ HistoryActor(organization-scoped global feed) +OrganizationActor (direct coordinator for tasks) +├─ AuditLogActor (organization-scoped global feed) ├─ GithubDataActor -├─ RepositoryActor(repo) -│ └─ TaskActor(task) -│ ├─ TaskSessionActor(session) × N -│ │ └─ SessionStatusSyncActor(session) × 0..1 -│ └─ Task-local workbench state +├─ TaskActor(task) +│ ├─ taskSessions → session metadata/transcripts +│ └─ taskSandboxes → sandbox instance index └─ SandboxInstanceActor(sandboxProviderId, sandboxId) × N ``` @@ -28,53 +26,173 @@ Children push updates **up** to their direct coordinator only. 
Coordinators broadcast changes down to subscribed clients. ### Coordinator hierarchy and index tables ```text -OrganizationActor (coordinator for repos + auth users) +OrganizationActor (coordinator for tasks + auth users) │ │ Index tables: -│ ├─ repos → RepositoryActor index (repo catalog) -│ ├─ taskLookup → TaskActor index (taskId → repoId routing) -│ ├─ taskSummaries → TaskActor index (materialized sidebar projection) -│ ├─ authSessionIndex → AuthUserActor index (session token → userId) -│ ├─ authEmailIndex → AuthUserActor index (email → userId) -│ └─ authAccountIndex → AuthUserActor index (OAuth account → userId) +│ ├─ taskIndex → TaskActor index (taskId → repoId + branchName) +│ ├─ taskSummaries → TaskActor materialized sidebar projection +│ ├─ authSessionIndex → UserActor index (session token → userId) +│ ├─ authEmailIndex → UserActor index (email → userId) +│ └─ authAccountIndex → UserActor index (OAuth account → userId) │ -├─ RepositoryActor (coordinator for tasks) +├─ TaskActor (coordinator for sessions + sandboxes) │ │ │ │ Index tables: -│ │ └─ taskIndex → TaskActor index (taskId → branchName) +│ │ ├─ taskWorkspaceSessions → Session index (session metadata + transcript) +│ │ └─ taskSandboxes → SandboxInstanceActor index (sandbox history) │ │ -│ └─ TaskActor (coordinator for sessions + sandboxes) -│ │ -│ │ Index tables: -│ │ ├─ taskWorkbenchSessions → Session index (session metadata, transcript, draft) -│ │ ├─ taskSandboxes → SandboxInstanceActor index (sandbox history) -│ │ -│ └─ SandboxInstanceActor (leaf) +│ └─ SandboxInstanceActor (leaf) │ -├─ HistoryActor (organization-scoped audit log, not a coordinator) +├─ AuditLogActor (organization-scoped audit log, not a coordinator) └─ GithubDataActor (GitHub API cache, not a coordinator) ``` When adding a new index table, annotate it in the schema file with a doc comment identifying it as a coordinator index and which child actor it indexes (see existing examples). 
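As a rough illustration of the coordinator-index pattern (class and method names below are hypothetical, not the real RivetKit actor API): children push updates up, the coordinator materializes them into a local index table, and reads are served entirely from that index.

```typescript
// Illustrative sketch of a coordinator index — NOT the real actor API.
interface TaskSummaryRow {
  taskId: string;
  title: string;
  updatedAtMs: number;
}

class OrgCoordinator {
  // Coordinator index: materialized projection of TaskActor state,
  // kept fresh by child push updates (taskSummaries in the tree above).
  private taskSummaries = new Map<string, TaskSummaryRow>();

  // Write path: a child TaskActor pushes its summary up to the coordinator.
  applyTaskSummaryUpdate(row: TaskSummaryRow): void {
    this.taskSummaries.set(row.taskId, row);
  }

  // Read path: served from the local index only — no fan-out to children.
  listTaskSummaries(): TaskSummaryRow[] {
    return [...this.taskSummaries.values()].sort(
      (a, b) => b.updatedAtMs - a.updatedAtMs,
    );
  }
}
```

The point of the sketch is the asymmetry: writes arrive as pushes from children, while reads never leave the coordinator.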
+## GitHub Sync Data Model + +The GithubDataActor syncs **repositories** and **pull requests** from GitHub, not branches. We only need repos (to know which repos exist and their metadata) and PRs (to lazily populate virtual tasks in the sidebar). Branch data is not synced because we only create tasks from PRs or fresh user-initiated creation, never from bare branches. Generated branch names for new tasks are treated as unique enough to skip conflict detection against remote branches. + +Tasks are either: +1. **Created fresh** by the user (no PR yet, branch name generated from task description) +2. **Lazily populated from pull requests** during PR sync (virtual task entries in org tables, no actor spawned) + +## Lazy Task Actor Creation — CRITICAL + +**Task actors must NEVER be created during GitHub sync or bulk operations.** Creating hundreds of task actors simultaneously causes OOM crashes. An org can have 200+ PRs; spawning an actor per PR kills the process. + +### The two creation points + +There are exactly **two** places that may create a task actor: + +1. **`createTaskMutation`** in `task-mutations.ts` — the only backend code that calls `getOrCreateTask`. Triggered by explicit user action ("New Task" button). One actor at a time. + +2. **`backend-client.ts` client helper** — calls `client.task.getOrCreate(...)`. This is the lazy materialization point: when a user clicks a virtual task in the sidebar, the client creates the actor, and it self-initializes in `getCurrentRecord()` (`workflow/common.ts`) by reading branch/title from the org's `getTaskIndexEntry` action. + +### The rule + +**Never use `getOrCreateTask` inside a sync loop, webhook handler, or any bulk operation.** That's what caused the OOM — 186 actors spawned simultaneously during PR sync. + +`getOrCreateTask` IS allowed in: +- `createTaskMutation` — explicit user "New Task" action +- `requireWorkspaceTask` — user-initiated actions (createSession, sendMessage, etc.)
that may hit a virtual task +- `getTask` action on the org — called by sandbox actor and client, needs to materialize virtual tasks +- `backend-client.ts` client helper — lazy materialization when user views a task + +### Virtual tasks (PR-driven) + +During PR sync, `refreshTaskSummaryForBranchMutation` is called for every changed PR (via github-data's `emitPullRequestChangeEvents`). It writes **virtual task entries** to the org actor's local `taskIndex` + `taskSummaries` tables only. No task actor is spawned. No cross-actor calls to task actors. + +When the user interacts with a virtual task (clicks it, creates a session): +1. Client or org actor calls `getOrCreate` on the task actor key → actor is created with empty DB +2. Any action on the actor calls `getCurrentRecord()` → sees empty DB → reads branch/title from org's `getTaskIndexEntry` → calls `initBootstrapDbActivity` + `initCompleteActivity` → task is now real + +### Call sites to watch + +- `refreshTaskSummaryForBranchMutation` — called in bulk during sync. Must ONLY write to org local tables. Never create task actors or call task actor actions. +- `emitPullRequestChangeEvents` in github-data — iterates all changed PRs. Must remain fire-and-forget with no actor fan-out. + +## Queue vs Action Decision Framework + +The default is a direct action. Use a queue only if the answer to one or more of these questions is **yes**. + +Actions are pure RPCs with no DB overhead on send — fast, but if the call fails the operation is lost. Queues persist the message to the database on send, guaranteeing it will be processed even if the target actor is busy, slow, or recovering. The tradeoff: queues add write overhead and serialize processing. + +### 1. Does this operation coordinate multi-step work? + +Does it involve external I/O (sandbox API, GitHub API, agent process management) or state machine transitions where interleaving would corrupt state? 
This is different from database-level serialization — a simple read-then-write on SQLite can use a transaction. The queue is for ordering operations that span DB writes + external I/O. + +**Queue examples:** +- `workspace.send_message` — sends to sandbox agent, writes session status, does owner-swap. Multi-step with external I/O. +- `push` / `sync` / `merge` — git operations in sandbox that must not interleave. +- `createTask` — read-then-write across task index + actor creation. Returns result, so `wait: true`. + +**Action examples:** +- `billing.stripe_customer.apply` — single column upsert, no external I/O. +- `workspace.update_draft` — writes draft text, no coordination with sandbox ops. +- `workspace.rename_task` — updates title column, queue handlers don't touch title. + +### 2. Must this message be processed no matter what? + +Is this a cross-actor fire-and-forget where the caller won't retry and data loss is unacceptable? A queue persists the message — if the target is down, it waits. An action RPC that fails is gone. + +**Queue examples:** +- `audit.append` — caller must never be affected by audit failures, and audit entries must not be lost. +- `applyTaskSummaryUpdate` — task actor pushes summary to org and moves on. Won't retry if org is busy. +- `refreshTaskSummaryForBranch` — webhook-driven, won't be redelivered for the same event. + +**Action examples:** +- `billing.invoice.upsert` — Stripe retries handle failures externally. No durability need on our side. +- `workspace.mark_unread` — UI convenience state. Acceptable to lose on transient failure. +- `github.webhook_receipt.record` — timestamp columns with no downstream effects. + +### Once on a queue: wait or fire-and-forget? + +If the caller needs a return value, use `wait: true`. If the UI updates via push events, use `wait: false`. + +Full migration plan: `QUEUE_TO_ACTION_MIGRATION.md`. + ## Ownership Rules -- `OrganizationActor` is the organization coordinator and lookup/index owner. 
-- `HistoryActor` is organization-scoped. There is one organization-level history feed. -- `RepositoryActor` is the repo coordinator and owns repo-local caches/indexes. +- `OrganizationActor` is the organization coordinator, direct coordinator for tasks, and lookup/index owner. It owns the task index, task summaries, and repo catalog. +- `AuditLogActor` is organization-scoped. There is one organization-level audit log feed. - `TaskActor` is one branch. Treat `1 task = 1 branch` once branch assignment is finalized. - `TaskActor` can have many sessions. - `TaskActor` can reference many sandbox instances historically, but should have only one active sandbox/session at a time. -- Session unread state and draft prompts are backend-owned workbench state, not frontend-local state. -- Branch rename is a real git operation, not just metadata. +- Session unread state and draft prompts are backend-owned workspace state, not frontend-local state. +- Branch names are immutable after task creation. Do not implement branch-rename flows. - `SandboxInstanceActor` stays separate from `TaskActor`; tasks/sessions reference it by identity. - The backend stores no local git state. No clones, no refs, no working trees, and no git-spice. Repository metadata comes from GitHub API data and webhook events. Any working-tree git operation runs inside a sandbox via `executeInSandbox()`. - When a backend request path must aggregate multiple independent actor calls or reads, prefer bounded parallelism over sequential fan-out when correctness permits. Do not serialize independent work by default. - Only a coordinator creates/destroys its children. Do not create child actors from outside the coordinator. -- Children push state changes up to their direct coordinator only — never skip levels (e.g., task pushes to repo, not directly to org, unless org is the direct coordinator for that index). +- Children push state changes up to their direct coordinator only. 
Task actors push summary updates directly to the organization actor. - Read paths must use the coordinator's local index tables. Do not fan out to child actors on the hot read path. - Never build "enriched" read actions that chain through multiple actors (e.g., coordinator → child actor → sibling actor). If data from multiple actors is needed for a read, it should already be materialized in the coordinator's index tables via push updates. If it's not there, fix the write path to push it — do not add a fan-out read path. +## Drizzle Migration Maintenance + +After changing any actor's `db/schema.ts`, you **must** regenerate the corresponding migration so the runtime creates the tables that match the schema. Forgetting this step causes `no such table` errors at runtime. + +1. **Generate a new drizzle migration.** Run from `packages/backend`: + ```bash + npx drizzle-kit generate --config=./src/actors/<actor>/db/drizzle.config.ts + ``` + If the interactive prompt is unavailable (e.g. in a non-TTY), manually create a new `.sql` file under `./src/actors/<actor>/db/drizzle/` and add the corresponding entry to `meta/_journal.json`. + +2. **Regenerate the compiled `migrations.ts`.** Run from the foundry root: + ```bash + npx tsx packages/backend/src/actors/_scripts/generate-actor-migrations.ts + ``` + +3. **Verify insert/upsert calls.** Every column with `.notNull()` (and no `.default(...)`) must be provided a value in all `insert()` and `onConflictDoUpdate()` calls. Missing a NOT NULL column causes a runtime constraint violation, not a type error. + +4. **Nuke RivetKit state in dev** after migration changes to start fresh: + ```bash + docker compose -f compose.dev.yaml down + docker volume rm foundry_foundry_rivetkit_storage + docker compose -f compose.dev.yaml up -d + ``` + +Actors with drizzle migrations: `organization`, `audit-log`, `task`. Other actors (`user`, `github-data`) use inline migrations without drizzle.
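Step 3 has no compiler support, so it can help to spot-check insert payloads by hand. A minimal sketch of that check — the `ColumnDef` shape is an illustrative stand-in, not drizzle's actual column metadata API:

```typescript
// Illustrative NOT NULL payload check — the ColumnDef shape is hypothetical,
// not drizzle's real column metadata.
interface ColumnDef {
  name: string;
  notNull: boolean;
  hasDefault: boolean;
}

// Returns the columns that must appear in an insert()/onConflictDoUpdate()
// payload but are missing — each one is a runtime constraint violation.
function missingRequiredColumns(
  columns: ColumnDef[],
  payload: Record<string, unknown>,
): string[] {
  return columns
    .filter((c) => c.notNull && !c.hasDefault && !(c.name in payload))
    .map((c) => c.name);
}

const eventColumns: ColumnDef[] = [
  { name: "id", notNull: true, hasDefault: true },     // autoincrement pk
  { name: "type", notNull: true, hasDefault: false },  // must be supplied
  { name: "repo_id", notNull: false, hasDefault: false }, // nullable
];
console.log(missingRequiredColumns(eventColumns, { repo_id: "r1" })); // ["type"]
```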
+ +## Workflow Step Nesting — FORBIDDEN + +**Never call `c.step()` / `ctx.step()` from inside another step's `run` callback.** RivetKit workflow steps cannot be nested. Doing so causes the runtime error: *"Cannot start a new workflow entry while another is in progress."* + +This means: +- Functions called from within a step `run` callback must NOT use `c.step()`, `c.loop()`, `c.sleep()`, or `c.queue.next()`. +- If a mutation function needs to be called both from a step and standalone, it must only do plain DB/API work — no workflow primitives. The workflow step wrapping belongs in the workflow file, not in the mutation. +- Helper wrappers that conditionally call `c.step()` (like a `runSyncStep` pattern) are dangerous — if the caller is already inside a step, the nested `c.step()` will crash at runtime with no compile-time warning. + +**Rule of thumb:** Workflow primitives (`step`, `loop`, `sleep`, `queue.next`) may only appear at the top level of a workflow function or inside a `loop` callback — never inside a step's `run`. + +## SQLite Constraints + +- Single-row tables must use an integer primary key with `CHECK (id = 1)` to enforce the singleton invariant at the database level. +- Follow the task actor pattern for metadata/profile rows and keep the fixed row id in code as `1`, not a string sentinel. + ## Multiplayer Correctness Per-user UI state must live on the user actor, not on shared task/session actors. This is critical for multiplayer — multiple users may view the same task simultaneously with different active sessions, unread states, and in-progress drafts. @@ -85,6 +203,133 @@ Per-user UI state must live on the user actor, not on shared task/session actors Do not store per-user preferences, selections, or ephemeral UI state on shared actors. If a field's value should differ between two users looking at the same task, it belongs on the user actor. 
+## Audit Log Maintenance + +Every new action or command handler that represents a user-visible or workflow-significant event must append to the audit log actor. The audit log must remain a comprehensive record of significant operations. + +## Debugging Actors + +### RivetKit Inspector UI + +The RivetKit inspector UI at `http://localhost:6420/ui/` is the most reliable way to debug actor state in local development. The inspector HTTP API (`/inspector/workflow-history`) has a known bug where it returns empty `{}` even when the workflow has entries — always cross-check with the UI. + +**Useful inspector URL pattern:** +``` +http://localhost:6420/ui/?u=http%3A%2F%2F127.0.0.1%3A6420&ns=default&r=default&n=[%22<actor-name>%22]&actorId=<actor-id>&tab=<tab> +``` + +Tabs: `workflow`, `database`, `state`, `queue`, `connections`, `metadata`. + +**To find actor IDs:** +```bash +curl -s 'http://127.0.0.1:6420/actors?name=organization' +``` + +**To query actor DB via bun (inside container):** +```bash +docker compose -f compose.dev.yaml exec -T backend bun -e ' + var Database = require("bun:sqlite"); + var db = new Database("/root/.local/share/foundry/rivetkit/databases/<actor-id>.db", { readonly: true }); + console.log(JSON.stringify(db.query("SELECT name FROM sqlite_master WHERE type=?").all("table"))); +' +``` + +**To call actor actions via inspector:** +```bash +curl -s -X POST 'http://127.0.0.1:6420/gateway/<actor-id>/inspector/action/<action-name>' \ + -H 'Content-Type: application/json' -d '{"args":[{}]}' +``` + +### Known inspector API bugs + +- `GET /inspector/workflow-history` may return `{"history":{}}` even when workflow has run. Use the UI's Workflow tab instead. +- `GET /inspector/queue` is reliable for checking pending messages. +- `GET /inspector/state` is reliable for checking actor state. + +## Inbox & Notification System + +The user actor owns two per-user systems: a **task feed** (sidebar ordering) and **notifications** (discrete events). These are distinct concepts that share a common "bump" mechanism.
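The shared bump mechanism can be sketched in a few lines. Names here are hypothetical, not the real user-actor API; the invariant being shown is that `notify()` always bumps the feed, while a bump alone never creates a notification.

```typescript
// Illustrative model of bumps vs. notifications — not the real actor code.
interface FeedRow { taskId: string; bumpedAtMs: number; bumpReason: string }
interface NotificationRow { taskId: string; type: string; read: boolean }

const userTaskFeed = new Map<string, FeedRow>();
const userNotifications: NotificationRow[] = [];

// Bump: reorder the sidebar feed only (user sent a message, opened a task).
function bumpTask(taskId: string, reason: string, nowMs: number): void {
  userTaskFeed.set(taskId, { taskId, bumpedAtMs: nowMs, bumpReason: reason });
}

// Notify: record a discrete event AND bump — every notification auto-bumps.
function notify(taskId: string, type: string, nowMs: number): void {
  userNotifications.push({ taskId, type, read: false });
  bumpTask(taskId, type, nowMs);
}

bumpTask("t1", "user_sent_message", 100); // feed moves, no notification
notify("t1", "agent_finished", 200);      // notification entry + feed bump
```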
+ +### Core distinction: bumps vs. notifications + +A **bump** updates the task's position in the user's sidebar feed. A **notification** is a discrete event entry shown in the notification panel. Every notification also triggers a bump, but not every bump creates a notification. + +| Event | Bumps task? | Creates notification? | +|-------|-------------|----------------------| +| User sends a message | Yes | No | +| User opens/clicks a task | Yes | No | +| User creates a session | Yes | No | +| Agent finishes responding | Yes | Yes | +| PR review requested | Yes | Yes | +| PR merged | Yes | Yes | +| PR comment added | Yes | Yes | +| Agent error/needs input | Yes | Yes | + +### Recipient resolution + +Notifications and bumps go to the **task owner** only. Each task has exactly one owner at a time (the user who last sent a message or explicitly took ownership). This is an acceptable race condition — it rarely makes sense for two users to work on the same task simultaneously, and ownership transfer is explicit. + +The system supports multiplayer (multiple users can view the same task), but the notification/bump target is always the single current owner. Each user has their own independent notification and unread state on their own user actor. + +### Tables (on user actor) + +Two new tables: + +- **`userTaskFeed`** — one row per task. Tracks `bumpedAtMs` and `bumpReason` for sidebar sort order. Does NOT denormalize task content (title, repo, etc.) — the frontend queries the org actor for task content and uses the feed only for ordering/filtering. +- **`userNotifications`** — discrete notification entries with `type`, `message`, `read` state, and optional `sessionId`. Retention: notifications are retained for a configurable number of days after being marked read, then cleaned up. + +### Queue commands (user actor workflow) + +- `user.bump_task` — upserts `userTaskFeed` row, no notification created. 
Used for user-initiated actions (send message, open task, create session). +- `user.notify` — inserts `userNotifications` row AND upserts `userTaskFeed` (auto-bump). Used for system events (agent finished, PR review requested). +- `user.mark_read` — marks notifications read for a given `(taskId, sessionId?)`. Also updates `userTaskState.unread` for the session. + +### Data flow + +Task actor (or org actor) resolves the current task owner, then sends to the owner's user actor queue: +1. `user.notify(...)` for notification-worthy events (auto-bumps the feed) +2. `user.bump_task(...)` for non-notification bumps (send message, open task) + +The user actor processes the queue message, writes to its local tables, and broadcasts a `userFeedUpdated` event to connected clients. + +### Sidebar architecture change + +The left sidebar changes from showing the repo/PR tree to showing **recent tasks** ordered by `userTaskFeed.bumpedAtMs`. Two new buttons at the top of the sidebar: +- **All Repositories** — navigates to a page showing the current repo + PR list (preserving existing functionality) +- **Notifications** — navigates to a page showing the full notification list + +The sidebar reads from two sources: +- **User actor** (`userTaskFeed`) — provides sort order and "which tasks are relevant to this user" +- **Org actor** (`taskSummaries`) — provides task content (title, status, branch, PR state, session summaries) + +The frontend merges these: org snapshot gives task data, user feed gives sort order. Uses the existing subscription system (`useSubscription`) for both initial state fetch and streaming updates. + +### `updatedAtMs` column semantics + +The org actor's `taskSummaries.updatedAtMs` and the user actor's `userTaskFeed.bumpedAtMs` serve different purposes: +- `taskSummaries.updatedAtMs` — updated by task actor push. Reflects the last time the task's global state changed (any mutation, any user). Used for "All Repositories" / "All Tasks" views. 
+- `userTaskFeed.bumpedAtMs` — updated by bump/notify commands. Reflects the last time this specific user's attention was drawn to this task. Used for the per-user sidebar sort. + +Add doc comments on both columns clarifying the update source. + +### Unread semantics + +Each user has independent unread state. The existing `userTaskState` table tracks per-`(taskId, sessionId)` unread state. When the user clicks a session: +1. `userTaskState.unread` is set to 0 for that session +2. All `userNotifications` rows matching `(taskId, sessionId)` are marked `read = 1` + +These two unread systems must stay in sync via the `user.mark_read` queue command. + +## Better Auth: Actions, Not Queues + +All Better Auth adapter operations (verification CRUD, session/email/account index mutations, and user-actor auth record mutations) are exposed as **actions**, not queue commands. This is an intentional exception to the normal pattern of using queues for mutations. + +**Why:** The org actor's workflow queue is shared with GitHub sync, webhook processing, task mutations, and billing — 20+ queue names processed sequentially. During the OAuth callback, Better Auth needs to read/write verification records and upsert session/account indexes. If any long-running queue handler (e.g., a GitHub sync step) is ahead in the queue, auth operations time out (10s), `expectQueueResponse` throws a regular `Error`, and Better Auth's `parseState` catches it as a non-`StateError` → redirects to `?error=please_restart_the_process`. + +**Why it's safe:** Auth operations are simple SQLite reads/writes scoped to a single actor instance with no cross-actor side effects. They don't need workflow replay semantics or sequential ordering guarantees relative to other queue commands. + +**Rule:** Never move Better Auth operations back to queue commands. If new auth-related mutations are added, expose them as actions on the relevant actor. 
+ ## Maintenance - Keep this file up to date whenever actor ownership, hierarchy, or lifecycle responsibilities change. diff --git a/foundry/packages/backend/src/actors/auth-user/db/db.ts b/foundry/packages/backend/src/actors/audit-log/db/db.ts similarity index 69% rename from foundry/packages/backend/src/actors/auth-user/db/db.ts rename to foundry/packages/backend/src/actors/audit-log/db/db.ts index b434338..d808ec0 100644 --- a/foundry/packages/backend/src/actors/auth-user/db/db.ts +++ b/foundry/packages/backend/src/actors/audit-log/db/db.ts @@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle"; import * as schema from "./schema.js"; import migrations from "./migrations.js"; -export const authUserDb = db({ schema, migrations }); +export const auditLogDb = db({ schema, migrations }); diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts b/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts new file mode 100644 index 0000000..da5e904 --- /dev/null +++ b/foundry/packages/backend/src/actors/audit-log/db/drizzle.config.ts @@ -0,0 +1,6 @@ +import { defineConfig } from "rivetkit/db/drizzle"; + +export default defineConfig({ + out: "./src/actors/audit-log/db/drizzle", + schema: "./src/actors/audit-log/db/schema.ts", +}); diff --git a/foundry/packages/backend/src/actors/history/db/drizzle/0000_fluffy_kid_colt.sql b/foundry/packages/backend/src/actors/audit-log/db/drizzle/0000_fluffy_kid_colt.sql similarity index 100% rename from foundry/packages/backend/src/actors/history/db/drizzle/0000_fluffy_kid_colt.sql rename to foundry/packages/backend/src/actors/audit-log/db/drizzle/0000_fluffy_kid_colt.sql diff --git a/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql b/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql new file mode 100644 index 0000000..9ada559 --- /dev/null +++ b/foundry/packages/backend/src/actors/audit-log/db/drizzle/0001_add_repo_id.sql @@ -0,0 +1 @@ +ALTER 
TABLE `events` ADD COLUMN `repo_id` text; diff --git a/foundry/packages/backend/src/actors/history/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0000_snapshot.json similarity index 100% rename from foundry/packages/backend/src/actors/history/db/drizzle/meta/0000_snapshot.json rename to foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0000_snapshot.json diff --git a/foundry/packages/backend/src/actors/repository/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json similarity index 64% rename from foundry/packages/backend/src/actors/repository/db/drizzle/meta/0000_snapshot.json rename to foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json index 940b4e6..cf2910c 100644 --- a/foundry/packages/backend/src/actors/repository/db/drizzle/meta/0000_snapshot.json +++ b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/0001_snapshot.json @@ -1,48 +1,31 @@ { "version": "6", "dialect": "sqlite", - "id": "6ffd6acb-e737-46ee-a8fe-fcfddcdd6ea9", - "prevId": "00000000-0000-0000-0000-000000000000", + "id": "a1b2c3d4-0001-4000-8000-000000000001", + "prevId": "e592c829-141f-4740-88b7-09cf957a4405", "tables": { - "repo_meta": { - "name": "repo_meta", + "events": { + "name": "events", "columns": { "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, - "autoincrement": false + "autoincrement": true }, - "remote_url": { - "name": "remote_url", + "repo_id": { + "name": "repo_id", "type": "text", "primaryKey": false, - "notNull": true, + "notNull": false, "autoincrement": false }, - "updated_at": { - "name": "updated_at", - "type": "integer", - "primaryKey": false, - "notNull": true, - "autoincrement": false - } - }, - "indexes": {}, - "foreignKeys": {}, - "compositePrimaryKeys": {}, - "uniqueConstraints": {}, - "checkConstraints": {} - }, - "task_index": { - "name": "task_index", - "columns": { 
"task_id": { "name": "task_id", "type": "text", - "primaryKey": true, - "notNull": true, + "primaryKey": false, + "notNull": false, "autoincrement": false }, "branch_name": { @@ -52,15 +35,22 @@ "notNull": false, "autoincrement": false }, - "created_at": { - "name": "created_at", - "type": "integer", + "kind": { + "name": "kind", + "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false }, - "updated_at": { - "name": "updated_at", + "payload_json": { + "name": "payload_json", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "created_at": { + "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, diff --git a/foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json similarity index 59% rename from foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json rename to foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json index 93cf8ce..0393be2 100644 --- a/foundry/packages/backend/src/actors/history/db/drizzle/meta/_journal.json +++ b/foundry/packages/backend/src/actors/audit-log/db/drizzle/meta/_journal.json @@ -8,6 +8,13 @@ "when": 1773376223815, "tag": "0000_fluffy_kid_colt", "breakpoints": true + }, + { + "idx": 1, + "version": "6", + "when": 1773376223816, + "tag": "0001_add_repo_id", + "breakpoints": true } ] } diff --git a/foundry/packages/backend/src/actors/history/db/migrations.ts b/foundry/packages/backend/src/actors/audit-log/db/migrations.ts similarity index 78% rename from foundry/packages/backend/src/actors/history/db/migrations.ts rename to foundry/packages/backend/src/actors/audit-log/db/migrations.ts index 766c225..5bf9b5a 100644 --- a/foundry/packages/backend/src/actors/history/db/migrations.ts +++ b/foundry/packages/backend/src/actors/audit-log/db/migrations.ts @@ -10,6 +10,12 @@ const journal = { tag: "0000_fluffy_kid_colt", 
breakpoints: true, }, + { + idx: 1, + when: 1773376223816, + tag: "0001_add_repo_id", + breakpoints: true, + }, ], } as const; @@ -24,6 +30,8 @@ export default { \`payload_json\` text NOT NULL, \`created_at\` integer NOT NULL ); +`, + m0001: `ALTER TABLE \`events\` ADD COLUMN \`repo_id\` text; `, } as const, }; diff --git a/foundry/packages/backend/src/actors/history/db/schema.ts b/foundry/packages/backend/src/actors/audit-log/db/schema.ts similarity index 77% rename from foundry/packages/backend/src/actors/history/db/schema.ts rename to foundry/packages/backend/src/actors/audit-log/db/schema.ts index 80eb7f4..d275dd4 100644 --- a/foundry/packages/backend/src/actors/history/db/schema.ts +++ b/foundry/packages/backend/src/actors/audit-log/db/schema.ts @@ -2,10 +2,11 @@ import { integer, sqliteTable, text } from "rivetkit/db/drizzle"; export const events = sqliteTable("events", { id: integer("id").primaryKey({ autoIncrement: true }), + repoId: text("repo_id"), taskId: text("task_id"), branchName: text("branch_name"), kind: text("kind").notNull(), - // Structured by the history event kind definitions in application code. + // Structured by the audit-log event kind definitions in application code. 
payloadJson: text("payload_json").notNull(), createdAt: integer("created_at").notNull(), }); diff --git a/foundry/packages/backend/src/actors/audit-log/index.ts b/foundry/packages/backend/src/actors/audit-log/index.ts new file mode 100644 index 0000000..db32829 --- /dev/null +++ b/foundry/packages/backend/src/actors/audit-log/index.ts @@ -0,0 +1,180 @@ +// @ts-nocheck +import { and, desc, eq } from "drizzle-orm"; +import { actor, queue } from "rivetkit"; +import { workflow, Loop } from "rivetkit/workflow"; +import type { AuditLogEvent } from "@sandbox-agent/foundry-shared"; +import { selfAuditLog } from "../handles.js"; +import { logActorWarning, resolveErrorMessage } from "../logging.js"; +import { auditLogDb } from "./db/db.js"; +import { events } from "./db/schema.js"; + +export interface AuditLogInput { + organizationId: string; +} + +export interface AppendAuditLogCommand { + kind: string; + repoId?: string; + taskId?: string; + branchName?: string; + payload: Record<string, unknown>; +} + +export interface ListAuditLogParams { + repoId?: string; + branch?: string; + taskId?: string; + limit?: number; +} + +// --------------------------------------------------------------------------- +// Queue names +// --------------------------------------------------------------------------- + +const AUDIT_LOG_QUEUE_NAMES = ["auditLog.command.append"] as const; + +type AuditLogQueueName = (typeof AUDIT_LOG_QUEUE_NAMES)[number]; + +function auditLogWorkflowQueueName(name: AuditLogQueueName): AuditLogQueueName { + return name; +} + +// --------------------------------------------------------------------------- +// Mutation functions +// --------------------------------------------------------------------------- + +async function appendMutation(c: any, body: AppendAuditLogCommand): Promise<{ ok: true }> { + const now = Date.now(); + await c.db + .insert(events) + .values({ + repoId: body.repoId ?? null, + taskId: body.taskId ?? null, + branchName: body.branchName ??
null, + kind: body.kind, + payloadJson: JSON.stringify(body.payload), + createdAt: now, + }) + .run(); + return { ok: true }; +} + +// --------------------------------------------------------------------------- +// Workflow command loop +// --------------------------------------------------------------------------- + +type AuditLogWorkflowHandler = (loopCtx: any, body: any) => Promise<unknown>; + +const AUDIT_LOG_COMMAND_HANDLERS: Record<AuditLogQueueName, AuditLogWorkflowHandler> = { + "auditLog.command.append": async (c, body) => appendMutation(c, body), +}; + +async function runAuditLogWorkflow(ctx: any): Promise<void> { + await ctx.loop("audit-log-command-loop", async (loopCtx: any) => { + const msg = await loopCtx.queue.next("next-audit-log-command", { + names: [...AUDIT_LOG_QUEUE_NAMES], + completable: true, + }); + + if (!msg) { + return Loop.continue(undefined); + } + + const handler = AUDIT_LOG_COMMAND_HANDLERS[msg.name as AuditLogQueueName]; + if (!handler) { + logActorWarning("auditLog", "unknown audit-log command", { command: msg.name }); + await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {}); + return Loop.continue(undefined); + } + + try { + // Wrap in a step so c.state and c.db are accessible inside mutation functions. + const result = await loopCtx.step({ + name: msg.name, + timeout: 60_000, + run: async () => handler(loopCtx, msg.body), + }); + await msg.complete(result); + } catch (error) { + const message = resolveErrorMessage(error); + logActorWarning("auditLog", "audit-log workflow command failed", { + command: msg.name, + error: message, + }); + await msg.complete({ error: message }).catch(() => {}); + } + + return Loop.continue(undefined); + }); +} + +// --------------------------------------------------------------------------- +// Actor definition +// --------------------------------------------------------------------------- + +/** + * Organization-scoped audit log. One per org, not one per repo.
 + * + * The org is the coordinator for all tasks across repos, and we frequently need + * to query the full audit trail across repos (e.g. org-wide activity feed, + * compliance). A per-repo audit log would require fan-out reads every time. + * Keeping it org-scoped gives us a single queryable feed with optional repoId + * filtering when callers want a narrower view. + */ +export const auditLog = actor({ + db: auditLogDb, + queues: Object.fromEntries(AUDIT_LOG_QUEUE_NAMES.map((name) => [name, queue()])), + options: { + name: "Audit Log", + icon: "database", + }, + createState: (_c, input: AuditLogInput) => ({ + organizationId: input.organizationId, + }), + actions: { + // Mutation — self-send to queue for workflow history + async append(c: any, body: AppendAuditLogCommand): Promise<{ ok: true }> { + const self = selfAuditLog(c); + await self.send(auditLogWorkflowQueueName("auditLog.command.append"), body, { wait: false }); + return { ok: true }; + }, + + // Read — direct action (no queue) + async list(c, params?: ListAuditLogParams): Promise<AuditLogEvent[]> { + const whereParts = []; + if (params?.repoId) { + whereParts.push(eq(events.repoId, params.repoId)); + } + if (params?.taskId) { + whereParts.push(eq(events.taskId, params.taskId)); + } + if (params?.branch) { + whereParts.push(eq(events.branchName, params.branch)); + } + + const base = c.db + .select({ + id: events.id, + repoId: events.repoId, + taskId: events.taskId, + branchName: events.branchName, + kind: events.kind, + payloadJson: events.payloadJson, + createdAt: events.createdAt, + }) + .from(events); + + const rows = await (whereParts.length > 0 ? base.where(and(...whereParts)) : base) + .orderBy(desc(events.createdAt)) + .limit(params?.limit ?? 100) + .all(); + + return rows.map((row) => ({ + ...row, + organizationId: c.state.organizationId, + repoId: row.repoId ??
null, + })); + }, + }, + run: workflow(runAuditLogWorkflow), +}); diff --git a/foundry/packages/backend/src/actors/auth-user/db/schema.ts b/foundry/packages/backend/src/actors/auth-user/db/schema.ts deleted file mode 100644 index b87567a..0000000 --- a/foundry/packages/backend/src/actors/auth-user/db/schema.ts +++ /dev/null @@ -1,70 +0,0 @@ -import { integer, sqliteTable, text, uniqueIndex } from "drizzle-orm/sqlite-core"; - -export const authUsers = sqliteTable("user", { - id: text("id").notNull().primaryKey(), - name: text("name").notNull(), - email: text("email").notNull(), - emailVerified: integer("email_verified").notNull(), - image: text("image"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), -}); - -export const authSessions = sqliteTable( - "session", - { - id: text("id").notNull().primaryKey(), - token: text("token").notNull(), - userId: text("user_id").notNull(), - expiresAt: integer("expires_at").notNull(), - ipAddress: text("ip_address"), - userAgent: text("user_agent"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), - }, - (table) => ({ - tokenIdx: uniqueIndex("session_token_idx").on(table.token), - }), -); - -export const authAccounts = sqliteTable( - "account", - { - id: text("id").notNull().primaryKey(), - accountId: text("account_id").notNull(), - providerId: text("provider_id").notNull(), - userId: text("user_id").notNull(), - accessToken: text("access_token"), - refreshToken: text("refresh_token"), - idToken: text("id_token"), - accessTokenExpiresAt: integer("access_token_expires_at"), - refreshTokenExpiresAt: integer("refresh_token_expires_at"), - scope: text("scope"), - password: text("password"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), - }, - (table) => ({ - providerAccountIdx: uniqueIndex("account_provider_account_idx").on(table.providerId, table.accountId), - }), -); - -export const userProfiles = 
sqliteTable("user_profiles", { - userId: text("user_id").notNull().primaryKey(), - githubAccountId: text("github_account_id"), - githubLogin: text("github_login"), - roleLabel: text("role_label").notNull(), - eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(), - starterRepoStatus: text("starter_repo_status").notNull(), - starterRepoStarredAt: integer("starter_repo_starred_at"), - starterRepoSkippedAt: integer("starter_repo_skipped_at"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), -}); - -export const sessionState = sqliteTable("session_state", { - sessionId: text("session_id").notNull().primaryKey(), - activeOrganizationId: text("active_organization_id"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), -}); diff --git a/foundry/packages/backend/src/actors/auth-user/index.ts b/foundry/packages/backend/src/actors/auth-user/index.ts deleted file mode 100644 index a77635a..0000000 --- a/foundry/packages/backend/src/actors/auth-user/index.ts +++ /dev/null @@ -1,353 +0,0 @@ -import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm"; -import { actor } from "rivetkit"; -import { authUserDb } from "./db/db.js"; -import { authAccounts, authSessions, authUsers, sessionState, userProfiles } from "./db/schema.js"; - -const tables = { - user: authUsers, - session: authSessions, - account: authAccounts, - userProfiles, - sessionState, -} as const; - -function tableFor(model: string) { - const table = tables[model as keyof typeof tables]; - if (!table) { - throw new Error(`Unsupported auth user model: ${model}`); - } - return table as any; -} - -function columnFor(table: any, field: string) { - const column = table[field]; - if (!column) { - throw new Error(`Unsupported auth user field: ${field}`); - } - return column; -} - -function normalizeValue(value: unknown): unknown { - if 
(value instanceof Date) { - return value.getTime(); - } - if (Array.isArray(value)) { - return value.map((entry) => normalizeValue(entry)); - } - return value; -} - -function clauseToExpr(table: any, clause: any) { - const column = columnFor(table, clause.field); - const value = normalizeValue(clause.value); - - switch (clause.operator) { - case "ne": - return value === null ? isNotNull(column) : ne(column, value as any); - case "lt": - return lt(column, value as any); - case "lte": - return lte(column, value as any); - case "gt": - return gt(column, value as any); - case "gte": - return gte(column, value as any); - case "in": - return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "not_in": - return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "contains": - return like(column, `%${String(value ?? "")}%`); - case "starts_with": - return like(column, `${String(value ?? "")}%`); - case "ends_with": - return like(column, `%${String(value ?? "")}`); - case "eq": - default: - return value === null ? isNull(column) : eq(column, value as any); - } -} - -function buildWhere(table: any, where: any[] | undefined) { - if (!where || where.length === 0) { - return undefined; - } - - let expr = clauseToExpr(table, where[0]); - for (const clause of where.slice(1)) { - const next = clauseToExpr(table, clause); - expr = clause.connector === "OR" ? or(expr, next) : and(expr, next); - } - return expr; -} - -function applyJoinToRow(c: any, model: string, row: any, join: any) { - if (!row || !join) { - return row; - } - - if (model === "session" && join.user) { - return c.db - .select() - .from(authUsers) - .where(eq(authUsers.id, row.userId)) - .get() - .then((user: any) => ({ ...row, user: user ?? null })); - } - - if (model === "account" && join.user) { - return c.db - .select() - .from(authUsers) - .where(eq(authUsers.id, row.userId)) - .get() - .then((user: any) => ({ ...row, user: user ?? 
null })); - } - - if (model === "user" && join.account) { - return c.db - .select() - .from(authAccounts) - .where(eq(authAccounts.userId, row.id)) - .all() - .then((accounts: any[]) => ({ ...row, account: accounts })); - } - - return Promise.resolve(row); -} - -async function applyJoinToRows(c: any, model: string, rows: any[], join: any) { - if (!join || rows.length === 0) { - return rows; - } - - if (model === "session" && join.user) { - const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; - const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.id, userIds)).all() : []; - const userMap = new Map(users.map((user: any) => [user.id, user])); - return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); - } - - if (model === "account" && join.user) { - const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; - const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.id, userIds)).all() : []; - const userMap = new Map(users.map((user: any) => [user.id, user])); - return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); - } - - if (model === "user" && join.account) { - const userIds = rows.map((row) => row.id); - const accounts = userIds.length > 0 ? await c.db.select().from(authAccounts).where(inArray(authAccounts.userId, userIds)).all() : []; - const accountsByUserId = new Map(); - for (const account of accounts) { - const entries = accountsByUserId.get(account.userId) ?? []; - entries.push(account); - accountsByUserId.set(account.userId, entries); - } - return rows.map((row) => ({ ...row, account: accountsByUserId.get(row.id) ?? 
[] })); - } - - return rows; -} - -export const authUser = actor({ - db: authUserDb, - options: { - name: "Auth User", - icon: "shield", - actionTimeout: 60_000, - }, - createState: (_c, input: { userId: string }) => ({ - userId: input.userId, - }), - actions: { - async createAuthRecord(c, input: { model: string; data: Record }) { - const table = tableFor(input.model); - await c.db - .insert(table) - .values(input.data as any) - .run(); - return await c.db - .select() - .from(table) - .where(eq(columnFor(table, "id"), input.data.id as any)) - .get(); - }, - - async findOneAuthRecord(c, input: { model: string; where: any[]; join?: any }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - const row = predicate ? await c.db.select().from(table).where(predicate).get() : await c.db.select().from(table).get(); - return await applyJoinToRow(c, input.model, row ?? null, input.join); - }, - - async findManyAuthRecords(c, input: { model: string; where?: any[]; limit?: number; offset?: number; sortBy?: any; join?: any }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - let query: any = c.db.select().from(table); - if (predicate) { - query = query.where(predicate); - } - if (input.sortBy?.field) { - const column = columnFor(table, input.sortBy.field); - query = query.orderBy(input.sortBy.direction === "asc" ? 
asc(column) : desc(column)); - } - if (typeof input.limit === "number") { - query = query.limit(input.limit); - } - if (typeof input.offset === "number") { - query = query.offset(input.offset); - } - const rows = await query.all(); - return await applyJoinToRows(c, input.model, rows, input.join); - }, - - async updateAuthRecord(c, input: { model: string; where: any[]; update: Record }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - if (!predicate) { - throw new Error("updateAuthRecord requires a where clause"); - } - await c.db - .update(table) - .set(input.update as any) - .where(predicate) - .run(); - return await c.db.select().from(table).where(predicate).get(); - }, - - async updateManyAuthRecords(c, input: { model: string; where: any[]; update: Record }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - if (!predicate) { - throw new Error("updateManyAuthRecords requires a where clause"); - } - await c.db - .update(table) - .set(input.update as any) - .where(predicate) - .run(); - const row = await c.db.select({ value: sqlCount() }).from(table).where(predicate).get(); - return row?.value ?? 
0; - }, - - async deleteAuthRecord(c, input: { model: string; where: any[] }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - if (!predicate) { - throw new Error("deleteAuthRecord requires a where clause"); - } - await c.db.delete(table).where(predicate).run(); - }, - - async deleteManyAuthRecords(c, input: { model: string; where: any[] }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - if (!predicate) { - throw new Error("deleteManyAuthRecords requires a where clause"); - } - const rows = await c.db.select().from(table).where(predicate).all(); - await c.db.delete(table).where(predicate).run(); - return rows.length; - }, - - async countAuthRecords(c, input: { model: string; where?: any[] }) { - const table = tableFor(input.model); - const predicate = buildWhere(table, input.where); - const row = predicate - ? await c.db.select({ value: sqlCount() }).from(table).where(predicate).get() - : await c.db.select({ value: sqlCount() }).from(table).get(); - return row?.value ?? 0; - }, - - async getAppAuthState(c, input: { sessionId: string }) { - const session = await c.db.select().from(authSessions).where(eq(authSessions.id, input.sessionId)).get(); - if (!session) { - return null; - } - const [user, profile, currentSessionState, accounts] = await Promise.all([ - c.db.select().from(authUsers).where(eq(authUsers.id, session.userId)).get(), - c.db.select().from(userProfiles).where(eq(userProfiles.userId, session.userId)).get(), - c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(), - c.db.select().from(authAccounts).where(eq(authAccounts.userId, session.userId)).all(), - ]); - return { - session, - user, - profile: profile ?? null, - sessionState: currentSessionState ?? 
null, - accounts, - }; - }, - - async upsertUserProfile( - c, - input: { - userId: string; - patch: { - githubAccountId?: string | null; - githubLogin?: string | null; - roleLabel?: string; - eligibleOrganizationIdsJson?: string; - starterRepoStatus?: string; - starterRepoStarredAt?: number | null; - starterRepoSkippedAt?: number | null; - }; - }, - ) { - const now = Date.now(); - await c.db - .insert(userProfiles) - .values({ - userId: input.userId, - githubAccountId: input.patch.githubAccountId ?? null, - githubLogin: input.patch.githubLogin ?? null, - roleLabel: input.patch.roleLabel ?? "GitHub user", - eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson ?? "[]", - starterRepoStatus: input.patch.starterRepoStatus ?? "pending", - starterRepoStarredAt: input.patch.starterRepoStarredAt ?? null, - starterRepoSkippedAt: input.patch.starterRepoSkippedAt ?? null, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: userProfiles.userId, - set: { - ...(input.patch.githubAccountId !== undefined ? { githubAccountId: input.patch.githubAccountId } : {}), - ...(input.patch.githubLogin !== undefined ? { githubLogin: input.patch.githubLogin } : {}), - ...(input.patch.roleLabel !== undefined ? { roleLabel: input.patch.roleLabel } : {}), - ...(input.patch.eligibleOrganizationIdsJson !== undefined ? { eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson } : {}), - ...(input.patch.starterRepoStatus !== undefined ? { starterRepoStatus: input.patch.starterRepoStatus } : {}), - ...(input.patch.starterRepoStarredAt !== undefined ? { starterRepoStarredAt: input.patch.starterRepoStarredAt } : {}), - ...(input.patch.starterRepoSkippedAt !== undefined ? 
{ starterRepoSkippedAt: input.patch.starterRepoSkippedAt } : {}), - updatedAt: now, - }, - }) - .run(); - - return await c.db.select().from(userProfiles).where(eq(userProfiles.userId, input.userId)).get(); - }, - - async upsertSessionState(c, input: { sessionId: string; activeOrganizationId: string | null }) { - const now = Date.now(); - await c.db - .insert(sessionState) - .values({ - sessionId: input.sessionId, - activeOrganizationId: input.activeOrganizationId, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: sessionState.sessionId, - set: { - activeOrganizationId: input.activeOrganizationId, - updatedAt: now, - }, - }) - .run(); - - return await c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(); - }, - }, -}); diff --git a/foundry/packages/backend/src/actors/events.ts b/foundry/packages/backend/src/actors/events.ts deleted file mode 100644 index 4a514ad..0000000 --- a/foundry/packages/backend/src/actors/events.ts +++ /dev/null @@ -1,104 +0,0 @@ -import type { TaskStatus, SandboxProviderId } from "@sandbox-agent/foundry-shared"; - -export interface TaskCreatedEvent { - organizationId: string; - repoId: string; - taskId: string; - sandboxProviderId: SandboxProviderId; - branchName: string; - title: string; -} - -export interface TaskStatusEvent { - organizationId: string; - repoId: string; - taskId: string; - status: TaskStatus; - message: string; -} - -export interface RepositorySnapshotEvent { - organizationId: string; - repoId: string; - updatedAt: number; -} - -export interface AgentStartedEvent { - organizationId: string; - repoId: string; - taskId: string; - sessionId: string; -} - -export interface AgentIdleEvent { - organizationId: string; - repoId: string; - taskId: string; - sessionId: string; -} - -export interface AgentErrorEvent { - organizationId: string; - repoId: string; - taskId: string; - message: string; -} - -export interface PrCreatedEvent { - organizationId: string; - repoId: 
string;
-	taskId: string;
-	prNumber: number;
-	url: string;
-}
-
-export interface PrClosedEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	prNumber: number;
-	merged: boolean;
-}
-
-export interface PrReviewEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	prNumber: number;
-	reviewer: string;
-	status: string;
-}
-
-export interface CiStatusChangedEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	prNumber: number;
-	status: string;
-}
-
-export type TaskStepName = "auto_commit" | "push" | "pr_submit";
-export type TaskStepStatus = "started" | "completed" | "skipped" | "failed";
-
-export interface TaskStepEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	step: TaskStepName;
-	status: TaskStepStatus;
-	message: string;
-}
-
-export interface BranchSwitchedEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	branchName: string;
-}
-
-export interface SessionAttachedEvent {
-	organizationId: string;
-	repoId: string;
-	taskId: string;
-	sessionId: string;
-}
diff --git a/foundry/packages/backend/src/actors/github-data/db/migrations.ts b/foundry/packages/backend/src/actors/github-data/db/migrations.ts
index 87cc76f..10e3804 100644
--- a/foundry/packages/backend/src/actors/github-data/db/migrations.ts
+++ b/foundry/packages/backend/src/actors/github-data/db/migrations.ts
@@ -18,6 +18,18 @@ const journal = {
 			tag: "0002_github_branches",
 			breakpoints: true,
 		},
+		{
+			idx: 3,
+			when: 1773907200000,
+			tag: "0003_sync_progress",
+			breakpoints: true,
+		},
+		{
+			idx: 4,
+			when: 1773993600000,
+			tag: "0004_drop_github_branches",
+			breakpoints: true,
+		},
 	],
 } as const;
 
@@ -32,7 +44,8 @@ export default {
 	\`installation_id\` integer,
 	\`last_sync_label\` text NOT NULL,
 	\`last_sync_at\` integer,
-	\`updated_at\` integer NOT NULL
+	\`updated_at\` integer NOT NULL,
+	CONSTRAINT \`github_meta_singleton_id_check\` CHECK(\`id\` = 1)
 );
 --> statement-breakpoint
 CREATE TABLE \`github_repositories\` (
@@ -78,6 +91,24 @@ CREATE TABLE \`github_pull_requests\` (
 	\`commit_sha\` text NOT NULL,
 	\`updated_at\` integer NOT NULL
 );
+`,
+		m0003: `ALTER TABLE \`github_meta\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_meta\` ADD \`sync_phase\` text;
+--> statement-breakpoint
+ALTER TABLE \`github_meta\` ADD \`processed_repository_count\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_meta\` ADD \`total_repository_count\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_repositories\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_members\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_pull_requests\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0;
+--> statement-breakpoint
+ALTER TABLE \`github_branches\` ADD \`sync_generation\` integer NOT NULL DEFAULT 0;
+`,
+		m0004: `DROP TABLE IF EXISTS \`github_branches\`;
 `,
 	} as const,
 };
diff --git a/foundry/packages/backend/src/actors/github-data/db/schema.ts b/foundry/packages/backend/src/actors/github-data/db/schema.ts
index fe37863..94b4edc 100644
--- a/foundry/packages/backend/src/actors/github-data/db/schema.ts
+++ b/foundry/packages/backend/src/actors/github-data/db/schema.ts
@@ -1,15 +1,24 @@
-import { integer, sqliteTable, text } from "rivetkit/db/drizzle";
+import { check, integer, sqliteTable, text } from "rivetkit/db/drizzle";
+import { sql } from "drizzle-orm";
 
-export const githubMeta = sqliteTable("github_meta", {
-	id: integer("id").primaryKey(),
-	connectedAccount: text("connected_account").notNull(),
-	installationStatus: text("installation_status").notNull(),
-	syncStatus: text("sync_status").notNull(),
-	installationId: integer("installation_id"),
-	lastSyncLabel: text("last_sync_label").notNull(),
-	lastSyncAt: integer("last_sync_at"),
-	updatedAt: integer("updated_at").notNull(),
-});
+export const githubMeta = sqliteTable(
+	"github_meta",
+	{
+		id: integer("id").primaryKey(),
+		connectedAccount: text("connected_account").notNull(),
+		installationStatus: text("installation_status").notNull(),
+		syncStatus: text("sync_status").notNull(),
+		installationId: integer("installation_id"),
+		lastSyncLabel: text("last_sync_label").notNull(),
+		lastSyncAt: integer("last_sync_at"),
+		syncGeneration: integer("sync_generation").notNull(),
+		syncPhase: text("sync_phase"),
+		processedRepositoryCount: integer("processed_repository_count").notNull(),
+		totalRepositoryCount: integer("total_repository_count").notNull(),
+		updatedAt: integer("updated_at").notNull(),
+	},
+	(table) => [check("github_meta_singleton_id_check", sql`${table.id} = 1`)],
+);
 
 export const githubRepositories = sqliteTable("github_repositories", {
 	repoId: text("repo_id").notNull().primaryKey(),
@@ -17,14 +26,7 @@ export const githubRepositories = sqliteTable("github_repositories", {
 	cloneUrl: text("clone_url").notNull(),
 	private: integer("private").notNull(),
 	defaultBranch: text("default_branch").notNull(),
-	updatedAt: integer("updated_at").notNull(),
-});
-
-export const githubBranches = sqliteTable("github_branches", {
-	branchId: text("branch_id").notNull().primaryKey(),
-	repoId: text("repo_id").notNull(),
-	branchName: text("branch_name").notNull(),
-	commitSha: text("commit_sha").notNull(),
+	syncGeneration: integer("sync_generation").notNull(),
 	updatedAt: integer("updated_at").notNull(),
 });
 
@@ -35,6 +37,7 @@ export const githubMembers = sqliteTable("github_members", {
 	email: text("email"),
 	role: text("role"),
 	state: text("state").notNull(),
+	syncGeneration: integer("sync_generation").notNull(),
 	updatedAt: integer("updated_at").notNull(),
 });
 
@@ -51,5 +54,6 @@ export const githubPullRequests = sqliteTable("github_pull_requests", {
 	baseRefName: text("base_ref_name").notNull(),
 	authorLogin: text("author_login"),
 	isDraft: integer("is_draft").notNull(),
+	syncGeneration: integer("sync_generation").notNull(),
 	updatedAt: integer("updated_at").notNull(),
 });
diff --git a/foundry/packages/backend/src/actors/github-data/index.ts b/foundry/packages/backend/src/actors/github-data/index.ts
index 08c815d..d19732a 100644
--- a/foundry/packages/backend/src/actors/github-data/index.ts
+++ b/foundry/packages/backend/src/actors/github-data/index.ts
@@ -1,16 +1,22 @@
 // @ts-nocheck
-import { eq } from "drizzle-orm";
+import { eq, inArray } from "drizzle-orm";
 import { actor, queue } from "rivetkit";
 import { workflow, Loop } from "rivetkit/workflow";
 import type { FoundryOrganization } from "@sandbox-agent/foundry-shared";
 import { getActorRuntimeContext } from "../context.js";
 import { getOrCreateOrganization, getTask } from "../handles.js";
+import { logActorWarning, resolveErrorMessage } from "../logging.js";
+import { taskWorkflowQueueName } from "../task/workflow/queue.js";
 import { repoIdFromRemote } from "../../services/repo.js";
 import { resolveOrganizationGithubAuth } from "../../services/github-auth.js";
+import { organizationWorkflowQueueName } from "../organization/queues.js";
 import { githubDataDb } from "./db/db.js";
-import { githubBranches, githubMembers, githubMeta, githubPullRequests, githubRepositories } from "./db/schema.js";
+import { githubMembers, githubMeta, githubPullRequests, githubRepositories } from "./db/schema.js";
 
 const META_ROW_ID = 1;
+const SYNC_REPOSITORY_BATCH_SIZE = 10;
+
+type GithubSyncPhase = "discovering_repositories" | "syncing_repositories" | "syncing_members" | "syncing_pull_requests";
 
 interface GithubDataInput {
 	organizationId: string;
@@ -32,12 +38,6 @@ interface GithubRepositoryRecord {
 	defaultBranch: string;
 }
 
-interface GithubBranchRecord {
-	repoId: string;
-	branchName: string;
-	commitSha: string;
-}
-
 interface GithubPullRequestRecord {
 	repoId: string;
 	repoFullName: string;
@@ -70,6 +70,19 @@ interface ClearStateInput {
 	label: string;
 }
 
+// Queue names for github-data actor
+export const GITHUB_DATA_QUEUE_NAMES = [
+	"githubData.command.syncRepos",
+	"githubData.command.handlePullRequestWebhook",
+	"githubData.command.clearState",
+] as const;
+
+type GithubDataQueueName = (typeof GITHUB_DATA_QUEUE_NAMES)[number];
+
+export function githubDataWorkflowQueueName(name: GithubDataQueueName): GithubDataQueueName {
+	return name;
+}
+
 interface PullRequestWebhookInput {
 	connectedAccount: string;
 	installationStatus: FoundryOrganization["github"]["installationStatus"];
@@ -93,6 +106,19 @@ interface PullRequestWebhookInput {
 	};
 }
 
+interface GithubMetaState {
+	connectedAccount: string;
+	installationStatus: FoundryOrganization["github"]["installationStatus"];
+	syncStatus: FoundryOrganization["github"]["syncStatus"];
+	installationId: number | null;
+	lastSyncLabel: string;
+	lastSyncAt: number | null;
+	syncGeneration: number;
+	syncPhase: GithubSyncPhase | null;
+	processedRepositoryCount: number;
+	totalRepositoryCount: number;
+}
+
 function normalizePrStatus(input: { state: string; isDraft?: boolean; merged?: boolean }): "OPEN" | "DRAFT" | "CLOSED" | "MERGED" {
 	const state = input.state.trim().toUpperCase();
 	if (input.merged || state === "MERGED") return "MERGED";
@@ -106,6 +132,7 @@ function pullRequestSummaryFromRow(row: any) {
 		repoId: row.repoId,
 		repoFullName: row.repoFullName,
 		number: row.number,
+		status: Boolean(row.isDraft) ? "draft" : "ready",
 		title: row.title,
 		state: row.state,
 		url: row.url,
@@ -117,7 +144,18 @@ function pullRequestSummaryFromRow(row: any) {
 	};
 }
 
-async function readMeta(c: any) {
+function chunkItems<T>(items: T[], size: number): T[][] {
+	if (items.length === 0) {
+		return [];
+	}
+	const chunks: T[][] = [];
+	for (let index = 0; index < items.length; index += size) {
+		chunks.push(items.slice(index, index + size));
+	}
+	return chunks;
+}
+
+export async function readMeta(c: any): Promise<GithubMetaState> {
 	const row = await c.db.select().from(githubMeta).where(eq(githubMeta.id, META_ROW_ID)).get();
 	return {
 		connectedAccount: row?.connectedAccount ?? "",
@@ -126,10 +164,14 @@ async function readMeta(c: any) {
 		installationId: row?.installationId ?? null,
 		lastSyncLabel: row?.lastSyncLabel ?? "Waiting for first import",
 		lastSyncAt: row?.lastSyncAt ?? null,
+		syncGeneration: row?.syncGeneration ?? 0,
+		syncPhase: (row?.syncPhase ?? null) as GithubSyncPhase | null,
+		processedRepositoryCount: row?.processedRepositoryCount ?? 0,
+		totalRepositoryCount: row?.totalRepositoryCount ?? 0,
 	};
 }
 
-async function writeMeta(c: any, patch: Partial<Awaited<ReturnType<typeof readMeta>>>) {
+async function writeMeta(c: any, patch: Partial<GithubMetaState>) {
 	const current = await readMeta(c);
 	const next = {
 		...current,
@@ -145,6 +187,10 @@ async function writeMeta(c: any, patch: Partial<GithubMetaState>) {
+async function publishSyncProgress(c: any, patch: Partial<GithubMetaState>): Promise<GithubMetaState> {
+	const meta = await writeMeta(c, patch);
+	const organization = await getOrCreateOrganization(c, c.state.organizationId);
+	await organization.send(
+		organizationWorkflowQueueName("organization.command.github.sync_progress.apply"),
+		{
+			connectedAccount: meta.connectedAccount,
+			installationStatus: meta.installationStatus,
+			installationId: meta.installationId,
+			syncStatus: meta.syncStatus,
+			lastSyncLabel: meta.lastSyncLabel,
+			lastSyncAt: meta.lastSyncAt,
+			syncGeneration: meta.syncGeneration,
+			syncPhase: meta.syncPhase,
+			processedRepositoryCount: meta.processedRepositoryCount,
+			totalRepositoryCount: meta.totalRepositoryCount,
+		},
+		{ wait: false },
+	);
+	return meta;
+}
+
 async function getOrganizationContext(c: any, overrides?: FullSyncInput) {
+	// Try to read the org profile for fallback values, but don't require it.
+	// Webhook-triggered syncs can arrive before the user signs in and creates the
+	// org profile row. The webhook callers already pass the necessary overrides
+	// (connectedAccount, installationId, githubLogin, kind), so we can proceed
+	// without the profile as long as overrides cover the required fields.
 	const organizationHandle = await getOrCreateOrganization(c, c.state.organizationId);
 	const organizationState = await organizationHandle.getOrganizationShellStateIfInitialized({});
-	if (!organizationState) {
-		throw new Error(`Organization ${c.state.organizationId} is not initialized`);
+
+	// If the org profile doesn't exist and overrides don't provide enough context, fail.
+	if (!organizationState && !overrides?.connectedAccount) {
+		throw new Error(`Organization ${c.state.organizationId} is not initialized and no override context was provided`);
 	}
+
 	const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId);
 	return {
-		kind: overrides?.kind ?? organizationState.snapshot.kind,
-		githubLogin: overrides?.githubLogin ?? organizationState.githubLogin,
-		connectedAccount: overrides?.connectedAccount ?? organizationState.snapshot.github.connectedAccount ?? organizationState.githubLogin,
-		installationId: overrides?.installationId ?? organizationState.githubInstallationId ?? null,
+		kind: overrides?.kind ?? organizationState?.snapshot.kind,
+		githubLogin: overrides?.githubLogin ?? organizationState?.githubLogin,
+		connectedAccount: overrides?.connectedAccount ?? organizationState?.snapshot.github.connectedAccount ?? organizationState?.githubLogin,
+		installationId: overrides?.installationId ?? organizationState?.githubInstallationId ?? null,
 		installationStatus:
 			overrides?.installationStatus ??
-			organizationState.snapshot.github.installationStatus ??
-			(organizationState.snapshot.kind === "personal" ? "connected" : "reconnect_required"),
+			organizationState?.snapshot.github.installationStatus ??
+			(organizationState?.snapshot.kind === "personal" ? "connected" : "reconnect_required"),
 		accessToken: overrides?.accessToken ?? auth?.githubToken ?? null,
 	};
 }
 
-async function replaceRepositories(c: any, repositories: GithubRepositoryRecord[], updatedAt: number) {
-	await c.db.delete(githubRepositories).run();
+async function upsertRepositories(c: any, repositories: GithubRepositoryRecord[], updatedAt: number, syncGeneration: number) {
 	for (const repository of repositories) {
 		await c.db
 			.insert(githubRepositories)
@@ -194,30 +273,35 @@ async function replaceRepositories(c: any, repositories: GithubRepositoryRecord[
 				cloneUrl: repository.cloneUrl,
 				private: repository.private ? 1 : 0,
 				defaultBranch: repository.defaultBranch,
+				syncGeneration,
 				updatedAt,
 			})
+			.onConflictDoUpdate({
+				target: githubRepositories.repoId,
+				set: {
+					fullName: repository.fullName,
+					cloneUrl: repository.cloneUrl,
+					private: repository.private ? 1 : 0,
+					defaultBranch: repository.defaultBranch,
+					syncGeneration,
+					updatedAt,
+				},
+			})
 			.run();
 	}
 }
 
-async function replaceBranches(c: any, branches: GithubBranchRecord[], updatedAt: number) {
-	await c.db.delete(githubBranches).run();
-	for (const branch of branches) {
-		await c.db
-			.insert(githubBranches)
-			.values({
-				branchId: `${branch.repoId}:${branch.branchName}`,
-				repoId: branch.repoId,
-				branchName: branch.branchName,
-				commitSha: branch.commitSha,
-				updatedAt,
-			})
-			.run();
+async function sweepRepositories(c: any, syncGeneration: number) {
+	const rows = await c.db.select({ repoId: githubRepositories.repoId, syncGeneration: githubRepositories.syncGeneration }).from(githubRepositories).all();
+	for (const row of rows) {
+		if (row.syncGeneration === syncGeneration) {
+			continue;
+		}
+		await c.db.delete(githubRepositories).where(eq(githubRepositories.repoId, row.repoId)).run();
 	}
 }
 
-async function replaceMembers(c: any, members: GithubMemberRecord[], updatedAt: number) {
-	await c.db.delete(githubMembers).run();
+async function upsertMembers(c: any, members: GithubMemberRecord[], updatedAt: number, syncGeneration: number) {
 	for (const member of members) {
 		await c.db
 			.insert(githubMembers)
@@ -228,14 +312,36 @@ async function replaceMembers(c: any, members: GithubMemberRecord[], updatedAt:
 				email: member.email ?? null,
 				role: member.role ?? null,
 				state: member.state ?? "active",
+				syncGeneration,
 				updatedAt,
 			})
+			.onConflictDoUpdate({
+				target: githubMembers.memberId,
+				set: {
+					login: member.login,
+					displayName: member.name || member.login,
+					email: member.email ?? null,
+					role: member.role ?? null,
+					state: member.state ?? "active",
+					syncGeneration,
+					updatedAt,
+				},
+			})
 			.run();
 	}
 }
 
-async function replacePullRequests(c: any, pullRequests: GithubPullRequestRecord[]) {
-	await c.db.delete(githubPullRequests).run();
+async function sweepMembers(c: any, syncGeneration: number) {
+	const rows = await c.db.select({ memberId: githubMembers.memberId, syncGeneration: githubMembers.syncGeneration }).from(githubMembers).all();
+	for (const row of rows) {
+		if (row.syncGeneration === syncGeneration) {
+			continue;
+		}
+		await c.db.delete(githubMembers).where(eq(githubMembers.memberId, row.memberId)).run();
	}
+}
+
+async function upsertPullRequests(c: any, pullRequests: GithubPullRequestRecord[], syncGeneration: number) {
 	for (const pullRequest of pullRequests) {
 		await c.db
 			.insert(githubPullRequests)
@@ -252,19 +358,57 @@ async function replacePullRequests(c: any, pullRequests: GithubPullRequestRecord
 				baseRefName: pullRequest.baseRefName,
 				authorLogin: pullRequest.authorLogin ?? null,
 				isDraft: pullRequest.isDraft ? 1 : 0,
+				syncGeneration,
 				updatedAt: pullRequest.updatedAt,
 			})
+			.onConflictDoUpdate({
+				target: githubPullRequests.prId,
+				set: {
+					repoId: pullRequest.repoId,
+					repoFullName: pullRequest.repoFullName,
+					number: pullRequest.number,
+					title: pullRequest.title,
+					body: pullRequest.body ?? null,
+					state: pullRequest.state,
+					url: pullRequest.url,
+					headRefName: pullRequest.headRefName,
+					baseRefName: pullRequest.baseRefName,
+					authorLogin: pullRequest.authorLogin ?? null,
+					isDraft: pullRequest.isDraft ? 1 : 0,
+					syncGeneration,
+					updatedAt: pullRequest.updatedAt,
+				},
+			})
 			.run();
 	}
 }
 
-async function refreshTaskSummaryForBranch(c: any, repoId: string, branchName: string) {
+async function sweepPullRequests(c: any, syncGeneration: number) {
+	const rows = await c.db.select({ prId: githubPullRequests.prId, syncGeneration: githubPullRequests.syncGeneration }).from(githubPullRequests).all();
+	for (const row of rows) {
+		if (row.syncGeneration === syncGeneration) {
+			continue;
+		}
+		await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, row.prId)).run();
+	}
+}
+
+async function refreshTaskSummaryForBranch(c: any, repoId: string, branchName: string, pullRequest: ReturnType<typeof pullRequestSummaryFromRow> | null) {
+	const repositoryRecord = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, repoId)).get();
+	if (!repositoryRecord) {
+		return;
+	}
 	const organization = await getOrCreateOrganization(c, c.state.organizationId);
-	await organization.refreshTaskSummaryForGithubBranch({ repoId, branchName });
+	void organization
+		.send(
+			organizationWorkflowQueueName("organization.command.refreshTaskSummaryForBranch"),
+			{ repoId, branchName, pullRequest, repoName: repositoryRecord.fullName ?? undefined },
+			{ wait: false },
+		)
+		.catch(() => {});
 }
 
 async function emitPullRequestChangeEvents(c: any, beforeRows: any[], afterRows: any[]) {
-	const organization = await getOrCreateOrganization(c, c.state.organizationId);
 	const beforeById = new Map(beforeRows.map((row) => [row.prId, row]));
 	const afterById = new Map(afterRows.map((row) => [row.prId, row]));
 
@@ -283,24 +427,24 @@ async function emitPullRequestChangeEvents(c: any, beforeRows: any[], afterRows:
 		if (!changed) {
 			continue;
 		}
-		await organization.applyOpenPullRequestUpdate({
-			pullRequest: pullRequestSummaryFromRow(row),
-		});
-		await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName);
+		await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName, pullRequestSummaryFromRow(row));
 	}
 
 	for (const [prId, row] of beforeById) {
 		if (afterById.has(prId)) {
 			continue;
 		}
-		await organization.removeOpenPullRequest({ prId });
-		await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName);
+		await refreshTaskSummaryForBranch(c, row.repoId, row.headRefName, null);
 	}
 }
 
 async function autoArchiveTaskForClosedPullRequest(c: any, row: any) {
+	const repositoryRecord = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, row.repoId)).get();
+	if (!repositoryRecord) {
+		return;
+	}
 	const organization = await getOrCreateOrganization(c, c.state.organizationId);
-	const match = await organization.findTaskForGithubBranch({
+	const match = await organization.findTaskForBranch({
 		repoId: row.repoId,
 		branchName: row.headRefName,
 	});
@@ -309,7 +453,7 @@ async function autoArchiveTaskForClosedPullRequest(c: any, row: any) {
 	}
 	try {
 		const task = getTask(c, c.state.organizationId, row.repoId, match.taskId);
-		await task.archive({ reason: `PR ${String(row.state).toLowerCase()}` });
+		void task.send(taskWorkflowQueueName("task.command.archive"), { reason: `PR ${String(row.state).toLowerCase()}` }, { wait: false }).catch(() => {});
 	} catch {
 		// Best-effort only. Task summary refresh will still clear the PR state.
 	}
@@ -361,8 +505,7 @@ async function resolveMembers(c: any, context: Awaited<ReturnType<typeof getOrganizationContext>>) {
-async function resolvePullRequests(
-	c: any,
+async function listPullRequestsForRepositories(
 	context: Awaited<ReturnType<typeof getOrganizationContext>>,
 	repositories: GithubRepositoryRecord[],
 ): Promise<GithubPullRequestRecord[]> {
@@ -416,180 +559,269 @@
 	}));
 }
 
-async function listRepositoryBranchesForContext(
-	context: Awaited<ReturnType<typeof getOrganizationContext>>,
-	repository: GithubRepositoryRecord,
-): Promise<GithubBranchRecord[]> {
-	const { appShell } = getActorRuntimeContext();
-	let branches: Array<{ name: string; commitSha: string }> = [];
-
-	if (context.installationId != null) {
-		try {
-			branches = await appShell.github.listInstallationRepositoryBranches(context.installationId, repository.fullName);
-		} catch (error) {
-			if (!context.accessToken) {
-				throw error;
-			}
-		}
-	}
-
-	if (branches.length === 0 && context.accessToken) {
-		branches = await appShell.github.listUserRepositoryBranches(context.accessToken, repository.fullName);
-	}
-
-	const repoId = repoIdFromRemote(repository.cloneUrl);
-	return branches.map((branch) => ({
-		repoId,
-		branchName: branch.name,
-		commitSha: branch.commitSha,
-	}));
-}
-
-async function resolveBranches(
-	_c: any,
-	context: Awaited<ReturnType<typeof getOrganizationContext>>,
-	repositories: GithubRepositoryRecord[],
-): Promise<GithubBranchRecord[]> {
-	return (await Promise.all(repositories.map((repository) => listRepositoryBranchesForContext(context, repository)))).flat();
-}
-
-async function refreshRepositoryBranches(
-	c: any,
-	context: Awaited<ReturnType<typeof getOrganizationContext>>,
-	repository: GithubRepositoryRecord,
-	updatedAt: number,
-): Promise<void> {
-	const nextBranches = await listRepositoryBranchesForContext(context, repository);
-	await c.db
-		.delete(githubBranches)
-		.where(eq(githubBranches.repoId, repoIdFromRemote(repository.cloneUrl)))
-		.run();
-
-	for (const branch of nextBranches) {
-		await c.db
-			.insert(githubBranches)
-			.values({
-				branchId: `${branch.repoId}:${branch.branchName}`,
-				repoId: branch.repoId,
-				branchName: branch.branchName,
-				commitSha: branch.commitSha,
-				updatedAt,
-			})
-			.run();
-	}
-}
-
 async function readAllPullRequestRows(c: any) {
 	return await c.db.select().from(githubPullRequests).all();
 }
 
-async function runFullSync(c: any, input: FullSyncInput = {}) {
-	const startedAt = Date.now();
-	const beforeRows = await readAllPullRequestRows(c);
-	const context = await getOrganizationContext(c, input);
+/** Config returned by fullSyncSetup, passed to subsequent sync phases. */
+export interface FullSyncConfig {
+	syncGeneration: number;
+	startedAt: number;
+	totalRepositoryCount: number;
+	connectedAccount: string;
+	installationStatus: string;
+	installationId: number | null;
+	beforePrRows: any[];
+}
 
-	await writeMeta(c, {
+async function readRepositoriesFromDb(c: any): Promise<GithubRepositoryRecord[]> {
+	const rows = await c.db.select().from(githubRepositories).all();
+	return rows.map((r: any) => ({
+		fullName: r.fullName,
+		cloneUrl: r.cloneUrl,
+		private: Boolean(r.private),
+		defaultBranch: r.defaultBranch,
+	}));
+}
+
+/**
+ * Phase 1: Discover repositories and persist them.
+ * Returns the config needed by all subsequent phases.
+ */
+export async function fullSyncSetup(c: any, input: FullSyncInput = {}): Promise<FullSyncConfig> {
+	const startedAt = Date.now();
+	const beforePrRows = await readAllPullRequestRows(c);
+	const currentMeta = await readMeta(c);
+	const context = await getOrganizationContext(c, input);
+	const syncGeneration = currentMeta.syncGeneration + 1;
+
+	await publishSyncProgress(c, {
 		connectedAccount: context.connectedAccount,
 		installationStatus: context.installationStatus,
 		installationId: context.installationId,
 		syncStatus: "syncing",
 		lastSyncLabel: input.label?.trim() || "Syncing GitHub data...",
+		syncGeneration,
+		syncPhase: "discovering_repositories",
+		processedRepositoryCount: 0,
+		totalRepositoryCount: 0,
 	});
 
 	const repositories = await resolveRepositories(c, context);
-	const branches = await resolveBranches(c, context, repositories);
-	const members = await resolveMembers(c, context);
-	const pullRequests = await resolvePullRequests(c, context, repositories);
+	const totalRepositoryCount = repositories.length;
 
-	await replaceRepositories(c, repositories, startedAt);
-	await replaceBranches(c, branches, startedAt);
-	await replaceMembers(c, members, startedAt);
-	await replacePullRequests(c, pullRequests);
-
-	const organization = await getOrCreateOrganization(c, c.state.organizationId);
-	await organization.applyGithubDataProjection({
+	await publishSyncProgress(c, {
 		connectedAccount: context.connectedAccount,
 		installationStatus: context.installationStatus,
 		installationId: context.installationId,
-		syncStatus: "synced",
-		lastSyncLabel: repositories.length > 0 ? `Synced ${repositories.length} repositories` : "No repositories available",
-		lastSyncAt: startedAt,
-		repositories,
+		syncStatus: "syncing",
+		lastSyncLabel: totalRepositoryCount > 0 ? `Importing ${totalRepositoryCount} repositories...` : "No repositories available",
+		syncGeneration,
+		syncPhase: "syncing_repositories",
+		processedRepositoryCount: totalRepositoryCount,
+		totalRepositoryCount,
 	});
 
-	const meta = await writeMeta(c, {
-		connectedAccount: context.connectedAccount,
-		installationStatus: context.installationStatus,
-		installationId: context.installationId,
-		syncStatus: "synced",
-		lastSyncLabel: repositories.length > 0 ? `Synced ${repositories.length} repositories` : "No repositories available",
-		lastSyncAt: startedAt,
-	});
-
-	const afterRows = await readAllPullRequestRows(c);
-	await emitPullRequestChangeEvents(c, beforeRows, afterRows);
+	await upsertRepositories(c, repositories, startedAt, syncGeneration);
 
 	return {
-		...meta,
-		repositoryCount: repositories.length,
-		memberCount: members.length,
-		pullRequestCount: afterRows.length,
+		syncGeneration,
+		startedAt,
+		totalRepositoryCount,
+		connectedAccount: context.connectedAccount,
+		installationStatus: context.installationStatus,
+		installationId: context.installationId,
+		beforePrRows,
 	};
 }
 
-const GITHUB_DATA_QUEUE_NAMES = ["githubData.command.syncRepos"] as const;
-
-async function runGithubDataWorkflow(ctx: any): Promise<void> {
-	// Initial sync: if this actor was just created and has never synced,
-	// kick off the first full sync automatically.
-	await ctx.step({
-		name: "github-data-initial-sync",
-		timeout: 5 * 60_000,
-		run: async () => {
-			const meta = await readMeta(ctx);
-			if (meta.syncStatus !== "pending") {
-				return; // Already synced or syncing — skip initial sync
-			}
-			try {
-				await runFullSync(ctx, { label: "Importing repository catalog..." });
-			} catch (error) {
-				// Best-effort initial sync. Write the error to meta so the client
-				// sees the failure and can trigger a manual retry.
-				const currentMeta = await readMeta(ctx);
-				const organization = await getOrCreateOrganization(ctx, ctx.state.organizationId);
-				await organization.markOrganizationSyncFailed({
-					message: error instanceof Error ? error.message : "GitHub import failed",
-					installationStatus: currentMeta.installationStatus,
-				});
-			}
-		},
+/**
+ * Phase 2: Resolve, upsert, and sweep members.
+ */
+export async function fullSyncMembers(c: any, config: FullSyncConfig): Promise<void> {
+	await publishSyncProgress(c, {
+		connectedAccount: config.connectedAccount,
+		installationStatus: config.installationStatus,
+		installationId: config.installationId,
+		syncStatus: "syncing",
+		lastSyncLabel: "Syncing GitHub members...",
+		syncGeneration: config.syncGeneration,
+		syncPhase: "syncing_members",
+		processedRepositoryCount: config.totalRepositoryCount,
+		totalRepositoryCount: config.totalRepositoryCount,
 	});
 
-	// Command loop for explicit sync requests (reload, re-import, etc.)
+	const context = await getOrganizationContext(c, {
+		connectedAccount: config.connectedAccount,
+		installationStatus: config.installationStatus as any,
+		installationId: config.installationId,
+	});
+	const members = await resolveMembers(c, context);
+	await upsertMembers(c, members, config.startedAt, config.syncGeneration);
+	await sweepMembers(c, config.syncGeneration);
+}
+
+/**
+ * Phase 3 (per-batch): Fetch and upsert pull requests for one batch of repos.
+ * Returns true when all batches have been processed.
+ */
+export async function fullSyncPullRequestBatch(c: any, config: FullSyncConfig, batchIndex: number): Promise<boolean> {
+	const repos = await readRepositoriesFromDb(c);
+	const batches = chunkItems(repos, SYNC_REPOSITORY_BATCH_SIZE);
+	if (batchIndex >= batches.length) return true;
+
+	const batch = batches[batchIndex]!;
+	const context = await getOrganizationContext(c, {
+		connectedAccount: config.connectedAccount,
+		installationStatus: config.installationStatus as any,
+		installationId: config.installationId,
+	});
+	const batchPRs = await listPullRequestsForRepositories(context, batch);
+	await upsertPullRequests(c, batchPRs, config.syncGeneration);
+
+	const processedCount = Math.min((batchIndex + 1) * SYNC_REPOSITORY_BATCH_SIZE, repos.length);
+	await publishSyncProgress(c, {
+		connectedAccount: config.connectedAccount,
+		installationStatus: config.installationStatus,
+		installationId: config.installationId,
+		syncStatus: "syncing",
+		lastSyncLabel: `Synced pull requests for ${processedCount} of ${repos.length} repositories`,
+		syncGeneration: config.syncGeneration,
+		syncPhase: "syncing_pull_requests",
+		processedRepositoryCount: processedCount,
+		totalRepositoryCount: repos.length,
+	});
+
+	return false;
+}
+
+/**
+ * Phase 4: Sweep stale data, publish final state, emit PR change events.
+ */
+export async function fullSyncFinalize(c: any, config: FullSyncConfig): Promise<void> {
+	await sweepPullRequests(c, config.syncGeneration);
+	await sweepRepositories(c, config.syncGeneration);
+
+	await publishSyncProgress(c, {
+		connectedAccount: config.connectedAccount,
+		installationStatus: config.installationStatus,
+		installationId: config.installationId,
+		syncStatus: "synced",
+		lastSyncLabel: config.totalRepositoryCount > 0 ? `Synced ${config.totalRepositoryCount} repositories` : "No repositories available",
+		lastSyncAt: config.startedAt,
+		syncGeneration: config.syncGeneration,
+		syncPhase: null,
+		processedRepositoryCount: config.totalRepositoryCount,
+		totalRepositoryCount: config.totalRepositoryCount,
+	});
+
+	const afterRows = await readAllPullRequestRows(c);
+	await emitPullRequestChangeEvents(c, config.beforePrRows, afterRows);
+}
+
+/**
+ * Single-shot full sync: runs all phases (setup, members, PRs, finalize)
+ * using native JS loops. This must NOT use workflow primitives (step/loop/sleep)
+ * because it runs inside a workflow step. See workflow.ts for context on why
+ * sub-loops cause HistoryDivergedError.
+ */
+export async function runFullSync(c: any, input: FullSyncInput = {}): Promise<void> {
+	const config = await fullSyncSetup(c, input);
+
+	// Members
+	await fullSyncMembers(c, config);
+
+	// Pull requests — native loop over batches
+	for (let i = 0; ; i++) {
+		const done = await fullSyncPullRequestBatch(c, config, i);
+		if (done) break;
+	}
+
+	// Finalize
+	await fullSyncFinalize(c, config);
+}
+
+/**
+ * Error handler: publish error sync state when a full sync fails.
+ */
+export async function fullSyncError(c: any, error: unknown): Promise<void> {
+	const currentMeta = await readMeta(c);
+	const message = error instanceof Error ? error.message : "GitHub import failed";
+	await publishSyncProgress(c, {
+		connectedAccount: currentMeta.connectedAccount,
+		installationStatus: currentMeta.installationStatus,
+		installationId: currentMeta.installationId,
+		syncStatus: "error",
+		lastSyncLabel: message,
+		syncGeneration: currentMeta.syncGeneration,
+		syncPhase: null,
+		processedRepositoryCount: 0,
+		totalRepositoryCount: 0,
+	});
+}
+
+// ---------------------------------------------------------------------------
+// Workflow command loop
+// ---------------------------------------------------------------------------
+
+type GithubDataWorkflowHandler = (loopCtx: any, body: any) => Promise<unknown>;
+
+const GITHUB_DATA_COMMAND_HANDLERS: Record<GithubDataQueueName, GithubDataWorkflowHandler> = {
+	"githubData.command.syncRepos": async (c, body) => {
+		try {
+			await runFullSync(c, body);
+			return { ok: true };
+		} catch (error) {
+			try {
+				await fullSyncError(c, error);
+			} catch {
+				/* best effort */
+			}
+			throw error;
+		}
+	},
+	"githubData.command.handlePullRequestWebhook": async (c, body) => {
+		await handlePullRequestWebhookMutation(c, body);
+		return { ok: true };
+	},
+	"githubData.command.clearState": async (c, body) => {
+		await clearStateMutation(c, body);
+		return { ok: true };
+	},
+};
+
+async function runGithubDataWorkflow(ctx: any): Promise<void> {
 	await ctx.loop("github-data-command-loop", async (loopCtx: any) => {
 		const msg = await loopCtx.queue.next("next-github-data-command", {
 			names: [...GITHUB_DATA_QUEUE_NAMES],
 			completable: true,
 		});
+
 		if (!msg) {
 			return Loop.continue(undefined);
 		}
+		const handler = GITHUB_DATA_COMMAND_HANDLERS[msg.name as GithubDataQueueName];
+		if (!handler) {
+			logActorWarning("github-data", "unknown github-data command", { command: msg.name });
+			await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
+			return Loop.continue(undefined);
+		}
+
 		try {
-			if (msg.name === "githubData.command.syncRepos") {
-				await loopCtx.step({
-					name: "github-data-sync-repos",
-					timeout: 5 * 60_000,
-					run: async () => {
-						const body = msg.body as FullSyncInput;
-						await runFullSync(loopCtx, body);
-					},
-				});
-				await msg.complete({ ok: true });
-				return Loop.continue(undefined);
-			}
+			// Wrap in a step so c.state and c.db are accessible inside mutation functions.
+			const result = await loopCtx.step({
+				name: msg.name,
+				timeout: 10 * 60_000,
+				run: async () => handler(loopCtx, msg.body),
+			});
+			await msg.complete(result);
 		} catch (error) {
-			const message = error instanceof Error ? error.message : String(error);
+			const message = resolveErrorMessage(error);
+			logActorWarning("github-data", "github-data workflow command failed", {
+				command: msg.name,
+				error: message,
+			});
 			await msg.complete({ error: message }).catch(() => {});
 		}
 
@@ -603,22 +835,19 @@ export const githubData = actor({
 	options: {
 		name: "GitHub Data",
 		icon: "github",
-		actionTimeout: 5 * 60_000,
+		actionTimeout: 10 * 60_000,
 	},
 	createState: (_c, input: GithubDataInput) => ({
 		organizationId: input.organizationId,
 	}),
-	run: workflow(runGithubDataWorkflow),
 	actions: {
 		async getSummary(c) {
 			const repositories = await c.db.select().from(githubRepositories).all();
-			const branches = await c.db.select().from(githubBranches).all();
 			const members = await c.db.select().from(githubMembers).all();
 			const pullRequests = await c.db.select().from(githubPullRequests).all();
 			return {
 				...(await readMeta(c)),
 				repositoryCount: repositories.length,
-				branchCount: branches.length,
 				memberCount: members.length,
 				pullRequestCount: pullRequests.length,
 			};
@@ -649,324 +878,133 @@ export const githubData = actor({
 		},
 
-		async listPullRequestsForRepository(c, input: { repoId: string }) {
-			const rows = await c.db.select().from(githubPullRequests).where(eq(githubPullRequests.repoId, input.repoId)).all();
-			return rows.map(pullRequestSummaryFromRow);
-		},
-
-		async listBranchesForRepository(c, input: { repoId: string }) {
-			const rows = await c.db.select().from(githubBranches).where(eq(githubBranches.repoId, input.repoId)).all();
-			return rows
-				.map((row) => ({
-					branchName: row.branchName,
-					commitSha: row.commitSha,
-				}))
-				.sort((left, right) => left.branchName.localeCompare(right.branchName));
-		},
-
-		async listOpenPullRequests(c) {
-			const rows = await c.db.select().from(githubPullRequests).all();
-			return rows.map(pullRequestSummaryFromRow).sort((left, right) => right.updatedAtMs - left.updatedAtMs);
-		},
-
-		async getPullRequestForBranch(c, input: { repoId: string; branchName: string }) {
-			const rows = await c.db.select().from(githubPullRequests).where(eq(githubPullRequests.repoId, input.repoId)).all();
-			const match = rows.find((candidate) => candidate.headRefName === input.branchName) ?? null;
-			if (!match) {
-				return null;
-			}
-			return {
-				number: match.number,
-				status: match.isDraft ? ("draft" as const) : ("ready" as const),
-			};
-		},
-
-		async fullSync(c, input: FullSyncInput = {}) {
-			return await runFullSync(c, input);
-		},
-
-		async reloadOrganization(c) {
-			return await runFullSync(c, { label: "Reloading GitHub organization..." });
-		},
-
-		async reloadAllPullRequests(c) {
-			return await runFullSync(c, { label: "Reloading GitHub pull requests..." });
-		},
-
-		async reloadRepository(c, input: { repoId: string }) {
-			const context = await getOrganizationContext(c);
-			const current = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, input.repoId)).get();
-			if (!current) {
-				throw new Error(`Unknown GitHub repository: ${input.repoId}`);
-			}
-			const { appShell } = getActorRuntimeContext();
-			const repository =
-				context.installationId != null
-					? await appShell.github.getInstallationRepository(context.installationId, current.fullName)
-					: context.accessToken
-						? await appShell.github.getUserRepository(context.accessToken, current.fullName)
-						: null;
-			if (!repository) {
-				throw new Error(`Unable to reload repository: ${current.fullName}`);
-			}
-
-			const updatedAt = Date.now();
-			await c.db
-				.insert(githubRepositories)
-				.values({
-					repoId: input.repoId,
-					fullName: repository.fullName,
-					cloneUrl: repository.cloneUrl,
-					private: repository.private ? 1 : 0,
-					defaultBranch: repository.defaultBranch,
-					updatedAt,
-				})
-				.onConflictDoUpdate({
-					target: githubRepositories.repoId,
-					set: {
-						fullName: repository.fullName,
-						cloneUrl: repository.cloneUrl,
-						private: repository.private ? 1 : 0,
-						defaultBranch: repository.defaultBranch,
-						updatedAt,
-					},
-				})
-				.run();
-			await refreshRepositoryBranches(
-				c,
-				context,
-				{
-					fullName: repository.fullName,
-					cloneUrl: repository.cloneUrl,
-					private: repository.private,
-					defaultBranch: repository.defaultBranch,
-				},
-				updatedAt,
-			);
-
-			const organization = await getOrCreateOrganization(c, c.state.organizationId);
-			await organization.applyGithubRepositoryProjection({
-				repoId: input.repoId,
-				remoteUrl: repository.cloneUrl,
-			});
-			return {
-				repoId: input.repoId,
-				fullName: repository.fullName,
-				cloneUrl: repository.cloneUrl,
-				private: repository.private,
-				defaultBranch: repository.defaultBranch,
-			};
-		},
-
-		async reloadPullRequest(c, input: { repoId: string; prNumber: number }) {
-			const repository = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, input.repoId)).get();
-			if (!repository) {
-				throw new Error(`Unknown GitHub repository: ${input.repoId}`);
-			}
-			const context = await getOrganizationContext(c);
-			const { appShell } = getActorRuntimeContext();
-			const pullRequest =
-				context.installationId != null
-					? await appShell.github.getInstallationPullRequest(context.installationId, repository.fullName, input.prNumber)
-					: context.accessToken
-						?
await appShell.github.getUserPullRequest(context.accessToken, repository.fullName, input.prNumber) - : null; - if (!pullRequest) { - throw new Error(`Unable to reload pull request #${input.prNumber} for ${repository.fullName}`); - } - - const beforeRows = await readAllPullRequestRows(c); - const updatedAt = Date.now(); - const nextState = normalizePrStatus(pullRequest); - const prId = `${input.repoId}#${input.prNumber}`; - if (nextState === "CLOSED" || nextState === "MERGED") { - await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, prId)).run(); - } else { - await c.db - .insert(githubPullRequests) - .values({ - prId, - repoId: input.repoId, - repoFullName: repository.fullName, - number: pullRequest.number, - title: pullRequest.title, - body: pullRequest.body ?? null, - state: nextState, - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: pullRequest.isDraft ? 1 : 0, - updatedAt, - }) - .onConflictDoUpdate({ - target: githubPullRequests.prId, - set: { - title: pullRequest.title, - body: pullRequest.body ?? null, - state: nextState, - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: pullRequest.isDraft ? 1 : 0, - updatedAt, - }, - }) - .run(); - } - - const afterRows = await readAllPullRequestRows(c); - await emitPullRequestChangeEvents(c, beforeRows, afterRows); - const closed = afterRows.find((row) => row.prId === prId); - if (!closed && (nextState === "CLOSED" || nextState === "MERGED")) { - const previous = beforeRows.find((row) => row.prId === prId); - if (previous) { - await autoArchiveTaskForClosedPullRequest(c, { - ...previous, - state: nextState, - }); - } - } - return pullRequestSummaryFromRow( - afterRows.find((row) => row.prId === prId) ?? 
{ - prId, - repoId: input.repoId, - repoFullName: repository.fullName, - number: input.prNumber, - title: pullRequest.title, - state: nextState, - url: pullRequest.url, - headRefName: pullRequest.headRefName, - baseRefName: pullRequest.baseRefName, - authorLogin: pullRequest.authorLogin ?? null, - isDraft: pullRequest.isDraft ? 1 : 0, - updatedAt, - }, - ); - }, - - async clearState(c, input: ClearStateInput) { - const beforeRows = await readAllPullRequestRows(c); - await c.db.delete(githubPullRequests).run(); - await c.db.delete(githubBranches).run(); - await c.db.delete(githubRepositories).run(); - await c.db.delete(githubMembers).run(); - await writeMeta(c, { - connectedAccount: input.connectedAccount, - installationStatus: input.installationStatus, - installationId: input.installationId, - syncStatus: "pending", - lastSyncLabel: input.label, - lastSyncAt: null, - }); - - const organization = await getOrCreateOrganization(c, c.state.organizationId); - await organization.applyGithubDataProjection({ - connectedAccount: input.connectedAccount, - installationStatus: input.installationStatus, - installationId: input.installationId, - syncStatus: "pending", - lastSyncLabel: input.label, - lastSyncAt: null, - repositories: [], - }); - await emitPullRequestChangeEvents(c, beforeRows, []); - }, - - async handlePullRequestWebhook(c, input: PullRequestWebhookInput) { - const beforeRows = await readAllPullRequestRows(c); - const repoId = repoIdFromRemote(input.repository.cloneUrl); - const currentRepository = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, repoId)).get(); - const updatedAt = Date.now(); - const state = normalizePrStatus(input.pullRequest); - const prId = `${repoId}#${input.pullRequest.number}`; - - await c.db - .insert(githubRepositories) - .values({ - repoId, - fullName: input.repository.fullName, - cloneUrl: input.repository.cloneUrl, - private: input.repository.private ? 
1 : 0, - defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main", - updatedAt, - }) - .onConflictDoUpdate({ - target: githubRepositories.repoId, - set: { - fullName: input.repository.fullName, - cloneUrl: input.repository.cloneUrl, - private: input.repository.private ? 1 : 0, - defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main", - updatedAt, - }, - }) - .run(); - - if (state === "CLOSED" || state === "MERGED") { - await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, prId)).run(); - } else { - await c.db - .insert(githubPullRequests) - .values({ - prId, - repoId, - repoFullName: input.repository.fullName, - number: input.pullRequest.number, - title: input.pullRequest.title, - body: input.pullRequest.body ?? null, - state, - url: input.pullRequest.url, - headRefName: input.pullRequest.headRefName, - baseRefName: input.pullRequest.baseRefName, - authorLogin: input.pullRequest.authorLogin ?? null, - isDraft: input.pullRequest.isDraft ? 1 : 0, - updatedAt, - }) - .onConflictDoUpdate({ - target: githubPullRequests.prId, - set: { - title: input.pullRequest.title, - body: input.pullRequest.body ?? null, - state, - url: input.pullRequest.url, - headRefName: input.pullRequest.headRefName, - baseRefName: input.pullRequest.baseRefName, - authorLogin: input.pullRequest.authorLogin ?? null, - isDraft: input.pullRequest.isDraft ? 
1 : 0, - updatedAt, - }, - }) - .run(); - } - - await writeMeta(c, { - connectedAccount: input.connectedAccount, - installationStatus: input.installationStatus, - installationId: input.installationId, - syncStatus: "synced", - lastSyncLabel: "GitHub webhook received", - lastSyncAt: updatedAt, - }); - - const organization = await getOrCreateOrganization(c, c.state.organizationId); - await organization.applyGithubRepositoryProjection({ - repoId, - remoteUrl: input.repository.cloneUrl, - }); - - const afterRows = await readAllPullRequestRows(c); - await emitPullRequestChangeEvents(c, beforeRows, afterRows); - if (state === "CLOSED" || state === "MERGED") { - const previous = beforeRows.find((row) => row.prId === prId); - if (previous) { - await autoArchiveTaskForClosedPullRequest(c, { - ...previous, - state, - }); - } - } + const rows = await c.db + .select() + .from(githubPullRequests) + .where(inArray(githubPullRequests.state, ["OPEN", "DRAFT"])) + .all(); + return rows.map((row) => pullRequestSummaryFromRow(row)); }, }, + run: workflow(runGithubDataWorkflow), }); + +export async function clearStateMutation(c: any, input: ClearStateInput) { + const beforeRows = await readAllPullRequestRows(c); + const currentMeta = await readMeta(c); + await c.db.delete(githubPullRequests).run(); + await c.db.delete(githubRepositories).run(); + await c.db.delete(githubMembers).run(); + await writeMeta(c, { + connectedAccount: input.connectedAccount, + installationStatus: input.installationStatus, + installationId: input.installationId, + syncStatus: "pending", + lastSyncLabel: input.label, + lastSyncAt: null, + syncGeneration: currentMeta.syncGeneration, + syncPhase: null, + processedRepositoryCount: 0, + totalRepositoryCount: 0, + }); + + await emitPullRequestChangeEvents(c, beforeRows, []); +} + +export async function handlePullRequestWebhookMutation(c: any, input: PullRequestWebhookInput) { + const beforeRows = await readAllPullRequestRows(c); + const repoId = 
repoIdFromRemote(input.repository.cloneUrl); + const currentRepository = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, repoId)).get(); + const updatedAt = Date.now(); + const currentMeta = await readMeta(c); + const state = normalizePrStatus(input.pullRequest); + const prId = `${repoId}#${input.pullRequest.number}`; + + await c.db + .insert(githubRepositories) + .values({ + repoId, + fullName: input.repository.fullName, + cloneUrl: input.repository.cloneUrl, + private: input.repository.private ? 1 : 0, + defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main", + syncGeneration: currentMeta.syncGeneration, + updatedAt, + }) + .onConflictDoUpdate({ + target: githubRepositories.repoId, + set: { + fullName: input.repository.fullName, + cloneUrl: input.repository.cloneUrl, + private: input.repository.private ? 1 : 0, + defaultBranch: currentRepository?.defaultBranch ?? input.pullRequest.baseRefName ?? "main", + syncGeneration: currentMeta.syncGeneration, + updatedAt, + }, + }) + .run(); + + if (state === "CLOSED" || state === "MERGED") { + await c.db.delete(githubPullRequests).where(eq(githubPullRequests.prId, prId)).run(); + } else { + await c.db + .insert(githubPullRequests) + .values({ + prId, + repoId, + repoFullName: input.repository.fullName, + number: input.pullRequest.number, + title: input.pullRequest.title, + body: input.pullRequest.body ?? null, + state, + url: input.pullRequest.url, + headRefName: input.pullRequest.headRefName, + baseRefName: input.pullRequest.baseRefName, + authorLogin: input.pullRequest.authorLogin ?? null, + isDraft: input.pullRequest.isDraft ? 1 : 0, + syncGeneration: currentMeta.syncGeneration, + updatedAt, + }) + .onConflictDoUpdate({ + target: githubPullRequests.prId, + set: { + title: input.pullRequest.title, + body: input.pullRequest.body ?? 
null, + state, + url: input.pullRequest.url, + headRefName: input.pullRequest.headRefName, + baseRefName: input.pullRequest.baseRefName, + authorLogin: input.pullRequest.authorLogin ?? null, + isDraft: input.pullRequest.isDraft ? 1 : 0, + syncGeneration: currentMeta.syncGeneration, + updatedAt, + }, + }) + .run(); + } + + await publishSyncProgress(c, { + connectedAccount: input.connectedAccount, + installationStatus: input.installationStatus, + installationId: input.installationId, + syncStatus: "synced", + lastSyncLabel: "GitHub webhook received", + lastSyncAt: updatedAt, + syncPhase: null, + processedRepositoryCount: 0, + totalRepositoryCount: 0, + }); + + const afterRows = await readAllPullRequestRows(c); + await emitPullRequestChangeEvents(c, beforeRows, afterRows); + if (state === "CLOSED" || state === "MERGED") { + const previous = beforeRows.find((row) => row.prId === prId); + if (previous) { + await autoArchiveTaskForClosedPullRequest(c, { + ...previous, + state, + }); + } + } +} diff --git a/foundry/packages/backend/src/actors/github-data/workflow.ts b/foundry/packages/backend/src/actors/github-data/workflow.ts new file mode 100644 index 0000000..11ece75 --- /dev/null +++ b/foundry/packages/backend/src/actors/github-data/workflow.ts @@ -0,0 +1,73 @@ +// @ts-nocheck +import { logActorWarning, resolveErrorMessage } from "../logging.js"; + +// Dynamic imports to break circular dependency: index.ts imports workflow.ts, +// and workflow.ts needs functions from index.ts. +async function getIndexModule() { + return await import("./index.js"); +} + +export const GITHUB_DATA_QUEUE_NAMES = [ + "githubData.command.syncRepos", + "githubData.command.handlePullRequestWebhook", + "githubData.command.clearState", +] as const; + +export type GithubDataQueueName = (typeof GITHUB_DATA_QUEUE_NAMES)[number]; + +export function githubDataWorkflowQueueName(name: GithubDataQueueName): GithubDataQueueName { + return name; +} + +/** + * Plain run handler (no workflow engine). 
Drains the queue using `c.queue.iter()` + * with completable messages. This avoids the RivetKit bug where actors created + * from another actor's workflow context never start their `run: workflow(...)`. + */ +export async function runGithubDataCommandLoop(c: any): Promise { + for await (const msg of c.queue.iter({ names: [...GITHUB_DATA_QUEUE_NAMES], completable: true })) { + try { + if (msg.name === "githubData.command.syncRepos") { + try { + const { runFullSync } = await getIndexModule(); + await runFullSync(c, msg.body); + await msg.complete({ ok: true }); + } catch (error) { + const { fullSyncError } = await getIndexModule(); + try { + await fullSyncError(c, error); + } catch { + /* best effort */ + } + const message = error instanceof Error ? error.message : String(error); + await msg.complete({ error: message }).catch(() => {}); + } + continue; + } + + if (msg.name === "githubData.command.handlePullRequestWebhook") { + const { handlePullRequestWebhookMutation } = await getIndexModule(); + await handlePullRequestWebhookMutation(c, msg.body); + await msg.complete({ ok: true }); + continue; + } + + if (msg.name === "githubData.command.clearState") { + const { clearStateMutation } = await getIndexModule(); + await clearStateMutation(c, msg.body); + await msg.complete({ ok: true }); + continue; + } + + logActorWarning("githubData", "unknown queue message", { queueName: msg.name }); + await msg.complete({ error: `Unknown command: ${msg.name}` }); + } catch (error) { + const message = resolveErrorMessage(error); + logActorWarning("githubData", "github-data command failed", { + queueName: msg.name, + error: message, + }); + await msg.complete({ error: message }).catch(() => {}); + } + } +} diff --git a/foundry/packages/backend/src/actors/handles.ts b/foundry/packages/backend/src/actors/handles.ts index bd17fb0..5aa5715 100644 --- a/foundry/packages/backend/src/actors/handles.ts +++ b/foundry/packages/backend/src/actors/handles.ts @@ -1,4 +1,4 @@ -import { 
authUserKey, githubDataKey, historyKey, organizationKey, repositoryKey, taskKey, taskSandboxKey } from "./keys.js"; +import { auditLogKey, githubDataKey, organizationKey, taskKey, taskSandboxKey, userKey } from "./keys.js"; export function actorClient(c: any) { return c.client(); @@ -10,28 +10,14 @@ export async function getOrCreateOrganization(c: any, organizationId: string) { }); } -export async function getOrCreateAuthUser(c: any, userId: string) { - return await actorClient(c).authUser.getOrCreate(authUserKey(userId), { +export async function getOrCreateUser(c: any, userId: string) { + return await actorClient(c).user.getOrCreate(userKey(userId), { createWithInput: { userId }, }); } -export function getAuthUser(c: any, userId: string) { - return actorClient(c).authUser.get(authUserKey(userId)); -} - -export async function getOrCreateRepository(c: any, organizationId: string, repoId: string, remoteUrl: string) { - return await actorClient(c).repository.getOrCreate(repositoryKey(organizationId, repoId), { - createWithInput: { - organizationId, - repoId, - remoteUrl, - }, - }); -} - -export function getRepository(c: any, organizationId: string, repoId: string) { - return actorClient(c).repository.get(repositoryKey(organizationId, repoId)); +export function getUser(c: any, userId: string) { + return actorClient(c).user.get(userKey(userId)); } export function getTask(c: any, organizationId: string, repoId: string, taskId: string) { @@ -44,11 +30,10 @@ export async function getOrCreateTask(c: any, organizationId: string, repoId: st }); } -export async function getOrCreateHistory(c: any, organizationId: string, repoId: string) { - return await actorClient(c).history.getOrCreate(historyKey(organizationId, repoId), { +export async function getOrCreateAuditLog(c: any, organizationId: string) { + return await actorClient(c).auditLog.getOrCreate(auditLogKey(organizationId), { createWithInput: { organizationId, - repoId, }, }); } @@ -75,8 +60,8 @@ export async function 
getOrCreateTaskSandbox(c: any, organizationId: string, san }); } -export function selfHistory(c: any) { - return actorClient(c).history.getForId(c.actorId); +export function selfAuditLog(c: any) { + return actorClient(c).auditLog.getForId(c.actorId); } export function selfTask(c: any) { @@ -87,14 +72,14 @@ export function selfOrganization(c: any) { return actorClient(c).organization.getForId(c.actorId); } -export function selfRepository(c: any) { - return actorClient(c).repository.getForId(c.actorId); -} - -export function selfAuthUser(c: any) { - return actorClient(c).authUser.getForId(c.actorId); +export function selfUser(c: any) { + return actorClient(c).user.getForId(c.actorId); } export function selfGithubData(c: any) { return actorClient(c).githubData.getForId(c.actorId); } + +export function selfTaskSandbox(c: any) { + return actorClient(c).taskSandbox.getForId(c.actorId); +} diff --git a/foundry/packages/backend/src/actors/history/db/drizzle.config.ts b/foundry/packages/backend/src/actors/history/db/drizzle.config.ts deleted file mode 100644 index 3b1d8bd..0000000 --- a/foundry/packages/backend/src/actors/history/db/drizzle.config.ts +++ /dev/null @@ -1,6 +0,0 @@ -import { defineConfig } from "rivetkit/db/drizzle"; - -export default defineConfig({ - out: "./src/actors/history/db/drizzle", - schema: "./src/actors/history/db/schema.ts", -}); diff --git a/foundry/packages/backend/src/actors/history/index.ts b/foundry/packages/backend/src/actors/history/index.ts deleted file mode 100644 index fa1373b..0000000 --- a/foundry/packages/backend/src/actors/history/index.ts +++ /dev/null @@ -1,115 +0,0 @@ -// @ts-nocheck -import { and, desc, eq } from "drizzle-orm"; -import { actor, queue } from "rivetkit"; -import { Loop, workflow } from "rivetkit/workflow"; -import type { HistoryEvent } from "@sandbox-agent/foundry-shared"; -import { selfHistory } from "../handles.js"; -import { historyDb } from "./db/db.js"; -import { events } from "./db/schema.js"; - -export 
interface HistoryInput { - organizationId: string; - repoId: string; -} - -export interface AppendHistoryCommand { - kind: string; - taskId?: string; - branchName?: string; - payload: Record; -} - -export interface ListHistoryParams { - branch?: string; - taskId?: string; - limit?: number; -} - -const HISTORY_QUEUE_NAMES = ["history.command.append"] as const; - -async function appendHistoryRow(loopCtx: any, body: AppendHistoryCommand): Promise { - const now = Date.now(); - await loopCtx.db - .insert(events) - .values({ - taskId: body.taskId ?? null, - branchName: body.branchName ?? null, - kind: body.kind, - payloadJson: JSON.stringify(body.payload), - createdAt: now, - }) - .run(); -} - -async function runHistoryWorkflow(ctx: any): Promise { - await ctx.loop("history-command-loop", async (loopCtx: any) => { - const msg = await loopCtx.queue.next("next-history-command", { - names: [...HISTORY_QUEUE_NAMES], - completable: true, - }); - if (!msg) { - return Loop.continue(undefined); - } - - if (msg.name === "history.command.append") { - await loopCtx.step("append-history-row", async () => appendHistoryRow(loopCtx, msg.body as AppendHistoryCommand)); - await msg.complete({ ok: true }); - } - - return Loop.continue(undefined); - }); -} - -export const history = actor({ - db: historyDb, - queues: { - "history.command.append": queue(), - }, - options: { - name: "History", - icon: "database", - }, - createState: (_c, input: HistoryInput) => ({ - organizationId: input.organizationId, - repoId: input.repoId, - }), - actions: { - async append(c, command: AppendHistoryCommand): Promise { - const self = selfHistory(c); - await self.send("history.command.append", command, { wait: true, timeout: 15_000 }); - }, - - async list(c, params?: ListHistoryParams): Promise { - const whereParts = []; - if (params?.taskId) { - whereParts.push(eq(events.taskId, params.taskId)); - } - if (params?.branch) { - whereParts.push(eq(events.branchName, params.branch)); - } - - const base = c.db - 
.select({ - id: events.id, - taskId: events.taskId, - branchName: events.branchName, - kind: events.kind, - payloadJson: events.payloadJson, - createdAt: events.createdAt, - }) - .from(events); - - const rows = await (whereParts.length > 0 ? base.where(and(...whereParts)) : base) - .orderBy(desc(events.createdAt)) - .limit(params?.limit ?? 100) - .all(); - - return rows.map((row) => ({ - ...row, - organizationId: c.state.organizationId, - repoId: c.state.repoId, - })); - }, - }, - run: workflow(runHistoryWorkflow), -}); diff --git a/foundry/packages/backend/src/actors/index.ts b/foundry/packages/backend/src/actors/index.ts index 2f9e566..74ede4a 100644 --- a/foundry/packages/backend/src/actors/index.ts +++ b/foundry/packages/backend/src/actors/index.ts @@ -1,43 +1,38 @@ -import { authUser } from "./auth-user/index.js"; +import { user } from "./user/index.js"; import { setup } from "rivetkit"; import { githubData } from "./github-data/index.js"; import { task } from "./task/index.js"; -import { history } from "./history/index.js"; -import { repository } from "./repository/index.js"; +import { auditLog } from "./audit-log/index.js"; import { taskSandbox } from "./sandbox/index.js"; import { organization } from "./organization/index.js"; import { logger } from "../logging.js"; +import { resolveRunnerVersion } from "../config/runner-version.js"; -const RUNNER_VERSION = Math.floor(Date.now() / 1000); +const runnerVersion = resolveRunnerVersion(); export const registry = setup({ serverless: { basePath: "/v1/rivet", }, - runner: { - version: RUNNER_VERSION, - }, + runner: { version: runnerVersion }, logging: { baseLogger: logger, }, use: { - authUser, + user, organization, - repository, task, taskSandbox, - history, + auditLog, githubData, }, }); export * from "./context.js"; -export * from "./events.js"; -export * from "./auth-user/index.js"; +export * from "./audit-log/index.js"; +export * from "./user/index.js"; export * from "./github-data/index.js"; export * from 
"./task/index.js"; -export * from "./history/index.js"; export * from "./keys.js"; -export * from "./repository/index.js"; export * from "./sandbox/index.js"; export * from "./organization/index.js"; diff --git a/foundry/packages/backend/src/actors/keys.ts b/foundry/packages/backend/src/actors/keys.ts index 59e669e..03bd014 100644 --- a/foundry/packages/backend/src/actors/keys.ts +++ b/foundry/packages/backend/src/actors/keys.ts @@ -4,24 +4,21 @@ export function organizationKey(organizationId: string): ActorKey { return ["org", organizationId]; } -export function authUserKey(userId: string): ActorKey { +export function userKey(userId: string): ActorKey { return ["org", "app", "user", userId]; } -export function repositoryKey(organizationId: string, repoId: string): ActorKey { - return ["org", organizationId, "repository", repoId]; -} - export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey { - return ["org", organizationId, "repository", repoId, "task", taskId]; + return ["org", organizationId, "task", repoId, taskId]; } export function taskSandboxKey(organizationId: string, sandboxId: string): ActorKey { return ["org", organizationId, "sandbox", sandboxId]; } -export function historyKey(organizationId: string, repoId: string): ActorKey { - return ["org", organizationId, "repository", repoId, "history"]; +/** One audit log per org (not per repo) — see audit-log/index.ts for rationale. 
*/ +export function auditLogKey(organizationId: string): ActorKey { + return ["org", organizationId, "audit-log"]; } export function githubDataKey(organizationId: string): ActorKey { diff --git a/foundry/packages/backend/src/actors/logging.ts b/foundry/packages/backend/src/actors/logging.ts index afc7d37..a61685f 100644 --- a/foundry/packages/backend/src/actors/logging.ts +++ b/foundry/packages/backend/src/actors/logging.ts @@ -22,6 +22,16 @@ export function resolveErrorStack(error: unknown): string | undefined { return undefined; } +export function logActorInfo(scope: string, message: string, context?: Record): void { + logger.info( + { + scope, + ...(context ?? {}), + }, + message, + ); +} + export function logActorWarning(scope: string, message: string, context?: Record): void { logger.warn( { diff --git a/foundry/packages/backend/src/actors/organization/actions.ts b/foundry/packages/backend/src/actors/organization/actions.ts index 70da62b..2298cd9 100644 --- a/foundry/packages/backend/src/actors/organization/actions.ts +++ b/foundry/packages/backend/src/actors/organization/actions.ts @@ -1,78 +1,30 @@ // @ts-nocheck import { desc, eq } from "drizzle-orm"; -import { Loop } from "rivetkit/workflow"; import type { - CreateTaskInput, - HistoryEvent, - HistoryQueryInput, - ListTasksInput, - SandboxProviderId, - RepoOverview, RepoRecord, - StarSandboxAgentRepoInput, - StarSandboxAgentRepoResult, - SwitchResult, - TaskRecord, - TaskSummary, - TaskWorkbenchChangeModelInput, - TaskWorkbenchCreateTaskInput, - TaskWorkbenchDiffInput, - TaskWorkbenchRenameInput, - TaskWorkbenchRenameSessionInput, - TaskWorkbenchSelectInput, - TaskWorkbenchSetSessionUnreadInput, - TaskWorkbenchSendMessageInput, - TaskWorkbenchSessionInput, - TaskWorkbenchUpdateDraftInput, - WorkbenchOpenPrSummary, - WorkbenchRepositorySummary, - WorkbenchSessionSummary, - WorkbenchTaskSummary, + WorkspaceRepositorySummary, + WorkspaceTaskSummary, OrganizationEvent, + OrganizationGithubSummary, 
OrganizationSummarySnapshot, OrganizationUseInput, } from "@sandbox-agent/foundry-shared"; -import { getActorRuntimeContext } from "../context.js"; -import { getGithubData, getOrCreateGithubData, getTask, getOrCreateHistory, getOrCreateRepository, selfOrganization } from "../handles.js"; import { logActorWarning, resolveErrorMessage } from "../logging.js"; -import { defaultSandboxProviderId } from "../../sandbox-config.js"; -import { repoIdFromRemote } from "../../services/repo.js"; -import { resolveOrganizationGithubAuth } from "../../services/github-auth.js"; -import { organizationProfile, taskLookup, repos, taskSummaries } from "./db/schema.js"; -import { agentTypeForModel } from "../task/workbench.js"; -import { expectQueueResponse } from "../../services/queue.js"; -import { organizationAppActions } from "./app-shell.js"; +import { getOrCreateGithubData } from "../handles.js"; +import { organizationProfile, taskSummaries } from "./db/schema.js"; +import { organizationAppActions } from "./actions/app.js"; +import { organizationBetterAuthActions } from "./actions/better-auth.js"; +import { organizationOnboardingActions } from "./actions/onboarding.js"; +import { organizationGithubActions } from "./actions/github.js"; +import { organizationShellActions } from "./actions/organization.js"; +import { organizationTaskActions } from "./actions/tasks.js"; +import { updateOrganizationShellProfileMutation } from "./app-shell.js"; interface OrganizationState { organizationId: string; } -interface GetTaskInput { - organizationId: string; - taskId: string; -} - -interface TaskProxyActionInput extends GetTaskInput { - reason?: string; -} - -interface RepoOverviewInput { - organizationId: string; - repoId: string; -} - -const ORGANIZATION_QUEUE_NAMES = ["organization.command.createTask", "organization.command.syncGithubSession"] as const; -const SANDBOX_AGENT_REPO = "rivet-dev/sandbox-agent"; - -type OrganizationQueueName = (typeof ORGANIZATION_QUEUE_NAMES)[number]; - -export 
{ ORGANIZATION_QUEUE_NAMES }; - -export function organizationWorkflowQueueName(name: OrganizationQueueName): OrganizationQueueName { - return name; -} - -const ORGANIZATION_PROFILE_ROW_ID = "profile"; +const ORGANIZATION_PROFILE_ROW_ID = 1; function assertOrganization(c: { state: OrganizationState }, organizationId: string): void { if (organizationId !== c.state.organizationId) { @@ -80,64 +32,6 @@ function assertOrganization(c: { state: OrganizationState }, organizationId: str } } -async function resolveRepoId(c: any, taskId: string): Promise { - const row = await c.db.select({ repoId: taskLookup.repoId }).from(taskLookup).where(eq(taskLookup.taskId, taskId)).get(); - - if (!row) { - throw new Error(`Unknown task: ${taskId} (not in lookup)`); - } - - return row.repoId; -} - -async function upsertTaskLookupRow(c: any, taskId: string, repoId: string): Promise { - await c.db - .insert(taskLookup) - .values({ - taskId, - repoId, - }) - .onConflictDoUpdate({ - target: taskLookup.taskId, - set: { repoId }, - }) - .run(); -} - -function parseJsonValue(value: string | null | undefined, fallback: T): T { - if (!value) { - return fallback; - } - - try { - return JSON.parse(value) as T; - } catch { - return fallback; - } -} - -async function collectAllTaskSummaries(c: any): Promise { - const repoRows = await c.db.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all(); - - const all: TaskSummary[] = []; - for (const row of repoRows) { - try { - const repository = await getOrCreateRepository(c, c.state.organizationId, row.repoId, row.remoteUrl); - const snapshot = await repository.listTaskSummaries({ includeArchived: true }); - all.push(...snapshot); - } catch (error) { - logActorWarning("organization", "failed collecting tasks for repo", { - organizationId: c.state.organizationId, - repoId: row.repoId, - error: resolveErrorMessage(error), - }); - } - } - - all.sort((a, b) => b.updatedAt - a.updatedAt); - return all; -} - 
 function repoLabelFromRemote(remoteUrl: string): string {
 	try {
 		const url = new URL(remoteUrl.startsWith("http") ? remoteUrl : `https://${remoteUrl}`);
@@ -152,34 +46,43 @@ function repoLabelFromRemote(remoteUrl: string): string {
 	return remoteUrl;
 }
 
-function buildRepoSummary(repoRow: { repoId: string; remoteUrl: string; updatedAt: number }, taskRows: WorkbenchTaskSummary[]): WorkbenchRepositorySummary {
-	const repoTasks = taskRows.filter((task) => task.repoId === repoRow.repoId);
-	const latestActivityMs = repoTasks.reduce((latest, task) => Math.max(latest, task.updatedAtMs), repoRow.updatedAt);
-
+function buildGithubSummary(profile: any, importedRepoCount: number): OrganizationGithubSummary {
 	return {
-		id: repoRow.repoId,
-		label: repoLabelFromRemote(repoRow.remoteUrl),
-		taskCount: repoTasks.length,
-		latestActivityMs,
+		connectedAccount: profile?.githubConnectedAccount ?? "",
+		installationStatus: profile?.githubInstallationStatus ?? "install_required",
+		syncStatus: profile?.githubSyncStatus ?? "pending",
+		importedRepoCount,
+		lastSyncLabel: profile?.githubLastSyncLabel ?? "Waiting for first import",
+		lastSyncAt: profile?.githubLastSyncAt ?? null,
+		lastWebhookAt: profile?.githubLastWebhookAt ?? null,
+		lastWebhookEvent: profile?.githubLastWebhookEvent ?? "",
+		syncGeneration: profile?.githubSyncGeneration ?? 0,
+		syncPhase: profile?.githubSyncPhase ?? null,
+		processedRepositoryCount: profile?.githubProcessedRepositoryCount ?? 0,
+		totalRepositoryCount: profile?.githubTotalRepositoryCount ?? 0,
 	};
 }
 
-function taskSummaryRowFromSummary(taskSummary: WorkbenchTaskSummary) {
-	return {
-		taskId: taskSummary.id,
-		repoId: taskSummary.repoId,
-		title: taskSummary.title,
-		status: taskSummary.status,
-		repoName: taskSummary.repoName,
-		updatedAtMs: taskSummary.updatedAtMs,
-		branch: taskSummary.branch,
-		pullRequestJson: JSON.stringify(taskSummary.pullRequest),
-		sessionsSummaryJson: JSON.stringify(taskSummary.sessionsSummary),
-	};
-}
+/**
+ * Reads the organization sidebar snapshot from local tables only — no fan-out
+ * to child actors. Task summaries are organization-owned and updated via push
+ * from task actors.
+ */
+async function getOrganizationSummarySnapshot(c: any): Promise {
+	const profile = await c.db.select().from(organizationProfile).where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID)).get();
 
-function taskSummaryFromRow(row: any): WorkbenchTaskSummary {
-	return {
+	// Fetch repos + open PRs from github-data actor (single actor, not fan-out)
+	let repoRows: Array<{ repoId: string; fullName: string; cloneUrl: string; private: boolean; defaultBranch: string }> = [];
+	let openPullRequests: any[] = [];
+	try {
+		const githubData = await getOrCreateGithubData(c, c.state.organizationId);
+		[repoRows, openPullRequests] = await Promise.all([githubData.listRepositories({}), githubData.listOpenPullRequests({})]);
+	} catch {
+		// github-data actor may not exist yet
+	}
+
+	const summaryRows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all();
+	const summaries = summaryRows.map((row) => ({
 		id: row.taskId,
 		repoId: row.repoId,
 		title: row.title,
@@ -187,219 +90,60 @@ function taskSummaryFromRow(row: any): WorkbenchTaskSummary {
 		repoName: row.repoName,
 		updatedAtMs: row.updatedAtMs,
 		branch: row.branch ?? null,
-		pullRequest: parseJsonValue(row.pullRequestJson, null),
-		sessionsSummary: parseJsonValue(row.sessionsSummaryJson, []),
-	};
-}
-
-async function listOpenPullRequestsSnapshot(c: any, taskRows: WorkbenchTaskSummary[]): Promise<WorkbenchOpenPrSummary[]> {
-	const githubData = getGithubData(c, c.state.organizationId);
-	const openPullRequests = await githubData.listOpenPullRequests({}).catch(() => []);
-	const claimedBranches = new Set(taskRows.filter((task) => task.branch).map((task) => `${task.repoId}:${task.branch}`));
-
-	return openPullRequests.filter((pullRequest: WorkbenchOpenPrSummary) => !claimedBranches.has(`${pullRequest.repoId}:${pullRequest.headRefName}`));
-}
-
-async function reconcileWorkbenchProjection(c: any): Promise {
-	const repoRows = await c.db
-		.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl, updatedAt: repos.updatedAt })
-		.from(repos)
-		.orderBy(desc(repos.updatedAt))
-		.all();
-
-	const taskRows: WorkbenchTaskSummary[] = [];
-	for (const row of repoRows) {
-		try {
-			const repository = await getOrCreateRepository(c, c.state.organizationId, row.repoId, row.remoteUrl);
-			const summaries = await repository.listTaskSummaries({ includeArchived: true });
-			for (const summary of summaries) {
-				try {
-					await upsertTaskLookupRow(c, summary.taskId, row.repoId);
-					const task = getTask(c, c.state.organizationId, row.repoId, summary.taskId);
-					const taskSummary = await task.getTaskSummary({});
-					taskRows.push(taskSummary);
-					await c.db
-						.insert(taskSummaries)
-						.values(taskSummaryRowFromSummary(taskSummary))
-						.onConflictDoUpdate({
-							target: taskSummaries.taskId,
-							set: taskSummaryRowFromSummary(taskSummary),
-						})
-						.run();
-				} catch (error) {
-					logActorWarning("organization", "failed collecting task summary during reconciliation", {
-						organizationId: c.state.organizationId,
-						repoId: row.repoId,
-						taskId: summary.taskId,
-						error: resolveErrorMessage(error),
-					});
-				}
-			}
-		} catch (error) {
-			logActorWarning("organization", "failed collecting repo during workbench reconciliation", {
-				organizationId: c.state.organizationId,
-				repoId: row.repoId,
-				error: resolveErrorMessage(error),
-			});
-		}
-	}
-
-	taskRows.sort((left, right) => right.updatedAtMs - left.updatedAtMs);
-	return {
-		organizationId: c.state.organizationId,
-		repos: repoRows.map((row) => buildRepoSummary(row, taskRows)).sort((left, right) => right.latestActivityMs - left.latestActivityMs),
-		taskSummaries: taskRows,
-		openPullRequests: await listOpenPullRequestsSnapshot(c, taskRows),
-	};
-}
-
-async function requireWorkbenchTask(c: any, taskId: string) {
-	const repoId = await resolveRepoId(c, taskId);
-	return getTask(c, c.state.organizationId, repoId, taskId);
-}
-
-/**
- * Reads the organization sidebar snapshot from the organization actor's local SQLite
- * plus the org-scoped GitHub actor for open PRs. Task actors still push
- * summary updates into `task_summaries`, so the hot read path stays bounded.
- */
-async function getOrganizationSummarySnapshot(c: any): Promise {
-	const repoRows = await c.db
-		.select({
-			repoId: repos.repoId,
-			remoteUrl: repos.remoteUrl,
-			updatedAt: repos.updatedAt,
-		})
-		.from(repos)
-		.orderBy(desc(repos.updatedAt))
-		.all();
-	const taskRows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all();
-	const summaries = taskRows.map(taskSummaryFromRow);
+		pullRequest: row.pullRequestJson
+			? (() => {
+					try {
+						return JSON.parse(row.pullRequestJson);
+					} catch {
+						return null;
+					}
+				})()
+			: null,
+		sessionsSummary: row.sessionsSummaryJson
+			? (() => {
+					try {
+						return JSON.parse(row.sessionsSummaryJson);
+					} catch {
+						return [];
+					}
+				})()
+			: [],
+	}));
 
 	return {
 		organizationId: c.state.organizationId,
-		repos: repoRows.map((row) => buildRepoSummary(row, summaries)).sort((left, right) => right.latestActivityMs - left.latestActivityMs),
+		github: buildGithubSummary(profile, repoRows.length),
+		repos: repoRows
+			.map((repo) => {
+				const repoTasks = summaries.filter((t) => t.repoId === repo.repoId);
+				const latestTaskMs = repoTasks.reduce((latest, t) => Math.max(latest, t.updatedAtMs), 0);
+				return {
+					id: repo.repoId,
+					label: repoLabelFromRemote(repo.cloneUrl),
+					taskCount: repoTasks.length,
+					latestActivityMs: latestTaskMs || Date.now(),
+				};
+			})
+			.sort((a, b) => b.latestActivityMs - a.latestActivityMs),
 		taskSummaries: summaries,
-		openPullRequests: await listOpenPullRequestsSnapshot(c, summaries),
+		openPullRequests,
 	};
 }
 
-async function broadcastRepoSummary(
-	c: any,
-	type: "repoAdded" | "repoUpdated",
-	repoRow: { repoId: string; remoteUrl: string; updatedAt: number },
-): Promise<void> {
-	const matchingTaskRows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoRow.repoId)).all();
-	const repo = buildRepoSummary(repoRow, matchingTaskRows.map(taskSummaryFromRow));
-	c.broadcast("organizationUpdated", { type, repo } satisfies OrganizationEvent);
-}
-
-async function createTaskMutation(c: any, input: CreateTaskInput): Promise {
-	assertOrganization(c, input.organizationId);
-
-	const { config } = getActorRuntimeContext();
-	const sandboxProviderId = input.sandboxProviderId ?? defaultSandboxProviderId(config);
-
-	const repoId = input.repoId;
-	const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, repoId)).get();
-	if (!repoRow) {
-		throw new Error(`Unknown repo: ${repoId}`);
-	}
-	const remoteUrl = repoRow.remoteUrl;
-
-	const repository = await getOrCreateRepository(c, c.state.organizationId, repoId, remoteUrl);
-
-	const created = await repository.createTask({
-		task: input.task,
-		sandboxProviderId,
-		agentType: input.agentType ?? null,
-		explicitTitle: input.explicitTitle ?? null,
-		explicitBranchName: input.explicitBranchName ?? null,
-		onBranch: input.onBranch ?? null,
-	});
-
-	await c.db
-		.insert(taskLookup)
-		.values({
-			taskId: created.taskId,
-			repoId,
-		})
-		.onConflictDoUpdate({
-			target: taskLookup.taskId,
-			set: { repoId },
-		})
-		.run();
-
-	try {
-		const task = getTask(c, c.state.organizationId, repoId, created.taskId);
-		await organizationActions.applyTaskSummaryUpdate(c, {
-			taskSummary: await task.getTaskSummary({}),
-		});
-	} catch (error) {
-		logActorWarning("organization", "failed seeding task summary after task creation", {
-			organizationId: c.state.organizationId,
-			repoId,
-			taskId: created.taskId,
-			error: resolveErrorMessage(error),
-		});
-	}
-
-	return created;
-}
-
-export async function runOrganizationWorkflow(ctx: any): Promise<void> {
-	await ctx.loop("organization-command-loop", async (loopCtx: any) => {
-		const msg = await loopCtx.queue.next("next-organization-command", {
-			names: [...ORGANIZATION_QUEUE_NAMES],
-			completable: true,
-		});
-		if (!msg) {
-			return Loop.continue(undefined);
-		}
-
-		try {
-			if (msg.name === "organization.command.createTask") {
-				const result = await loopCtx.step({
-					name: "organization-create-task",
-					timeout: 5 * 60_000,
-					run: async () => createTaskMutation(loopCtx, msg.body as CreateTaskInput),
-				});
-				await msg.complete(result);
-				return Loop.continue(undefined);
-			}
-
-			if (msg.name === "organization.command.syncGithubSession") {
-				await loopCtx.step({
-					name: "organization-sync-github-session",
-					timeout: 60_000,
-					run: async () => {
-						const { syncGithubOrganizations } = await import("./app-shell.js");
-						await syncGithubOrganizations(loopCtx, msg.body as { sessionId: string; accessToken: string });
-					},
-				});
-				await msg.complete({ ok: true });
-				return Loop.continue(undefined);
-			}
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("organization", "organization workflow command failed", {
-				queueName: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch((completeError: unknown) => {
-				logActorWarning("organization", "organization workflow failed completing error response", {
-					queueName: msg.name,
-					error: resolveErrorMessage(completeError),
-				});
-			});
-		}
-
-		return Loop.continue(undefined);
-	});
+export async function refreshOrganizationSnapshotMutation(c: any): Promise<void> {
+	c.broadcast("organizationUpdated", {
+		type: "organizationUpdated",
+		snapshot: await getOrganizationSummarySnapshot(c),
+	} satisfies OrganizationEvent);
 }
 
 export const organizationActions = {
+	...organizationBetterAuthActions,
+	...organizationGithubActions,
+	...organizationOnboardingActions,
+	...organizationShellActions,
 	...organizationAppActions,
+	...organizationTaskActions,
 	async useOrganization(c: any, input: OrganizationUseInput): Promise<{ organizationId: string }> {
 		assertOrganization(c, input.organizationId);
 		return { organizationId: c.state.organizationId };
@@ -407,482 +151,103 @@ export const organizationActions = {
 
 	async listRepos(c: any, input: OrganizationUseInput): Promise {
 		assertOrganization(c, input.organizationId);
-
-		const rows = await c.db
-			.select({
-				repoId: repos.repoId,
-				remoteUrl: repos.remoteUrl,
-				createdAt: repos.createdAt,
-				updatedAt: repos.updatedAt,
-			})
-			.from(repos)
-			.orderBy(desc(repos.updatedAt))
-			.all();
-
-		return rows.map((row) => ({
-			organizationId: c.state.organizationId,
-			repoId: row.repoId,
-			remoteUrl: row.remoteUrl,
-			createdAt: row.createdAt,
-			updatedAt: row.updatedAt,
-		}));
-	},
-
-	async createTask(c: any, input: CreateTaskInput): Promise {
-		const self = selfOrganization(c);
-		return expectQueueResponse(
-			await self.send(organizationWorkflowQueueName("organization.command.createTask"), input, {
-				wait: true,
-				timeout: 10_000,
-			}),
-		);
-	},
-
-	async starSandboxAgentRepo(c: any, input: StarSandboxAgentRepoInput): Promise {
-		assertOrganization(c, input.organizationId);
-		const { driver } = getActorRuntimeContext();
-		const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId);
-		await driver.github.starRepository(SANDBOX_AGENT_REPO, {
-			githubToken: auth?.githubToken ?? null,
-		});
-		return {
-			repo: SANDBOX_AGENT_REPO,
-			starredAt: Date.now(),
-		};
-	},
-
-	/**
-	 * Called by task actors when their summary-level state changes.
-	 * This is the write path for the local materialized projection; clients read
-	 * the projection via `getOrganizationSummary`, but only task actors should push
-	 * rows into it.
-	 */
-	async applyTaskSummaryUpdate(c: any, input: { taskSummary: WorkbenchTaskSummary }): Promise<void> {
-		await c.db
-			.insert(taskSummaries)
-			.values(taskSummaryRowFromSummary(input.taskSummary))
-			.onConflictDoUpdate({
-				target: taskSummaries.taskId,
-				set: taskSummaryRowFromSummary(input.taskSummary),
-			})
-			.run();
-		c.broadcast("organizationUpdated", { type: "taskSummaryUpdated", taskSummary: input.taskSummary } satisfies OrganizationEvent);
-	},
-
-	async removeTaskSummary(c: any, input: { taskId: string }): Promise<void> {
-		await c.db.delete(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).run();
-		c.broadcast("organizationUpdated", { type: "taskRemoved", taskId: input.taskId } satisfies OrganizationEvent);
-	},
-
-	async findTaskForGithubBranch(c: any, input: { repoId: string; branchName: string }): Promise<{ taskId: string | null }> {
-		const summaries = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, input.repoId)).all();
-		const existing = summaries.find((summary) => summary.branch === input.branchName);
-		return { taskId: existing?.taskId ?? null };
-	},
-
-	async refreshTaskSummaryForGithubBranch(c: any, input: { repoId: string; branchName: string }): Promise<void> {
-		const summaries = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, input.repoId)).all();
-		const matches = summaries.filter((summary) => summary.branch === input.branchName);
-
-		for (const summary of matches) {
-			try {
-				const task = getTask(c, c.state.organizationId, input.repoId, summary.taskId);
-				await organizationActions.applyTaskSummaryUpdate(c, {
-					taskSummary: await task.getTaskSummary({}),
-				});
-			} catch (error) {
-				logActorWarning("organization", "failed refreshing task summary for GitHub branch", {
-					organizationId: c.state.organizationId,
-					repoId: input.repoId,
-					branchName: input.branchName,
-					taskId: summary.taskId,
-					error: resolveErrorMessage(error),
-				});
-			}
+		try {
+			const githubData = await getOrCreateGithubData(c, c.state.organizationId);
+			const rows = await githubData.listRepositories({});
+			return rows.map((row: any) => ({
+				organizationId: c.state.organizationId,
+				repoId: row.repoId,
+				remoteUrl: row.cloneUrl,
+				createdAt: row.updatedAt ?? Date.now(),
+				updatedAt: row.updatedAt ?? Date.now(),
+			}));
+		} catch {
+			return [];
 		}
 	},
 
-	async applyOpenPullRequestUpdate(c: any, input: { pullRequest: WorkbenchOpenPrSummary }): Promise<void> {
-		const summaries = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, input.pullRequest.repoId)).all();
-		if (summaries.some((summary) => summary.branch === input.pullRequest.headRefName)) {
-			return;
-		}
-		c.broadcast("organizationUpdated", { type: "pullRequestUpdated", pullRequest: input.pullRequest } satisfies OrganizationEvent);
-	},
-
-	async removeOpenPullRequest(c: any, input: { prId: string }): Promise<void> {
-		c.broadcast("organizationUpdated", { type: "pullRequestRemoved", prId: input.prId } satisfies OrganizationEvent);
-	},
-
-	async applyGithubRepositoryProjection(c: any, input: { repoId: string; remoteUrl: string }): Promise<void> {
-		const now = Date.now();
-		const existing = await c.db.select({ repoId: repos.repoId }).from(repos).where(eq(repos.repoId, input.repoId)).get();
-		await c.db
-			.insert(repos)
-			.values({
-				repoId: input.repoId,
-				remoteUrl: input.remoteUrl,
-				createdAt: now,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: repos.repoId,
-				set: {
-					remoteUrl: input.remoteUrl,
-					updatedAt: now,
-				},
-			})
-			.run();
-		await broadcastRepoSummary(c, existing ? "repoUpdated" : "repoAdded", {
-			repoId: input.repoId,
-			remoteUrl: input.remoteUrl,
-			updatedAt: now,
-		});
-	},
-
-	async applyGithubDataProjection(
-		c: any,
-		input: {
-			connectedAccount: string;
-			installationStatus: string;
-			installationId: number | null;
-			syncStatus: string;
-			lastSyncLabel: string;
-			lastSyncAt: number | null;
-			repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>;
-		},
-	): Promise<void> {
-		const existingRepos = await c.db.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl, updatedAt: repos.updatedAt }).from(repos).all();
-		const existingById = new Map(existingRepos.map((repo) => [repo.repoId, repo]));
-		const nextRepoIds = new Set<string>();
-		const now = Date.now();
-
-		for (const repository of input.repositories) {
-			const repoId = repoIdFromRemote(repository.cloneUrl);
-			nextRepoIds.add(repoId);
-			await c.db
-				.insert(repos)
-				.values({
-					repoId,
-					remoteUrl: repository.cloneUrl,
-					createdAt: now,
-					updatedAt: now,
-				})
-				.onConflictDoUpdate({
-					target: repos.repoId,
-					set: {
-						remoteUrl: repository.cloneUrl,
-						updatedAt: now,
-					},
-				})
-				.run();
-			await broadcastRepoSummary(c, existingById.has(repoId) ? "repoUpdated" : "repoAdded", {
-				repoId,
-				remoteUrl: repository.cloneUrl,
-				updatedAt: now,
-			});
-		}
-
-		for (const repo of existingRepos) {
-			if (nextRepoIds.has(repo.repoId)) {
-				continue;
-			}
-			await c.db.delete(repos).where(eq(repos.repoId, repo.repoId)).run();
-			c.broadcast("organizationUpdated", { type: "repoRemoved", repoId: repo.repoId } satisfies OrganizationEvent);
-		}
-
-		const profile = await c.db
-			.select({ id: organizationProfile.id })
-			.from(organizationProfile)
-			.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
-			.get();
-		if (profile) {
-			await c.db
-				.update(organizationProfile)
-				.set({
-					githubConnectedAccount: input.connectedAccount,
-					githubInstallationStatus: input.installationStatus,
-					githubSyncStatus: input.syncStatus,
-					githubInstallationId: input.installationId,
-					githubLastSyncLabel: input.lastSyncLabel,
-					githubLastSyncAt: input.lastSyncAt,
-					updatedAt: now,
-				})
-				.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
-				.run();
-		}
-	},
-
-	async recordGithubWebhookReceipt(
-		c: any,
-		input: {
-			organizationId: string;
-			event: string;
-			action?: string | null;
-			receivedAt?: number;
-		},
-	): Promise<void> {
-		assertOrganization(c, input.organizationId);
-
-		const profile = await c.db
-			.select({ id: organizationProfile.id })
-			.from(organizationProfile)
-			.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
-			.get();
-		if (!profile) {
-			return;
-		}
-
-		await c.db
-			.update(organizationProfile)
-			.set({
-				githubLastWebhookAt: input.receivedAt ?? Date.now(),
-				githubLastWebhookEvent: input.action ? `${input.event}.${input.action}` : input.event,
-			})
-			.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
-			.run();
-	},
-
 	async getOrganizationSummary(c: any, input: OrganizationUseInput): Promise {
 		assertOrganization(c, input.organizationId);
 		return await getOrganizationSummarySnapshot(c);
 	},
 
-	async reconcileWorkbenchState(c: any, input: OrganizationUseInput): Promise {
-		assertOrganization(c, input.organizationId);
-		return await reconcileWorkbenchProjection(c);
-	},
-
-	async createWorkbenchTask(c: any, input: TaskWorkbenchCreateTaskInput): Promise<{ taskId: string; sessionId?: string }> {
-		// Step 1: Create the task record (wait: true — local state mutations only).
-		const created = await organizationActions.createTask(c, {
-			organizationId: c.state.organizationId,
-			repoId: input.repoId,
-			task: input.task,
-			...(input.title ? { explicitTitle: input.title } : {}),
-			...(input.onBranch ? { onBranch: input.onBranch } : input.branch ? { explicitBranchName: input.branch } : {}),
-			...(input.model ? { agentType: agentTypeForModel(input.model) } : {}),
-		});
-
-		// Step 2: Enqueue session creation + initial message (wait: false).
-		// The task workflow creates the session record and sends the message in
-		// the background. The client observes progress via push events on the
-		// task subscription topic.
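The `repoLabelFromRemote` helper retained near the top of this diff normalizes a bare remote before handing it to `URL`. A minimal standalone sketch; the pathname-to-label step is an assumption, since the diff elides the middle of the function between hunks:

```typescript
// Derive a short display label from a git remote URL. The URL constructor
// requires a scheme, so bare "github.com/org/repo" remotes get https:// prefixed.
// The label extraction below is a hypothetical reconstruction of the elided body.
function repoLabelFromRemote(remoteUrl: string): string {
	try {
		const url = new URL(remoteUrl.startsWith("http") ? remoteUrl : `https://${remoteUrl}`);
		const label = url.pathname.replace(/^\/+/, "").replace(/\.git$/, "");
		return label || remoteUrl;
	} catch {
		// Unparseable remotes fall back to the raw string, as the hunk shows.
		return remoteUrl;
	}
}
```

Either way, the important behavior visible in the diff is the scheme prefixing and the raw-string fallback, which keep the label function total over arbitrary remote strings.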
-		const task = await requireWorkbenchTask(c, created.taskId);
-		await task.createWorkbenchSessionAndSend({
-			model: input.model,
-			text: input.task,
-		});
-
-		return { taskId: created.taskId };
-	},
-
-	async markWorkbenchUnread(c: any, input: TaskWorkbenchSelectInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.markWorkbenchUnread({});
-	},
-
-	async renameWorkbenchTask(c: any, input: TaskWorkbenchRenameInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.renameWorkbenchTask(input);
-	},
-
-	async renameWorkbenchBranch(c: any, input: TaskWorkbenchRenameInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.renameWorkbenchBranch(input);
-	},
-
-	async createWorkbenchSession(c: any, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ sessionId: string }> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		return await task.createWorkbenchSession({ ...(input.model ? { model: input.model } : {}) });
-	},
-
-	async renameWorkbenchSession(c: any, input: TaskWorkbenchRenameSessionInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.renameWorkbenchSession(input);
-	},
-
-	async setWorkbenchSessionUnread(c: any, input: TaskWorkbenchSetSessionUnreadInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.setWorkbenchSessionUnread(input);
-	},
-
-	async updateWorkbenchDraft(c: any, input: TaskWorkbenchUpdateDraftInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.updateWorkbenchDraft(input);
-	},
-
-	async changeWorkbenchModel(c: any, input: TaskWorkbenchChangeModelInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.changeWorkbenchModel(input);
-	},
-
-	async sendWorkbenchMessage(c: any, input: TaskWorkbenchSendMessageInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.sendWorkbenchMessage(input);
-	},
-
-	async stopWorkbenchSession(c: any, input: TaskWorkbenchSessionInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.stopWorkbenchSession(input);
-	},
-
-	async closeWorkbenchSession(c: any, input: TaskWorkbenchSessionInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.closeWorkbenchSession(input);
-	},
-
-	async publishWorkbenchPr(c: any, input: TaskWorkbenchSelectInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.publishWorkbenchPr({});
-	},
-
-	async revertWorkbenchFile(c: any, input: TaskWorkbenchDiffInput): Promise<void> {
-		const task = await requireWorkbenchTask(c, input.taskId);
-		await task.revertWorkbenchFile(input);
-	},
-
-	async reloadGithubOrganization(c: any): Promise<void> {
-		await getOrCreateGithubData(c, c.state.organizationId).reloadOrganization({});
-	},
-
-	async reloadGithubPullRequests(c: any): Promise<void> {
-		await getOrCreateGithubData(c, c.state.organizationId).reloadAllPullRequests({});
-	},
-
-	async reloadGithubRepository(c: any, input: { repoId: string }): Promise<void> {
-		await getOrCreateGithubData(c, c.state.organizationId).reloadRepository(input);
-	},
-
-	async reloadGithubPullRequest(c: any, input: { repoId: string; prNumber: number }): Promise<void> {
-		await getOrCreateGithubData(c, c.state.organizationId).reloadPullRequest(input);
-	},
-
-	async listTasks(c: any, input: ListTasksInput): Promise<TaskSummary[]> {
-		assertOrganization(c, input.organizationId);
-
-		if (input.repoId) {
-			const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, input.repoId)).get();
-			if (!repoRow) {
-				throw new Error(`Unknown repo: ${input.repoId}`);
-			}
-
-			const repository = await getOrCreateRepository(c, c.state.organizationId, input.repoId, repoRow.remoteUrl);
-			return await repository.listTaskSummaries({ includeArchived: true });
-		}
-
-		return await collectAllTaskSummaries(c);
-	},
-
-	async getRepoOverview(c: any, input: RepoOverviewInput): Promise {
-		assertOrganization(c, input.organizationId);
-
-		const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, input.repoId)).get();
-		if (!repoRow) {
-			throw new Error(`Unknown repo: ${input.repoId}`);
-		}
-
-		const repository = await getOrCreateRepository(c, c.state.organizationId, input.repoId, repoRow.remoteUrl);
-		return await repository.getRepoOverview({});
-	},
-
-	async switchTask(c: any, taskId: string): Promise {
-		const repoId = await resolveRepoId(c, taskId);
-		const h = getTask(c, c.state.organizationId, repoId, taskId);
-		const record = await h.get();
-		const switched = await h.switch();
-
-		return {
-			organizationId: c.state.organizationId,
-			taskId,
-			sandboxProviderId: record.sandboxProviderId,
-			switchTarget: switched.switchTarget,
-		};
-	},
-
-	async history(c: any, input: HistoryQueryInput): Promise<HistoryEvent[]> {
-		assertOrganization(c, input.organizationId);
-
-		const limit = input.limit ?? 20;
-		const repoRows = await c.db.select({ repoId: repos.repoId }).from(repos).all();
-
-		const allEvents: HistoryEvent[] = [];
-
-		for (const row of repoRows) {
-			try {
-				const hist = await getOrCreateHistory(c, c.state.organizationId, row.repoId);
-				const items = await hist.list({
-					branch: input.branch,
-					taskId: input.taskId,
-					limit,
-				});
-				allEvents.push(...items);
-			} catch (error) {
-				logActorWarning("organization", "history lookup failed for repo", {
-					organizationId: c.state.organizationId,
-					repoId: row.repoId,
-					error: resolveErrorMessage(error),
-				});
-			}
-		}
-
-		allEvents.sort((a, b) => b.createdAt - a.createdAt);
-		return allEvents.slice(0, limit);
-	},
-
-	async getTask(c: any, input: GetTaskInput): Promise {
-		assertOrganization(c, input.organizationId);
-
-		const repoId = await resolveRepoId(c, input.taskId);
-
-		const repoRow = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).where(eq(repos.repoId, repoId)).get();
-		if (!repoRow) {
-			throw new Error(`Unknown repo: ${repoId}`);
-		}
-
-		const repository = await getOrCreateRepository(c, c.state.organizationId, repoId, repoRow.remoteUrl);
-		return await repository.getTaskEnriched({ taskId: input.taskId });
-	},
-
-	async attachTask(c: any, input: TaskProxyActionInput): Promise<{ target: string; sessionId: string | null }> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		return await h.attach({ reason: input.reason });
-	},
-
-	async pushTask(c: any, input: TaskProxyActionInput): Promise<void> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		await h.push({ reason: input.reason });
-	},
-
-	async syncTask(c: any, input: TaskProxyActionInput): Promise<void> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		await h.sync({ reason: input.reason });
-	},
-
-	async mergeTask(c: any, input: TaskProxyActionInput): Promise<void> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		await h.merge({ reason: input.reason });
-	},
-
-	async archiveTask(c: any, input: TaskProxyActionInput): Promise<void> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		await h.archive({ reason: input.reason });
-	},
-
-	async killTask(c: any, input: TaskProxyActionInput): Promise<void> {
-		assertOrganization(c, input.organizationId);
-		const repoId = await resolveRepoId(c, input.taskId);
-		const h = getTask(c, c.state.organizationId, repoId, input.taskId);
-		await h.kill({ reason: input.reason });
+	// updateShellProfile stays as a direct action — called with await from HTTP handler where the user can retry
+	async updateShellProfile(c: any, input: { displayName?: string; slug?: string; primaryDomain?: string }): Promise<void> {
+		await updateOrganizationShellProfileMutation(c, input);
 	},
 };
+
+export async function applyGithubSyncProgressMutation(
+	c: any,
+	input: {
+		connectedAccount: string;
+		installationStatus: string;
+		installationId: number | null;
+		syncStatus: string;
+		lastSyncLabel: string;
+		lastSyncAt: number | null;
+		syncGeneration: number;
+		syncPhase: string | null;
+		processedRepositoryCount: number;
+		totalRepositoryCount: number;
+	},
+): Promise<void> {
+	const profile = await c.db
+		.select({ id: organizationProfile.id })
+		.from(organizationProfile)
+		.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
+		.get();
+	if (!profile) {
+		return;
+	}
+
+	await c.db
+		.update(organizationProfile)
+		.set({
+			githubConnectedAccount: input.connectedAccount,
+			githubInstallationStatus: input.installationStatus,
+			githubSyncStatus: input.syncStatus,
+			githubInstallationId: input.installationId,
+			githubLastSyncLabel: input.lastSyncLabel,
+			githubLastSyncAt: input.lastSyncAt,
+			githubSyncGeneration: input.syncGeneration,
+			githubSyncPhase: input.syncPhase,
+			githubProcessedRepositoryCount: input.processedRepositoryCount,
+			githubTotalRepositoryCount: input.totalRepositoryCount,
+			updatedAt: Date.now(),
+		})
+		.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
+		.run();
+
+	await refreshOrganizationSnapshotMutation(c);
+}
+
+export async function recordGithubWebhookReceiptMutation(
+	c: any,
+	input: {
+		organizationId: string;
+		event: string;
+		action?: string | null;
+		receivedAt?: number;
+	},
+): Promise<void> {
+	assertOrganization(c, input.organizationId);
+
+	const profile = await c.db
+		.select({ id: organizationProfile.id })
+		.from(organizationProfile)
+		.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
+		.get();
+	if (!profile) {
+		return;
+	}
+
+	await c.db
+		.update(organizationProfile)
+		.set({
+			githubLastWebhookAt: input.receivedAt ?? Date.now(),
+			githubLastWebhookEvent: input.action ? `${input.event}.${input.action}` : input.event,
+		})
+		.where(eq(organizationProfile.id, ORGANIZATION_PROFILE_ROW_ID))
+		.run();
+}
diff --git a/foundry/packages/backend/src/actors/organization/actions/app.ts b/foundry/packages/backend/src/actors/organization/actions/app.ts
new file mode 100644
index 0000000..d3cc329
--- /dev/null
+++ b/foundry/packages/backend/src/actors/organization/actions/app.ts
@@ -0,0 +1 @@
+export { organizationAppActions } from "../app-shell.js";
diff --git a/foundry/packages/backend/src/actors/organization/actions/better-auth.ts b/foundry/packages/backend/src/actors/organization/actions/better-auth.ts
new file mode 100644
index 0000000..060ceed
--- /dev/null
+++ b/foundry/packages/backend/src/actors/organization/actions/better-auth.ts
@@ -0,0 +1,360 @@
+import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm";
+import { authAccountIndex, authEmailIndex, authSessionIndex, authVerification } from "../db/schema.js";
+import { APP_SHELL_ORGANIZATION_ID } from "../constants.js";
+
+function assertAppOrganization(c: any): void {
+	if (c.state.organizationId !== APP_SHELL_ORGANIZATION_ID) {
+		throw new Error(`App shell action requires organization ${APP_SHELL_ORGANIZATION_ID}, got ${c.state.organizationId}`);
+	}
+}
+
+function organizationAuthColumn(table: any, field: string): any {
+	const column = table[field];
+	if (!column) {
+		throw new Error(`Unknown auth table field: ${field}`);
+	}
+	return column;
+}
+
+function normalizeAuthValue(value: unknown): unknown {
+	if (value instanceof Date) {
+		return value.getTime();
+	}
+	if (Array.isArray(value)) {
+		return value.map((entry) => normalizeAuthValue(entry));
+	}
+	return value;
+}
+
+function organizationAuthClause(table: any, clause: { field: string; value: unknown; operator?: string }): any {
+	const column = organizationAuthColumn(table, clause.field);
+	const value = normalizeAuthValue(clause.value);
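`normalizeAuthValue`, shown in full just above, flattens Better Auth values into shapes the index tables can store: `Date` becomes epoch milliseconds, arrays are normalized element-wise, and everything else passes through. Reproduced standalone with a usage example:

```typescript
// Dates become epoch milliseconds; arrays are normalized element-wise;
// all other values pass through untouched.
function normalizeAuthValue(value: unknown): unknown {
	if (value instanceof Date) {
		return value.getTime();
	}
	if (Array.isArray(value)) {
		return value.map((entry) => normalizeAuthValue(entry));
	}
	return value;
}

const at = new Date(1700000000000);
normalizeAuthValue(at); // 1700000000000
normalizeAuthValue([at, "user-1"]); // [1700000000000, "user-1"]
```

Normalizing before building the clause keeps the comparison operators below working against the integer timestamp columns rather than `Date` objects.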
switch (clause.operator) { + case "ne": + return value === null ? isNotNull(column) : ne(column, value as any); + case "lt": + return lt(column, value as any); + case "lte": + return lte(column, value as any); + case "gt": + return gt(column, value as any); + case "gte": + return gte(column, value as any); + case "in": + return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); + case "not_in": + return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); + case "contains": + return like(column, `%${String(value ?? "")}%`); + case "starts_with": + return like(column, `${String(value ?? "")}%`); + case "ends_with": + return like(column, `%${String(value ?? "")}`); + case "eq": + default: + return value === null ? isNull(column) : eq(column, value as any); + } +} + +function organizationBetterAuthWhere(table: any, clauses: any[] | undefined): any { + if (!clauses || clauses.length === 0) { + return undefined; + } + let expr = organizationAuthClause(table, clauses[0]); + for (const clause of clauses.slice(1)) { + const next = organizationAuthClause(table, clause); + expr = clause.connector === "OR" ? 
or(expr, next) : and(expr, next); + } + return expr; +} + +export async function betterAuthUpsertSessionIndexMutation(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { + assertAppOrganization(c); + + const now = Date.now(); + await c.db + .insert(authSessionIndex) + .values({ + sessionId: input.sessionId, + sessionToken: input.sessionToken, + userId: input.userId, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: authSessionIndex.sessionId, + set: { + sessionToken: input.sessionToken, + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authSessionIndex).where(eq(authSessionIndex.sessionId, input.sessionId)).get(); +} + +export async function betterAuthDeleteSessionIndexMutation(c: any, input: { sessionId?: string; sessionToken?: string }) { + assertAppOrganization(c); + + const clauses = [ + ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), + ...(input.sessionToken ? 
[{ field: "sessionToken", value: input.sessionToken }] : []), + ]; + if (clauses.length === 0) { + return; + } + const predicate = organizationBetterAuthWhere(authSessionIndex, clauses); + await c.db.delete(authSessionIndex).where(predicate!).run(); +} + +export async function betterAuthUpsertEmailIndexMutation(c: any, input: { email: string; userId: string }) { + assertAppOrganization(c); + + const now = Date.now(); + await c.db + .insert(authEmailIndex) + .values({ + email: input.email, + userId: input.userId, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: authEmailIndex.email, + set: { + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); +} + +export async function betterAuthDeleteEmailIndexMutation(c: any, input: { email: string }) { + assertAppOrganization(c); + await c.db.delete(authEmailIndex).where(eq(authEmailIndex.email, input.email)).run(); +} + +export async function betterAuthUpsertAccountIndexMutation(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) { + assertAppOrganization(c); + + const now = Date.now(); + await c.db + .insert(authAccountIndex) + .values({ + id: input.id, + providerId: input.providerId, + accountId: input.accountId, + userId: input.userId, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: authAccountIndex.id, + set: { + providerId: input.providerId, + accountId: input.accountId, + userId: input.userId, + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); +} + +export async function betterAuthDeleteAccountIndexMutation(c: any, input: { id?: string; providerId?: string; accountId?: string }) { + assertAppOrganization(c); + + if (input.id) { + await c.db.delete(authAccountIndex).where(eq(authAccountIndex.id, input.id)).run(); + return; + } + if (input.providerId && input.accountId) { + 
await c.db + .delete(authAccountIndex) + .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) + .run(); + } +} + +export async function betterAuthCreateVerificationMutation(c: any, input: { data: Record<string, unknown> }) { + assertAppOrganization(c); + + await c.db + .insert(authVerification) + .values(input.data as any) + .run(); + return await c.db + .select() + .from(authVerification) + .where(eq(authVerification.id, input.data.id as string)) + .get(); +} + +export async function betterAuthUpdateVerificationMutation(c: any, input: { where: any[]; update: Record<string, unknown> }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + if (!predicate) { + return null; + } + await c.db + .update(authVerification) + .set(input.update as any) + .where(predicate) + .run(); + return await c.db.select().from(authVerification).where(predicate).get(); +} + +export async function betterAuthUpdateManyVerificationMutation(c: any, input: { where: any[]; update: Record<string, unknown> }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + if (!predicate) { + return 0; + } + await c.db + .update(authVerification) + .set(input.update as any) + .where(predicate) + .run(); + const row = await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get(); + return row?.value ??
0; +} + +export async function betterAuthDeleteVerificationMutation(c: any, input: { where: any[] }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + if (!predicate) { + return; + } + await c.db.delete(authVerification).where(predicate).run(); +} + +export async function betterAuthDeleteManyVerificationMutation(c: any, input: { where: any[] }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + if (!predicate) { + return 0; + } + const rows = await c.db.select().from(authVerification).where(predicate).all(); + await c.db.delete(authVerification).where(predicate).run(); + return rows.length; +} + +// Exception to the CLAUDE.md queue-for-mutations rule: Better Auth adapter operations +// use direct actions even for mutations. Better Auth runs during OAuth callbacks on the +// HTTP request path, not through the normal organization lifecycle. Routing through the +// queue adds multiple sequential round-trips (each with actor wake-up + step overhead) +// that cause 30-second OAuth callbacks and proxy retry storms. These mutations are simple +// SQLite upserts/deletes with no cross-actor coordination or broadcast side effects. 
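The comment above explains why these adapter mutations bypass the queue; the `where`-clause translation they all share is easiest to see in isolation. Below is a minimal in-memory sketch — plain row predicates instead of drizzle column expressions, and `Clause`, `buildWhere`, and the sample rows are all hypothetical — of how `organizationBetterAuthWhere` folds Better Auth clauses left-to-right, with `connector: "OR"` switching the combinator for that clause only.

```typescript
// Simplified model of the clause folding in organizationBetterAuthWhere:
// each clause becomes a row predicate, and clauses after the first are
// combined left-to-right with AND unless the clause says connector: "OR".
type Clause = { field: string; value: unknown; operator?: string; connector?: "AND" | "OR" };
type Row = Record<string, unknown>;

function clauseToPredicate(clause: Clause): (row: Row) => boolean {
  const { field, value, operator } = clause;
  switch (operator) {
    case "ne":
      return (row) => row[field] !== value;
    case "contains":
      return (row) => String(row[field] ?? "").includes(String(value ?? ""));
    case "starts_with":
      return (row) => String(row[field] ?? "").startsWith(String(value ?? ""));
    case "eq":
    default:
      return (row) => row[field] === value;
  }
}

function buildWhere(clauses: Clause[]): (row: Row) => boolean {
  if (clauses.length === 0) return () => true;
  let predicate = clauseToPredicate(clauses[0]);
  for (const clause of clauses.slice(1)) {
    const next = clauseToPredicate(clause);
    const prev = predicate;
    predicate =
      clause.connector === "OR"
        ? (row) => prev(row) || next(row)
        : (row) => prev(row) && next(row);
  }
  return predicate;
}

const rows: Row[] = [
  { email: "a@example.com", userId: "u1" },
  { email: "b@example.com", userId: "u2" },
];
// First row matches the userId clause, second matches the OR'd prefix clause.
const match = buildWhere([
  { field: "userId", value: "u1" },
  { field: "email", value: "b@", operator: "starts_with", connector: "OR" },
]);
console.log(rows.filter(match).length);
```

Note the same left-to-right fold means `A OR B AND C` groups as `(A OR B) AND C`; Better Auth's adapters pass clauses in the order they should bind.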
+export const organizationBetterAuthActions = { + // --- Mutation actions (called by the Better Auth adapter in better-auth.ts) --- + async betterAuthUpsertSessionIndex(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { + return await betterAuthUpsertSessionIndexMutation(c, input); + }, + async betterAuthDeleteSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { + await betterAuthDeleteSessionIndexMutation(c, input); + }, + async betterAuthUpsertEmailIndex(c: any, input: { email: string; userId: string }) { + return await betterAuthUpsertEmailIndexMutation(c, input); + }, + async betterAuthDeleteEmailIndex(c: any, input: { email: string }) { + await betterAuthDeleteEmailIndexMutation(c, input); + }, + async betterAuthUpsertAccountIndex(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) { + return await betterAuthUpsertAccountIndexMutation(c, input); + }, + async betterAuthDeleteAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { + await betterAuthDeleteAccountIndexMutation(c, input); + }, + async betterAuthCreateVerification(c: any, input: { data: Record<string, unknown> }) { + return await betterAuthCreateVerificationMutation(c, input); + }, + async betterAuthUpdateVerification(c: any, input: { where: any[]; update: Record<string, unknown> }) { + return await betterAuthUpdateVerificationMutation(c, input); + }, + async betterAuthUpdateManyVerification(c: any, input: { where: any[]; update: Record<string, unknown> }) { + return await betterAuthUpdateManyVerificationMutation(c, input); + }, + async betterAuthDeleteVerification(c: any, input: { where: any[] }) { + await betterAuthDeleteVerificationMutation(c, input); + }, + async betterAuthDeleteManyVerification(c: any, input: { where: any[] }) { + return await betterAuthDeleteManyVerificationMutation(c, input); + }, + + // --- Read actions --- + async betterAuthFindSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) {
assertAppOrganization(c); + + const clauses = [ + ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), + ...(input.sessionToken ? [{ field: "sessionToken", value: input.sessionToken }] : []), + ]; + if (clauses.length === 0) { + return null; + } + const predicate = organizationBetterAuthWhere(authSessionIndex, clauses); + return await c.db.select().from(authSessionIndex).where(predicate!).get(); + }, + + async betterAuthFindEmailIndex(c: any, input: { email: string }) { + assertAppOrganization(c); + return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get(); + }, + + async betterAuthFindAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) { + assertAppOrganization(c); + + if (input.id) { + return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get(); + } + if (!input.providerId || !input.accountId) { + return null; + } + return await c.db + .select() + .from(authAccountIndex) + .where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId))) + .get(); + }, + + async betterAuthFindOneVerification(c: any, input: { where: any[] }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + return predicate ? await c.db.select().from(authVerification).where(predicate).get() : null; + }, + + async betterAuthFindManyVerification(c: any, input: { where?: any[]; limit?: number; sortBy?: any; offset?: number }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + let query = c.db.select().from(authVerification); + if (predicate) { + query = query.where(predicate); + } + if (input.sortBy?.field) { + const column = organizationAuthColumn(authVerification, input.sortBy.field); + query = query.orderBy(input.sortBy.direction === "asc" ? 
asc(column) : desc(column)); + } + if (typeof input.limit === "number") { + query = query.limit(input.limit); + } + if (typeof input.offset === "number") { + query = query.offset(input.offset); + } + return await query.all(); + }, + + async betterAuthCountVerification(c: any, input: { where?: any[] }) { + assertAppOrganization(c); + + const predicate = organizationBetterAuthWhere(authVerification, input.where); + const row = predicate + ? await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get() + : await c.db.select({ value: sqlCount() }).from(authVerification).get(); + return row?.value ?? 0; + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/actions/github.ts b/foundry/packages/backend/src/actors/organization/actions/github.ts new file mode 100644 index 0000000..43818c0 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/actions/github.ts @@ -0,0 +1,80 @@ +import { desc } from "drizzle-orm"; +import type { FoundryAppSnapshot } from "@sandbox-agent/foundry-shared"; +import { getOrCreateGithubData, getOrCreateOrganization } from "../../handles.js"; +import { githubDataWorkflowQueueName } from "../../github-data/index.js"; +import { authSessionIndex } from "../db/schema.js"; +import { assertAppOrganization, buildAppSnapshot, requireEligibleOrganization, requireSignedInSession } from "../app-shell.js"; +import { getBetterAuthService } from "../../../services/better-auth.js"; +import { refreshOrganizationSnapshotMutation } from "../actions.js"; +import { organizationWorkflowQueueName } from "../queues.js"; + +export const organizationGithubActions = { + async resolveAppGithubToken( + c: any, + input: { organizationId: string; requireRepoScope?: boolean }, + ): Promise<{ accessToken: string; scopes: string[] } | null> { + assertAppOrganization(c); + const auth = getBetterAuthService(); + const rows = await c.db.select().from(authSessionIndex).orderBy(desc(authSessionIndex.updatedAt)).all(); + + for 
(const row of rows) { + const authState = await auth.getAuthState(row.sessionId); + if (authState?.sessionState?.activeOrganizationId !== input.organizationId) { + continue; + } + + const token = await auth.getAccessTokenForSession(row.sessionId); + if (!token?.accessToken) { + continue; + } + + const scopes = token.scopes; + if (input.requireRepoScope !== false && scopes.length > 0 && !scopes.some((scope) => scope === "repo" || scope.startsWith("repo:"))) { + continue; + } + + return { + accessToken: token.accessToken, + scopes, + }; + } + + return null; + }, + + async triggerAppRepoImport(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + + const githubData = await getOrCreateGithubData(c, input.organizationId); + const summary = await githubData.getSummary({}); + if (summary.syncStatus === "syncing") { + return await buildAppSnapshot(c, input.sessionId); + } + + const organizationHandle = await getOrCreateOrganization(c, input.organizationId); + await organizationHandle.send( + organizationWorkflowQueueName("organization.command.shell.sync_started.mark"), + { label: "Importing repository catalog..." }, + { wait: false }, + ); + await organizationHandle.send(organizationWorkflowQueueName("organization.command.snapshot.broadcast"), {}, { wait: false }); + + void githubData + .send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Importing repository catalog..." }, { wait: false }) + .catch(() => {}); + + return await buildAppSnapshot(c, input.sessionId); + }, + + async adminReloadGithubOrganization(c: any): Promise<void> { + const githubData = await getOrCreateGithubData(c, c.state.organizationId); + await githubData.send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Reloading GitHub organization..."
}, { wait: false }); + }, + + async adminReloadGithubRepository(c: any, _input: { repoId: string }): Promise<void> { + const githubData = await getOrCreateGithubData(c, c.state.organizationId); + await githubData.send(githubDataWorkflowQueueName("githubData.command.syncRepos"), { label: "Reloading repository..." }, { wait: false }); + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/actions/onboarding.ts b/foundry/packages/backend/src/actors/organization/actions/onboarding.ts new file mode 100644 index 0000000..22153f4 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/actions/onboarding.ts @@ -0,0 +1,82 @@ +import { randomUUID } from "node:crypto"; +import type { FoundryAppSnapshot, StarSandboxAgentRepoInput, StarSandboxAgentRepoResult } from "@sandbox-agent/foundry-shared"; +import { getOrCreateGithubData, getOrCreateOrganization } from "../../handles.js"; +import { + assertAppOrganization, + buildAppSnapshot, + getOrganizationState, + requireEligibleOrganization, + requireSignedInSession, +} from "../app-shell.js"; +import { getBetterAuthService } from "../../../services/better-auth.js"; +import { getActorRuntimeContext } from "../../context.js"; +import { resolveOrganizationGithubAuth } from "../../../services/github-auth.js"; + +const SANDBOX_AGENT_REPO = "rivet-dev/sandbox-agent"; + +export const organizationOnboardingActions = { + async skipAppStarterRepo(c: any, input: { sessionId: string }): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + await getBetterAuthService().upsertUserProfile(session.authUserId, { + starterRepoStatus: "skipped", + starterRepoSkippedAt: Date.now(), + starterRepoStarredAt: null, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async starAppStarterRepo(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); +
requireEligibleOrganization(session, input.organizationId); + const organization = await getOrCreateOrganization(c, input.organizationId); + await organization.starSandboxAgentRepo({ + organizationId: input.organizationId, + }); + await getBetterAuthService().upsertUserProfile(session.authUserId, { + starterRepoStatus: "starred", + starterRepoStarredAt: Date.now(), + starterRepoSkippedAt: null, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async selectAppOrganization(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + await getBetterAuthService().setActiveOrganization(input.sessionId, input.organizationId); + await getOrCreateGithubData(c, input.organizationId); + return await buildAppSnapshot(c, input.sessionId); + }, + + async beginAppGithubInstall(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const { appShell } = getActorRuntimeContext(); + const organizationHandle = await getOrCreateOrganization(c, input.organizationId); + const organizationState = await getOrganizationState(organizationHandle); + if (organizationState.snapshot.kind !== "organization") { + return { + url: `${appShell.appUrl}/organizations/${input.organizationId}`, + }; + } + return { + url: await appShell.github.buildInstallationUrl(organizationState.githubLogin, randomUUID()), + }; + }, + + async starSandboxAgentRepo(c: any, input: StarSandboxAgentRepoInput): Promise<StarSandboxAgentRepoResult> { + const { driver } = getActorRuntimeContext(); + const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId); + await driver.github.starRepository(SANDBOX_AGENT_REPO, { + githubToken: auth?.githubToken ??
null, + }); + return { + repo: SANDBOX_AGENT_REPO, + starredAt: Date.now(), + }; + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/actions/organization.ts b/foundry/packages/backend/src/actors/organization/actions/organization.ts new file mode 100644 index 0000000..9e1cbd6 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/actions/organization.ts @@ -0,0 +1,53 @@ +import type { FoundryAppSnapshot, UpdateFoundryOrganizationProfileInput, WorkspaceModelId } from "@sandbox-agent/foundry-shared"; +import { getBetterAuthService } from "../../../services/better-auth.js"; +import { getOrCreateOrganization } from "../../handles.js"; +import { + assertAppOrganization, + assertOrganizationShell, + buildAppSnapshot, + buildOrganizationState, + buildOrganizationStateIfInitialized, + requireEligibleOrganization, + requireSignedInSession, +} from "../app-shell.js"; + +export const organizationShellActions = { + async getAppSnapshot(c: any, input: { sessionId: string }): Promise<FoundryAppSnapshot> { + return await buildAppSnapshot(c, input.sessionId); + }, + + async setAppDefaultModel(c: any, input: { sessionId: string; defaultModel: WorkspaceModelId }): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + await getBetterAuthService().upsertUserProfile(session.authUserId, { + defaultModel: input.defaultModel, + }); + return await buildAppSnapshot(c, input.sessionId); + }, + + async updateAppOrganizationProfile( + c: any, + input: { sessionId: string; organizationId: string } & UpdateFoundryOrganizationProfileInput, + ): Promise<FoundryAppSnapshot> { + assertAppOrganization(c); + const session = await requireSignedInSession(c, input.sessionId); + requireEligibleOrganization(session, input.organizationId); + const organization = await getOrCreateOrganization(c, input.organizationId); + await organization.updateShellProfile({ + displayName: input.displayName, + slug: input.slug, + primaryDomain: input.primaryDomain, + }); +
return await buildAppSnapshot(c, input.sessionId); + }, + + async getOrganizationShellState(c: any) { + assertOrganizationShell(c); + return await buildOrganizationState(c); + }, + + async getOrganizationShellStateIfInitialized(c: any) { + assertOrganizationShell(c); + return await buildOrganizationStateIfInitialized(c); + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts b/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts new file mode 100644 index 0000000..3affccd --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/actions/task-mutations.ts @@ -0,0 +1,478 @@ +// @ts-nocheck +import { randomUUID } from "node:crypto"; +import { and, desc, eq, isNotNull, ne } from "drizzle-orm"; +import type { + RepoOverview, + SandboxProviderId, + TaskRecord, + TaskSummary, + WorkspacePullRequestSummary, + WorkspaceSessionSummary, + WorkspaceTaskSummary, +} from "@sandbox-agent/foundry-shared"; +import { getActorRuntimeContext } from "../../context.js"; +import { getGithubData, getOrCreateAuditLog, getOrCreateTask, getTask } from "../../handles.js"; +// task actions called directly (no queue) +import { deriveFallbackTitle, resolveCreateFlowDecision } from "../../../services/create-flow.js"; +// actions return directly (no queue response unwrapping) +import { isActorNotFoundError, logActorWarning, resolveErrorMessage } from "../../logging.js"; +import { defaultSandboxProviderId } from "../../../sandbox-config.js"; +import { taskWorkflowQueueName } from "../../task/workflow/queue.js"; +import { expectQueueResponse } from "../../../services/queue.js"; +import { taskIndex, taskSummaries } from "../db/schema.js"; +import { refreshOrganizationSnapshotMutation } from "../actions.js"; + +interface CreateTaskCommand { + repoId: string; + task: string; + sandboxProviderId: SandboxProviderId; + explicitTitle: string | null; + explicitBranchName: string | null; + onBranch: string | null; +}
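Branch reservation against `taskIndex` (the `taskBranches` list later fed to `resolveCreateFlowDecision`) can be illustrated with a toy resolver. This is a hypothetical sketch, not the real `services/create-flow.js` logic — `slugify`, `resolveBranchName`, and the `task/` prefix are invented for the example; it only shows the idea of deriving a branch name from the task text and suffixing until it clears the reserved set.

```typescript
// Hypothetical create-flow sketch: derive a slug-style branch name from
// the task text (or use the explicit name), then suffix -2, -3, ... until
// it no longer collides with branches already reserved for other tasks.
interface CreateFlowInput {
  task: string;
  explicitBranchName?: string;
  taskBranches: string[]; // branches already present in taskIndex
}

function slugify(text: string): string {
  const slug = text
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "")
    .slice(0, 40);
  return slug || "task";
}

function resolveBranchName(input: CreateFlowInput): string {
  const base = input.explicitBranchName?.trim() || `task/${slugify(input.task)}`;
  const reserved = new Set(input.taskBranches);
  if (!reserved.has(base)) return base;
  let suffix = 2;
  while (reserved.has(`${base}-${suffix}`)) suffix += 1;
  return `${base}-${suffix}`;
}

console.log(resolveBranchName({ task: "Fix login bug", taskBranches: ["task/fix-login-bug"] }));
// "task/fix-login-bug-2" under these assumptions
```

The real decision service also weighs local branches and an explicit title; the collision loop is the part that depends on the `listKnownTaskBranches` query below.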
+ +interface RegisterTaskBranchCommand { + repoId: string; + taskId: string; + branchName: string; +} + +function isStaleTaskReferenceError(error: unknown): boolean { + const message = resolveErrorMessage(error); + return isActorNotFoundError(error) || message.startsWith("Task not found:"); +} + +function parseJsonValue<T>(value: string | null | undefined, fallback: T): T { + if (!value) { + return fallback; + } + + try { + return JSON.parse(value) as T; + } catch { + return fallback; + } +} + +function taskSummaryRowFromSummary(taskSummary: WorkspaceTaskSummary) { + return { + taskId: taskSummary.id, + repoId: taskSummary.repoId, + title: taskSummary.title, + status: taskSummary.status, + repoName: taskSummary.repoName, + updatedAtMs: taskSummary.updatedAtMs, + branch: taskSummary.branch, + pullRequestJson: JSON.stringify(taskSummary.pullRequest), + sessionsSummaryJson: JSON.stringify(taskSummary.sessionsSummary), + primaryUserLogin: taskSummary.primaryUserLogin ?? null, + primaryUserAvatarUrl: taskSummary.primaryUserAvatarUrl ?? null, + }; +} + +export function taskSummaryFromRow(repoId: string, row: any): WorkspaceTaskSummary { + return { + id: row.taskId, + repoId, + title: row.title, + status: row.status, + repoName: row.repoName, + updatedAtMs: row.updatedAtMs, + branch: row.branch ?? null, + pullRequest: parseJsonValue(row.pullRequestJson, null), + sessionsSummary: parseJsonValue(row.sessionsSummaryJson, []), + primaryUserLogin: row.primaryUserLogin ?? null, + primaryUserAvatarUrl: row.primaryUserAvatarUrl ??
null, + }; +} + +export async function upsertTaskSummary(c: any, taskSummary: WorkspaceTaskSummary): Promise<void> { + await c.db + .insert(taskSummaries) + .values(taskSummaryRowFromSummary(taskSummary)) + .onConflictDoUpdate({ + target: taskSummaries.taskId, + set: taskSummaryRowFromSummary(taskSummary), + }) + .run(); +} + +async function deleteStaleTaskIndexRow(c: any, taskId: string): Promise<void> { + try { + await c.db.delete(taskIndex).where(eq(taskIndex.taskId, taskId)).run(); + } catch { + // Best effort cleanup only. + } +} + +async function listKnownTaskBranches(c: any, repoId: string): Promise<string[]> { + const rows = await c.db + .select({ branchName: taskIndex.branchName }) + .from(taskIndex) + .where(and(eq(taskIndex.repoId, repoId), isNotNull(taskIndex.branchName))) + .all(); + return rows.map((row) => row.branchName).filter((value): value is string => typeof value === "string" && value.trim().length > 0); +} + +async function resolveGitHubRepository(c: any, repoId: string) { + const githubData = getGithubData(c, c.state.organizationId); + return await githubData.getRepository({ repoId }).catch(() => null); +} + +async function resolveRepositoryRemoteUrl(c: any, repoId: string): Promise<string> { + const repository = await resolveGitHubRepository(c, repoId); + const remoteUrl = repository?.cloneUrl?.trim(); + if (!remoteUrl) { + throw new Error(`Missing remote URL for repo ${repoId}`); + } + return remoteUrl; +} + +/** + * The ONLY backend code path that creates a task actor via getOrCreateTask. + * Called when a user explicitly creates a new task (not during sync/webhooks). + * + * All other code must use getTask (handles.ts) which calls .get() and will + * error if the actor doesn't exist. Virtual tasks created during PR sync + * are materialized lazily by the client's getOrCreate in backend-client.ts. + * + * NEVER call this from a sync loop or webhook handler.
+ */ +export async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskRecord> { + const organizationId = c.state.organizationId; + const repoId = cmd.repoId; + await resolveRepositoryRemoteUrl(c, repoId); + const onBranch = cmd.onBranch?.trim() || null; + const taskId = randomUUID(); + let initialBranchName: string | null = null; + let initialTitle: string | null = null; + + if (onBranch) { + initialBranchName = onBranch; + initialTitle = deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? undefined); + + await registerTaskBranchMutation(c, { + repoId, + taskId, + branchName: onBranch, + }); + } else { + const reservedBranches = await listKnownTaskBranches(c, repoId); + const resolved = resolveCreateFlowDecision({ + task: cmd.task, + explicitTitle: cmd.explicitTitle ?? undefined, + explicitBranchName: cmd.explicitBranchName ?? undefined, + localBranches: [], + taskBranches: reservedBranches, + }); + + initialBranchName = resolved.branchName; + initialTitle = resolved.title; + + const now = Date.now(); + await c.db + .insert(taskIndex) + .values({ + taskId, + repoId, + branchName: resolved.branchName, + createdAt: now, + updatedAt: now, + }) + .onConflictDoNothing() + .run(); + } + + let taskHandle: Awaited<ReturnType<typeof getOrCreateTask>>; + try { + taskHandle = await getOrCreateTask(c, organizationId, repoId, taskId, { + organizationId, + repoId, + taskId, + }); + } catch (error) { + if (initialBranchName) { + await deleteStaleTaskIndexRow(c, taskId); + } + throw error; + } + + const created = expectQueueResponse( + await taskHandle.send( + taskWorkflowQueueName("task.command.initialize"), + { + sandboxProviderId: cmd.sandboxProviderId, + branchName: initialBranchName, + title: initialTitle, + task: cmd.task, + }, + { wait: true, timeout: 10_000 }, + ), + ); + + try { + await upsertTaskSummary(c, await taskHandle.getTaskSummary({})); + await refreshOrganizationSnapshotMutation(c); + } catch (error) { + logActorWarning("organization", "failed seeding task summary after task creation", {
+ organizationId, + repoId, + taskId, + error: resolveErrorMessage(error), + }); + } + + const auditLog = await getOrCreateAuditLog(c, organizationId); + void auditLog.append({ + kind: "task.created", + repoId, + taskId, + payload: { + repoId, + sandboxProviderId: cmd.sandboxProviderId, + }, + }); + + try { + const taskSummary = await taskHandle.getTaskSummary({}); + await upsertTaskSummary(c, taskSummary); + } catch (error) { + logActorWarning("organization", "failed seeding organization task projection", { + organizationId, + repoId, + taskId, + error: resolveErrorMessage(error), + }); + } + + return created; +} + +export async function registerTaskBranchMutation(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string }> { + const branchName = cmd.branchName.trim(); + if (!branchName) { + throw new Error("branchName is required"); + } + + const existingOwner = await c.db + .select({ taskId: taskIndex.taskId }) + .from(taskIndex) + .where(and(eq(taskIndex.branchName, branchName), eq(taskIndex.repoId, cmd.repoId), ne(taskIndex.taskId, cmd.taskId))) + .get(); + + if (existingOwner) { + let ownerMissing = false; + try { + await getTask(c, c.state.organizationId, cmd.repoId, existingOwner.taskId).get(); + } catch (error) { + if (isStaleTaskReferenceError(error)) { + ownerMissing = true; + await deleteStaleTaskIndexRow(c, existingOwner.taskId); + } else { + throw error; + } + } + if (!ownerMissing) { + throw new Error(`branch is already assigned to a different task: ${branchName}`); + } + } + + const now = Date.now(); + await c.db + .insert(taskIndex) + .values({ + taskId: cmd.taskId, + repoId: cmd.repoId, + branchName, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: taskIndex.taskId, + set: { + branchName, + updatedAt: now, + }, + }) + .run(); + + return { branchName }; +} + +export async function applyTaskSummaryUpdateMutation(c: any, input: { taskSummary: WorkspaceTaskSummary }): Promise<void> { + await upsertTaskSummary(c,
input.taskSummary); + await refreshOrganizationSnapshotMutation(c); +} + +export async function removeTaskSummaryMutation(c: any, input: { taskId: string }): Promise<void> { + await c.db.delete(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).run(); + await refreshOrganizationSnapshotMutation(c); +} + +/** + * Called for every changed PR during sync and on webhook PR events. + * Runs in a bulk loop — MUST NOT create task actors or make cross-actor calls + * to task actors. Only writes to the org's local taskIndex/taskSummaries tables. + * Task actors are created lazily when the user views the task. + */ +export async function refreshTaskSummaryForBranchMutation( + c: any, + input: { repoId: string; branchName: string; pullRequest?: WorkspacePullRequestSummary | null; repoName?: string }, +): Promise<void> { + const pullRequest = input.pullRequest ?? null; + let rows = await c.db + .select({ taskId: taskSummaries.taskId }) + .from(taskSummaries) + .where(and(eq(taskSummaries.branch, input.branchName), eq(taskSummaries.repoId, input.repoId))) + .all(); + + if (rows.length === 0 && pullRequest) { + // Create a virtual task entry in the org's local tables only. + // No task actor is spawned — it will be created lazily when the user + // clicks on the task in the sidebar (the "materialize" path). + const taskId = randomUUID(); + const now = Date.now(); + const title = pullRequest.title?.trim() || input.branchName; + const repoName = input.repoName ?? `${c.state.organizationId}/${input.repoId}`; + + await c.db + .insert(taskIndex) + .values({ taskId, repoId: input.repoId, branchName: input.branchName, createdAt: now, updatedAt: now }) + .onConflictDoNothing() + .run(); + + await c.db + .insert(taskSummaries) + .values({ + taskId, + repoId: input.repoId, + title, + status: "init_complete", + repoName, + updatedAtMs: pullRequest.updatedAtMs ??
now, + branch: input.branchName, + pullRequestJson: JSON.stringify(pullRequest), + sessionsSummaryJson: "[]", + }) + .onConflictDoNothing() + .run(); + + rows = [{ taskId }]; + } else { + // Update PR data on existing task summaries locally. + // If a real task actor exists, also notify it. + for (const row of rows) { + // Update the local summary with the new PR data + await c.db + .update(taskSummaries) + .set({ + pullRequestJson: pullRequest ? JSON.stringify(pullRequest) : null, + updatedAtMs: pullRequest?.updatedAtMs ?? Date.now(), + }) + .where(eq(taskSummaries.taskId, row.taskId)) + .run(); + + // Best-effort notify the task actor if it exists (fire-and-forget) + try { + const task = getTask(c, c.state.organizationId, input.repoId, row.taskId); + void task.syncPullRequest({ pullRequest }).catch(() => {}); + } catch { + // Task actor doesn't exist yet — that's fine, it's virtual + } + } + } + + await refreshOrganizationSnapshotMutation(c); +} + +export async function listTaskSummariesForRepo(c: any, repoId: string, includeArchived = false): Promise { + const rows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoId)).orderBy(desc(taskSummaries.updatedAtMs)).all(); + return rows + .map((row) => ({ + organizationId: c.state.organizationId, + repoId, + taskId: row.taskId, + branchName: row.branch ?? null, + title: row.title, + status: row.status, + updatedAt: row.updatedAtMs, + pullRequest: parseJsonValue(row.pullRequestJson, null), + })) + .filter((row) => includeArchived || row.status !== "archived"); +} + +export async function listAllTaskSummaries(c: any, includeArchived = false): Promise { + const rows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); + return rows + .map((row) => ({ + organizationId: c.state.organizationId, + repoId: row.repoId, + taskId: row.taskId, + branchName: row.branch ?? 
null, + title: row.title, + status: row.status, + updatedAt: row.updatedAtMs, + pullRequest: parseJsonValue(row.pullRequestJson, null), + })) + .filter((row) => includeArchived || row.status !== "archived"); +} + +export async function listWorkspaceTaskSummaries(c: any): Promise { + const rows = await c.db.select().from(taskSummaries).orderBy(desc(taskSummaries.updatedAtMs)).all(); + return rows.map((row) => taskSummaryFromRow(row.repoId, row)); +} + +export async function getRepoOverviewFromOrg(c: any, repoId: string): Promise { + const now = Date.now(); + const repository = await resolveGitHubRepository(c, repoId); + const remoteUrl = await resolveRepositoryRemoteUrl(c, repoId); + const taskRows = await c.db.select().from(taskSummaries).where(eq(taskSummaries.repoId, repoId)).all(); + + const branches = taskRows + .filter((row: any) => row.branch) + .map((row: any) => { + const pr = parseJsonValue(row.pullRequestJson, null); + return { + branchName: row.branch!, + commitSha: "", + taskId: row.taskId, + taskTitle: row.title ?? null, + taskStatus: row.status ?? null, + pullRequest: pr, + ciStatus: null, + updatedAt: Math.max(row.updatedAtMs ?? 0, pr?.updatedAtMs ?? 0, now), + }; + }) + .sort((a: any, b: any) => b.updatedAt - a.updatedAt); + + return { + organizationId: c.state.organizationId, + repoId, + remoteUrl, + baseRef: repository?.defaultBranch ?? null, + fetchedAt: now, + branches, + }; +} + +export async function getRepositoryMetadataFromOrg( + c: any, + repoId: string, +): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> { + const repository = await resolveGitHubRepository(c, repoId); + const remoteUrl = await resolveRepositoryRemoteUrl(c, repoId); + return { + defaultBranch: repository?.defaultBranch ?? null, + fullName: repository?.fullName ?? 
null, + remoteUrl, + }; +} + +export async function findTaskForBranch(c: any, repoId: string, branchName: string): Promise<{ taskId: string | null }> { + const row = await c.db + .select({ taskId: taskSummaries.taskId }) + .from(taskSummaries) + .where(and(eq(taskSummaries.branch, branchName), eq(taskSummaries.repoId, repoId))) + .get(); + return { taskId: row?.taskId ?? null }; +} diff --git a/foundry/packages/backend/src/actors/organization/actions/tasks.ts b/foundry/packages/backend/src/actors/organization/actions/tasks.ts new file mode 100644 index 0000000..80bb2f9 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/actions/tasks.ts @@ -0,0 +1,377 @@ +// @ts-nocheck +import { desc, eq } from "drizzle-orm"; +import type { + AuditLogEvent, + CreateTaskInput, + HistoryQueryInput, + ListTasksInput, + RepoOverview, + SwitchResult, + TaskRecord, + TaskSummary, + TaskWorkspaceChangeModelInput, + TaskWorkspaceChangeOwnerInput, + TaskWorkspaceCreateTaskInput, + TaskWorkspaceDiffInput, + TaskWorkspaceRenameInput, + TaskWorkspaceRenameSessionInput, + TaskWorkspaceSelectInput, + TaskWorkspaceSetSessionUnreadInput, + TaskWorkspaceSendMessageInput, + TaskWorkspaceSessionInput, + TaskWorkspaceUpdateDraftInput, +} from "@sandbox-agent/foundry-shared"; +import { getActorRuntimeContext } from "../../context.js"; +import { getOrCreateAuditLog, getOrCreateTask, getTask as getTaskHandle } from "../../handles.js"; +import { defaultSandboxProviderId } from "../../../sandbox-config.js"; +import { logActorWarning, resolveErrorMessage } from "../../logging.js"; +import { taskWorkflowQueueName } from "../../task/workflow/queue.js"; +import { expectQueueResponse } from "../../../services/queue.js"; +import { taskIndex, taskSummaries } from "../db/schema.js"; +import { + createTaskMutation, + getRepoOverviewFromOrg, + getRepositoryMetadataFromOrg, + findTaskForBranch, + listTaskSummariesForRepo, + listAllTaskSummaries, +} from "./task-mutations.js"; + +function 
assertOrganization(c: { state: { organizationId: string } }, organizationId: string): void { + if (organizationId !== c.state.organizationId) { + throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} command=${organizationId}`); + } +} + +/** + * Look up the repoId for a task from the local task index. + * Used when callers (e.g. sandbox actor) only have taskId but need repoId + * to construct the task actor key. + */ +async function resolveTaskRepoId(c: any, taskId: string): Promise<string> { + const row = await c.db.select({ repoId: taskIndex.repoId }).from(taskIndex).where(eq(taskIndex.taskId, taskId)).get(); + if (!row) { + throw new Error(`Task ${taskId} not found in task index`); + } + return row.repoId; +} + +/** + * Get or lazily create a task actor for a user-initiated action. + * Uses getOrCreate because the user may be interacting with a virtual task + * (PR-driven) that has no actor yet. The task actor self-initializes in + * getCurrentRecord() from the org's getTaskIndexEntry data. + * + * This is safe because requireWorkspaceTask is only called from user-initiated + * actions (createSession, sendMessage, etc.), never from sync loops. + * See CLAUDE.md "Lazy Task Actor Creation". + */ +async function requireWorkspaceTask(c: any, repoId: string, taskId: string) { + return getOrCreateTask(c, c.state.organizationId, repoId, taskId, { + organizationId: c.state.organizationId, + repoId, + taskId, + }); +} + +interface GetTaskInput { + organizationId: string; + repoId: string; + taskId: string; +} + +interface TaskProxyActionInput extends GetTaskInput { + reason?: string; +} + +interface RepoOverviewInput { + organizationId: string; + repoId: string; +} + +export { createTaskMutation }; + +export const organizationTaskActions = { + async createTask(c: any, input: CreateTaskInput): Promise<TaskRecord> { + assertOrganization(c, input.organizationId); + const { config } = getActorRuntimeContext(); + const sandboxProviderId = input.sandboxProviderId ??
defaultSandboxProviderId(config); + + // Self-call: call the mutation directly since we're inside the org actor + return await createTaskMutation(c, { + repoId: input.repoId, + task: input.task, + sandboxProviderId, + explicitTitle: input.explicitTitle ?? null, + explicitBranchName: input.explicitBranchName ?? null, + onBranch: input.onBranch ?? null, + }); + }, + + async materializeTask(c: any, input: { organizationId: string; repoId: string; virtualTaskId: string }): Promise<TaskRecord> { + assertOrganization(c, input.organizationId); + const { config } = getActorRuntimeContext(); + // Self-call: call the mutation directly + return await createTaskMutation(c, { + repoId: input.repoId, + task: input.virtualTaskId, + sandboxProviderId: defaultSandboxProviderId(config), + explicitTitle: null, + explicitBranchName: null, + onBranch: null, + }); + }, + + async createWorkspaceTask(c: any, input: TaskWorkspaceCreateTaskInput): Promise<{ taskId: string; sessionId?: string }> { + const created = await organizationTaskActions.createTask(c, { + organizationId: c.state.organizationId, + repoId: input.repoId, + task: input.task, + ...(input.title ? { explicitTitle: input.title } : {}), + ...(input.onBranch ? { onBranch: input.onBranch } : input.branch ?
{ explicitBranchName: input.branch } : {}), + }); + + const task = await requireWorkspaceTask(c, input.repoId, created.taskId); + void task + .send( + taskWorkflowQueueName("task.command.workspace.create_session_and_send"), + { + model: input.model, + text: input.task, + authSessionId: input.authSessionId, + }, + { wait: false }, + ) + .catch(() => {}); + + return { taskId: created.taskId }; + }, + + async markWorkspaceUnread(c: any, input: TaskWorkspaceSelectInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.markUnread({ authSessionId: input.authSessionId }); + }, + + async renameWorkspaceTask(c: any, input: TaskWorkspaceRenameInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.renameTask({ value: input.value }); + }, + + async createWorkspaceSession(c: any, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + return expectQueueResponse( + await task.send( + taskWorkflowQueueName("task.command.workspace.create_session"), + { + ...(input.model ? { model: input.model } : {}), + ...(input.authSessionId ?
{ authSessionId: input.authSessionId } : {}), + }, + { wait: true, timeout: 10_000 }, + ), + ); + }, + + async renameWorkspaceSession(c: any, input: TaskWorkspaceRenameSessionInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.renameSession({ sessionId: input.sessionId, title: input.title }); + }, + + async selectWorkspaceSession(c: any, input: TaskWorkspaceSessionInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.selectSession({ sessionId: input.sessionId, authSessionId: input.authSessionId }); + }, + + async setWorkspaceSessionUnread(c: any, input: TaskWorkspaceSetSessionUnreadInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.setSessionUnread({ sessionId: input.sessionId, unread: input.unread, authSessionId: input.authSessionId }); + }, + + async updateWorkspaceDraft(c: any, input: TaskWorkspaceUpdateDraftInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task + .updateDraft({ + sessionId: input.sessionId, + text: input.text, + attachments: input.attachments, + authSessionId: input.authSessionId, + }) + .catch(() => {}); + }, + + async changeWorkspaceModel(c: any, input: TaskWorkspaceChangeModelInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.changeModel({ sessionId: input.sessionId, model: input.model, authSessionId: input.authSessionId }); + }, + + async sendWorkspaceMessage(c: any, input: TaskWorkspaceSendMessageInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task + .send( + taskWorkflowQueueName("task.command.workspace.send_message"), + { + sessionId: input.sessionId, + text: input.text, + attachments: input.attachments, + authSessionId: input.authSessionId, + }, + { wait: false }, + ) + .catch(() => {}); + }, + + async stopWorkspaceSession(c: any,
input: TaskWorkspaceSessionInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task + .send(taskWorkflowQueueName("task.command.workspace.stop_session"), { sessionId: input.sessionId, authSessionId: input.authSessionId }, { wait: false }) + .catch(() => {}); + }, + + async closeWorkspaceSession(c: any, input: TaskWorkspaceSessionInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task + .send(taskWorkflowQueueName("task.command.workspace.close_session"), { sessionId: input.sessionId, authSessionId: input.authSessionId }, { wait: false }) + .catch(() => {}); + }, + + async publishWorkspacePr(c: any, input: TaskWorkspaceSelectInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task.send(taskWorkflowQueueName("task.command.workspace.publish_pr"), {}, { wait: false }).catch(() => {}); + }, + + async changeWorkspaceTaskOwner(c: any, input: TaskWorkspaceChangeOwnerInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + await task.send( + taskWorkflowQueueName("task.command.workspace.change_owner"), + { + primaryUserId: input.targetUserId, + primaryGithubLogin: input.targetUserName, + primaryGithubEmail: input.targetUserEmail, + primaryGithubAvatarUrl: null, + }, + { wait: false }, + ); + }, + + async revertWorkspaceFile(c: any, input: TaskWorkspaceDiffInput): Promise<void> { + const task = await requireWorkspaceTask(c, input.repoId, input.taskId); + void task.send(taskWorkflowQueueName("task.command.workspace.revert_file"), input, { wait: false }).catch(() => {}); + }, + + async getRepoOverview(c: any, input: RepoOverviewInput): Promise<RepoOverview> { + assertOrganization(c, input.organizationId); + + return await getRepoOverviewFromOrg(c, input.repoId); + }, + + async listTasks(c: any, input: ListTasksInput): Promise<TaskSummary[]> { + assertOrganization(c, input.organizationId); + if (input.repoId) { + return await
listTaskSummariesForRepo(c, input.repoId, true); + } + return await listAllTaskSummaries(c, true); + }, + + async switchTask(c: any, input: { repoId: string; taskId: string }): Promise<SwitchResult> { + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + const record = await h.get(); + const switched = expectQueueResponse<{ switchTarget: string | null }>( + await h.send(taskWorkflowQueueName("task.command.switch"), {}, { wait: true, timeout: 10_000 }), + ); + return { + organizationId: c.state.organizationId, + taskId: input.taskId, + sandboxProviderId: record.sandboxProviderId, + switchTarget: switched.switchTarget, + }; + }, + + async auditLog(c: any, input: HistoryQueryInput): Promise<AuditLogEvent[]> { + assertOrganization(c, input.organizationId); + const auditLog = await getOrCreateAuditLog(c, c.state.organizationId); + return await auditLog.list({ + repoId: input.repoId, + branch: input.branch, + taskId: input.taskId, + limit: input.limit ?? 20, + }); + }, + + async getTask(c: any, input: GetTaskInput): Promise<TaskRecord> { + assertOrganization(c, input.organizationId); + // Resolve repoId from local task index if not provided (e.g. sandbox actor only has taskId) + const repoId = input.repoId || (await resolveTaskRepoId(c, input.taskId)); + // Use getOrCreate — the task may be virtual (PR-driven, no actor yet). + // The task actor self-initializes in getCurrentRecord().
+ const handle = await getOrCreateTask(c, c.state.organizationId, repoId, input.taskId, { + organizationId: c.state.organizationId, + repoId, + taskId: input.taskId, + }); + return await handle.get(); + }, + + async attachTask(c: any, input: TaskProxyActionInput): Promise<{ target: string; sessionId: string | null }> { + assertOrganization(c, input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + return expectQueueResponse(await h.send(taskWorkflowQueueName("task.command.attach"), { reason: input.reason }, { wait: true, timeout: 10_000 })); + }, + + async pushTask(c: any, input: TaskProxyActionInput): Promise<void> { + assertOrganization(c, input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + void h.send(taskWorkflowQueueName("task.command.push"), { reason: input.reason }, { wait: false }).catch(() => {}); + }, + + async syncTask(c: any, input: TaskProxyActionInput): Promise<void> { + assertOrganization(c, input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + void h.send(taskWorkflowQueueName("task.command.sync"), { reason: input.reason }, { wait: false }).catch(() => {}); + }, + + async mergeTask(c: any, input: TaskProxyActionInput): Promise<void> { + assertOrganization(c, input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + void h.send(taskWorkflowQueueName("task.command.merge"), { reason: input.reason }, { wait: false }).catch(() => {}); + }, + + async archiveTask(c: any, input: TaskProxyActionInput): Promise<void> { + assertOrganization(c, input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + void h.send(taskWorkflowQueueName("task.command.archive"), { reason: input.reason }, { wait: false }).catch(() => {}); + }, + + async killTask(c: any,
input.organizationId); + + const h = getTaskHandle(c, c.state.organizationId, input.repoId, input.taskId); + void h.send(taskWorkflowQueueName("task.command.kill"), { reason: input.reason }, { wait: false }).catch(() => {}); + }, + + async getRepositoryMetadata(c: any, input: { repoId: string }): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> { + return await getRepositoryMetadataFromOrg(c, input.repoId); + }, + + async findTaskForBranch(c: any, input: { repoId: string; branchName: string }): Promise<{ taskId: string | null }> { + return await findTaskForBranch(c, input.repoId, input.branchName); + }, + + /** + * Lightweight read of task index + summary data. Used by the task actor + * to self-initialize when lazily materialized from a virtual task. + * Does NOT trigger materialization — no circular dependency. + */ + async getTaskIndexEntry(c: any, input: { taskId: string }): Promise<{ branchName: string | null; title: string | null } | null> { + const idx = await c.db.select({ branchName: taskIndex.branchName }).from(taskIndex).where(eq(taskIndex.taskId, input.taskId)).get(); + const summary = await c.db.select({ title: taskSummaries.title }).from(taskSummaries).where(eq(taskSummaries.taskId, input.taskId)).get(); + if (!idx && !summary) return null; + return { + branchName: idx?.branchName ?? null, + title: summary?.title ?? 
null, + }; + }, +}; diff --git a/foundry/packages/backend/src/actors/organization/app-shell.ts b/foundry/packages/backend/src/actors/organization/app-shell.ts index 3339590..ed1005a 100644 --- a/foundry/packages/backend/src/actors/organization/app-shell.ts +++ b/foundry/packages/backend/src/actors/organization/app-shell.ts @@ -1,4 +1,4 @@ -import { and, asc, count as sqlCount, desc, eq, gt, gte, inArray, isNotNull, isNull, like, lt, lte, ne, notInArray, or } from "drizzle-orm"; +import { desc, eq } from "drizzle-orm"; import { randomUUID } from "node:crypto"; import type { FoundryAppSnapshot, @@ -8,109 +8,37 @@ import type { FoundryOrganizationMember, FoundryUser, UpdateFoundryOrganizationProfileInput, + WorkspaceModelId, } from "@sandbox-agent/foundry-shared"; +import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared"; import { getActorRuntimeContext } from "../context.js"; import { getOrCreateGithubData, getOrCreateOrganization, selfOrganization } from "../handles.js"; import { GitHubAppError } from "../../services/app-github.js"; import { getBetterAuthService } from "../../services/better-auth.js"; -import { repoIdFromRemote, repoLabelFromRemote } from "../../services/repo.js"; +import { repoLabelFromRemote } from "../../services/repo.js"; import { logger } from "../../logging.js"; -import { - authAccountIndex, - authEmailIndex, - authSessionIndex, - authVerification, - invoices, - organizationMembers, - organizationProfile, - repos, - seatAssignments, - stripeLookup, -} from "./db/schema.js"; - -export const APP_SHELL_ORGANIZATION_ID = "app"; - -// ── Better Auth adapter where-clause helpers ── -// These convert the adapter's `{ field, value, operator }` clause arrays into -// Drizzle predicates for organization-level auth index / verification tables. 
- -function organizationAuthColumn(table: any, field: string): any { - const column = table[field]; - if (!column) { - throw new Error(`Unknown auth table field: ${field}`); - } - return column; -} - -function normalizeAuthValue(value: unknown): unknown { - if (value instanceof Date) { - return value.getTime(); - } - if (Array.isArray(value)) { - return value.map((entry) => normalizeAuthValue(entry)); - } - return value; -} - -function organizationAuthClause(table: any, clause: { field: string; value: unknown; operator?: string }): any { - const column = organizationAuthColumn(table, clause.field); - const value = normalizeAuthValue(clause.value); - switch (clause.operator) { - case "ne": - return value === null ? isNotNull(column) : ne(column, value as any); - case "lt": - return lt(column, value as any); - case "lte": - return lte(column, value as any); - case "gt": - return gt(column, value as any); - case "gte": - return gte(column, value as any); - case "in": - return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "not_in": - return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); - case "contains": - return like(column, `%${String(value ?? "")}%`); - case "starts_with": - return like(column, `${String(value ?? "")}%`); - case "ends_with": - return like(column, `%${String(value ?? "")}`); - case "eq": - default: - return value === null ? isNull(column) : eq(column, value as any); - } -} - -function organizationAuthWhere(table: any, clauses: any[] | undefined): any { - if (!clauses || clauses.length === 0) { - return undefined; - } - let expr = organizationAuthClause(table, clauses[0]); - for (const clause of clauses.slice(1)) { - const next = organizationAuthClause(table, clause); - expr = clause.connector === "OR" ? 
or(expr, next) : and(expr, next); - } - return expr; -} +import { githubDataWorkflowQueueName } from "../github-data/index.js"; +import { organizationWorkflowQueueName } from "./queues.js"; +import { invoices, organizationMembers, organizationProfile, seatAssignments, stripeLookup } from "./db/schema.js"; +import { APP_SHELL_ORGANIZATION_ID } from "./constants.js"; const githubWebhookLogger = logger.child({ scope: "github-webhook", }); -const PROFILE_ROW_ID = "profile"; +const PROFILE_ROW_ID = 1; function roundDurationMs(start: number): number { return Math.round((performance.now() - start) * 100) / 100; } -function assertAppOrganization(c: any): void { +export function assertAppOrganization(c: any): void { if (c.state.organizationId !== APP_SHELL_ORGANIZATION_ID) { throw new Error(`App shell action requires organization ${APP_SHELL_ORGANIZATION_ID}, got ${c.state.organizationId}`); } } -function assertOrganizationShell(c: any): void { +export function assertOrganizationShell(c: any): void { if (c.state.organizationId === APP_SHELL_ORGANIZATION_ID) { throw new Error("Organization action cannot run on the reserved app organization"); } @@ -132,10 +60,6 @@ function organizationOrganizationId(kind: FoundryOrganization["kind"], login: st return kind === "personal" ? 
personalOrganizationId(login) : slugify(login); } -function hasRepoScope(scopes: string[]): boolean { - return scopes.some((scope) => scope === "repo" || scope.startsWith("repo:")); -} - function parseEligibleOrganizationIds(value: string): string[] { try { const parsed = JSON.parse(value); @@ -217,7 +141,9 @@ function stripeWebhookSubscription(event: any) { }; } -async function getOrganizationState(organization: any) { +// sendOrganizationCommand removed — org actions called directly + +export async function getOrganizationState(organization: any) { return await organization.getOrganizationShellState({}); } @@ -290,7 +216,7 @@ async function listSnapshotOrganizations(c: any, sessionId: string, organization }; } -async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepair = true): Promise { +export async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepair = true): Promise { assertAppOrganization(c); const startedAt = performance.now(); const auth = getBetterAuthService(); @@ -359,6 +285,7 @@ async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepa githubLogin: profile?.githubLogin ?? "", roleLabel: profile?.roleLabel ?? "GitHub user", eligibleOrganizationIds, + defaultModel: profile?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID, } : null; @@ -404,7 +331,7 @@ async function buildAppSnapshot(c: any, sessionId: string, allowOrganizationRepa return snapshot; } -async function requireSignedInSession(c: any, sessionId: string) { +export async function requireSignedInSession(c: any, sessionId: string) { const auth = getBetterAuthService(); const authState = await auth.getAuthState(sessionId); const user = authState?.user ?? 
null; @@ -431,7 +358,7 @@ async function requireSignedInSession(c: any, sessionId: string) { }; } -function requireEligibleOrganization(session: any, organizationId: string): void { +export function requireEligibleOrganization(session: any, organizationId: string): void { const eligibleOrganizationIds = parseEligibleOrganizationIds(session.eligibleOrganizationIdsJson); if (!eligibleOrganizationIds.includes(organizationId)) { throw new Error(`Organization ${organizationId} is not available in this app session`); @@ -557,19 +484,23 @@ async function syncGithubOrganizationsInternal(c: any, input: { sessionId: strin const organizationId = organizationOrganizationId(account.kind, account.githubLogin); const installation = installations.find((candidate) => candidate.accountLogin === account.githubLogin) ?? null; const organization = await getOrCreateOrganization(c, organizationId); - await organization.syncOrganizationShellFromGithub({ - userId: githubUserId, - userName: viewer.name || viewer.login, - userEmail: viewer.email ?? `${viewer.login}@users.noreply.github.com`, - githubUserLogin: viewer.login, - githubAccountId: account.githubAccountId, - githubLogin: account.githubLogin, - githubAccountType: account.githubAccountType, - kind: account.kind, - displayName: account.displayName, - installationId: installation?.id ?? null, - appConfigured: appShell.github.isAppConfigured(), - }); + await organization.send( + organizationWorkflowQueueName("organization.command.github.organization_shell.sync_from_github"), + { + userId: githubUserId, + userName: viewer.name || viewer.login, + userEmail: viewer.email ?? `${viewer.login}@users.noreply.github.com`, + githubUserLogin: viewer.login, + githubAccountId: account.githubAccountId, + githubLogin: account.githubLogin, + githubAccountType: account.githubAccountType, + kind: account.kind, + displayName: account.displayName, + installationId: installation?.id ?? 
null, + appConfigured: appShell.github.isAppConfigured(), + }, + { wait: true, timeout: 10_000 }, + ); linkedOrganizationIds.push(organizationId); } @@ -641,17 +572,22 @@ async function listOrganizationInvoices(c: any): Promise { assertOrganizationShell(c); - const rows = await c.db.select({ remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all(); - return rows.map((row) => repoLabelFromRemote(row.remoteUrl)).sort((left, right) => left.localeCompare(right)); + try { + const githubData = await getOrCreateGithubData(c, c.state.organizationId); + const rows = await githubData.listRepositories({}); + return rows.map((row: any) => repoLabelFromRemote(row.cloneUrl)).sort((a: string, b: string) => a.localeCompare(b)); + } catch { + return []; + } } -async function buildOrganizationState(c: any) { +export async function buildOrganizationState(c: any) { const startedAt = performance.now(); const row = await requireOrganizationProfileRow(c); return await buildOrganizationStateFromRow(c, row, startedAt); } -async function buildOrganizationStateIfInitialized(c: any) { +export async function buildOrganizationStateIfInitialized(c: any) { const startedAt = performance.now(); const row = await readOrganizationProfileRow(c); if (!row) { @@ -685,7 +621,6 @@ async function buildOrganizationStateFromRow(c: any, row: any, startedAt: number slug: row.slug, primaryDomain: row.primaryDomain, seatAccrualMode: "first_prompt", - defaultModel: row.defaultModel, autoImportRepos: row.autoImportRepos === 1, }, github: { @@ -697,6 +632,10 @@ async function buildOrganizationStateFromRow(c: any, row: any, startedAt: number lastSyncAt: row.githubLastSyncAt ?? null, lastWebhookAt: row.githubLastWebhookAt ?? null, lastWebhookEvent: row.githubLastWebhookEvent ?? "", + syncGeneration: row.githubSyncGeneration ?? 0, + syncPhase: row.githubSyncPhase ?? null, + processedRepositoryCount: row.githubProcessedRepositoryCount ?? 0, + totalRepositoryCount: row.githubTotalRepositoryCount ?? 
0, }, billing: { planId: row.billingPlanId, @@ -744,396 +683,14 @@ async function applySubscriptionState( }, fallbackPlanId: FoundryBillingPlanId, ): Promise { - await organization.applyOrganizationStripeSubscription({ - subscription, - fallbackPlanId, - }); + await organization.send( + organizationWorkflowQueueName("organization.command.billing.stripe_subscription.apply"), + { subscription, fallbackPlanId }, + { wait: true, timeout: 10_000 }, + ); } export const organizationAppActions = { - async authFindSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { - assertAppOrganization(c); - - const clauses = [ - ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), - ...(input.sessionToken ? [{ field: "sessionToken", value: input.sessionToken }] : []), - ]; - if (clauses.length === 0) { - return null; - } - const predicate = organizationAuthWhere(authSessionIndex, clauses); - return await c.db.select().from(authSessionIndex).where(predicate!).get(); - }, - - async authUpsertSessionIndex(c: any, input: { sessionId: string; sessionToken: string; userId: string }) { - assertAppOrganization(c); - - const now = Date.now(); - await c.db - .insert(authSessionIndex) - .values({ - sessionId: input.sessionId, - sessionToken: input.sessionToken, - userId: input.userId, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: authSessionIndex.sessionId, - set: { - sessionToken: input.sessionToken, - userId: input.userId, - updatedAt: now, - }, - }) - .run(); - return await c.db.select().from(authSessionIndex).where(eq(authSessionIndex.sessionId, input.sessionId)).get(); - }, - - async authDeleteSessionIndex(c: any, input: { sessionId?: string; sessionToken?: string }) { - assertAppOrganization(c); - - const clauses = [ - ...(input.sessionId ? [{ field: "sessionId", value: input.sessionId }] : []), - ...(input.sessionToken ? 
[{ field: "sessionToken", value: input.sessionToken }] : []),
-		];
-		if (clauses.length === 0) {
-			return;
-		}
-		const predicate = organizationAuthWhere(authSessionIndex, clauses);
-		await c.db.delete(authSessionIndex).where(predicate!).run();
-	},
-
-	async authFindEmailIndex(c: any, input: { email: string }) {
-		assertAppOrganization(c);
-
-		return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get();
-	},
-
-	async authUpsertEmailIndex(c: any, input: { email: string; userId: string }) {
-		assertAppOrganization(c);
-
-		const now = Date.now();
-		await c.db
-			.insert(authEmailIndex)
-			.values({
-				email: input.email,
-				userId: input.userId,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: authEmailIndex.email,
-				set: {
-					userId: input.userId,
-					updatedAt: now,
-				},
-			})
-			.run();
-		return await c.db.select().from(authEmailIndex).where(eq(authEmailIndex.email, input.email)).get();
-	},
-
-	async authDeleteEmailIndex(c: any, input: { email: string }) {
-		assertAppOrganization(c);
-
-		await c.db.delete(authEmailIndex).where(eq(authEmailIndex.email, input.email)).run();
-	},
-
-	async authFindAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) {
-		assertAppOrganization(c);
-
-		if (input.id) {
-			return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get();
-		}
-		if (!input.providerId || !input.accountId) {
-			return null;
-		}
-		return await c.db
-			.select()
-			.from(authAccountIndex)
-			.where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId)))
-			.get();
-	},
-
-	async authUpsertAccountIndex(c: any, input: { id: string; providerId: string; accountId: string; userId: string }) {
-		assertAppOrganization(c);
-
-		const now = Date.now();
-		await c.db
-			.insert(authAccountIndex)
-			.values({
-				id: input.id,
-				providerId: input.providerId,
-				accountId: input.accountId,
-				userId: input.userId,
-				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: authAccountIndex.id,
-				set: {
-					providerId: input.providerId,
-					accountId: input.accountId,
-					userId: input.userId,
-					updatedAt: now,
-				},
-			})
-			.run();
-		return await c.db.select().from(authAccountIndex).where(eq(authAccountIndex.id, input.id)).get();
-	},
-
-	async authDeleteAccountIndex(c: any, input: { id?: string; providerId?: string; accountId?: string }) {
-		assertAppOrganization(c);
-
-		if (input.id) {
-			await c.db.delete(authAccountIndex).where(eq(authAccountIndex.id, input.id)).run();
-			return;
-		}
-		if (input.providerId && input.accountId) {
-			await c.db
-				.delete(authAccountIndex)
-				.where(and(eq(authAccountIndex.providerId, input.providerId), eq(authAccountIndex.accountId, input.accountId)))
-				.run();
-		}
-	},
-
-	async authCreateVerification(c: any, input: { data: Record }) {
-		assertAppOrganization(c);
-
-		await c.db
-			.insert(authVerification)
-			.values(input.data as any)
-			.run();
-		return await c.db
-			.select()
-			.from(authVerification)
-			.where(eq(authVerification.id, input.data.id as string))
-			.get();
-	},
-
-	async authFindOneVerification(c: any, input: { where: any[] }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		return predicate ? await c.db.select().from(authVerification).where(predicate).get() : null;
-	},
-
-	async authFindManyVerification(c: any, input: { where?: any[]; limit?: number; sortBy?: any; offset?: number }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		let query = c.db.select().from(authVerification);
-		if (predicate) {
-			query = query.where(predicate);
-		}
-		if (input.sortBy?.field) {
-			const column = organizationAuthColumn(authVerification, input.sortBy.field);
-			query = query.orderBy(input.sortBy.direction === "asc" ? asc(column) : desc(column));
-		}
-		if (typeof input.limit === "number") {
-			query = query.limit(input.limit);
-		}
-		if (typeof input.offset === "number") {
-			query = query.offset(input.offset);
-		}
-		return await query.all();
-	},
-
-	async authUpdateVerification(c: any, input: { where: any[]; update: Record }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		if (!predicate) {
-			return null;
-		}
-		await c.db
-			.update(authVerification)
-			.set(input.update as any)
-			.where(predicate)
-			.run();
-		return await c.db.select().from(authVerification).where(predicate).get();
-	},
-
-	async authUpdateManyVerification(c: any, input: { where: any[]; update: Record }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		if (!predicate) {
-			return 0;
-		}
-		await c.db
-			.update(authVerification)
-			.set(input.update as any)
-			.where(predicate)
-			.run();
-		const row = await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get();
-		return row?.value ?? 0;
-	},
-
-	async authDeleteVerification(c: any, input: { where: any[] }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		if (!predicate) {
-			return;
-		}
-		await c.db.delete(authVerification).where(predicate).run();
-	},
-
-	async authDeleteManyVerification(c: any, input: { where: any[] }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		if (!predicate) {
-			return 0;
-		}
-		const rows = await c.db.select().from(authVerification).where(predicate).all();
-		await c.db.delete(authVerification).where(predicate).run();
-		return rows.length;
-	},
-
-	async authCountVerification(c: any, input: { where?: any[] }) {
-		assertAppOrganization(c);
-
-		const predicate = organizationAuthWhere(authVerification, input.where);
-		const row = predicate
-			? await c.db.select({ value: sqlCount() }).from(authVerification).where(predicate).get()
-			: await c.db.select({ value: sqlCount() }).from(authVerification).get();
-		return row?.value ?? 0;
-	},
-
-	async getAppSnapshot(c: any, input: { sessionId: string }): Promise {
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async resolveAppGithubToken(
-		c: any,
-		input: { organizationId: string; requireRepoScope?: boolean },
-	): Promise<{ accessToken: string; scopes: string[] } | null> {
-		assertAppOrganization(c);
-		const auth = getBetterAuthService();
-		const rows = await c.db.select().from(authSessionIndex).orderBy(desc(authSessionIndex.updatedAt)).all();
-
-		for (const row of rows) {
-			const authState = await auth.getAuthState(row.sessionId);
-			if (authState?.sessionState?.activeOrganizationId !== input.organizationId) {
-				continue;
-			}
-
-			const token = await auth.getAccessTokenForSession(row.sessionId);
-			if (!token?.accessToken) {
-				continue;
-			}
-
-			const scopes = token.scopes;
-			if (input.requireRepoScope !== false && scopes.length > 0 && !hasRepoScope(scopes)) {
-				continue;
-			}
-
-			return {
-				accessToken: token.accessToken,
-				scopes,
-			};
-		}
-
-		return null;
-	},
-
-	async skipAppStarterRepo(c: any, input: { sessionId: string }): Promise {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		await getBetterAuthService().upsertUserProfile(session.authUserId, {
-			starterRepoStatus: "skipped",
-			starterRepoSkippedAt: Date.now(),
-			starterRepoStarredAt: null,
-		});
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async starAppStarterRepo(c: any, input: { sessionId: string; organizationId: string }): Promise {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		requireEligibleOrganization(session, input.organizationId);
-		const organization = await getOrCreateOrganization(c, input.organizationId);
-		await organization.starSandboxAgentRepo({
-			organizationId: input.organizationId,
-		});
-		await getBetterAuthService().upsertUserProfile(session.authUserId, {
-			starterRepoStatus: "starred",
-			starterRepoStarredAt: Date.now(),
-			starterRepoSkippedAt: null,
-		});
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async selectAppOrganization(c: any, input: { sessionId: string; organizationId: string }): Promise {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		requireEligibleOrganization(session, input.organizationId);
-		await getBetterAuthService().setActiveOrganization(input.sessionId, input.organizationId);
-
-		// Ensure the GitHub data actor exists. If it's newly created, its own
-		// workflow will detect the pending sync status and run the initial
-		// full sync automatically — no orchestration needed here.
-		await getOrCreateGithubData(c, input.organizationId);
-
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async updateAppOrganizationProfile(
-		c: any,
-		input: { sessionId: string; organizationId: string } & UpdateFoundryOrganizationProfileInput,
-	): Promise {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		requireEligibleOrganization(session, input.organizationId);
-		const organization = await getOrCreateOrganization(c, input.organizationId);
-		await organization.updateOrganizationShellProfile({
-			displayName: input.displayName,
-			slug: input.slug,
-			primaryDomain: input.primaryDomain,
-		});
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async triggerAppRepoImport(c: any, input: { sessionId: string; organizationId: string }): Promise {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		requireEligibleOrganization(session, input.organizationId);
-
-		const githubData = await getOrCreateGithubData(c, input.organizationId);
-		const summary = await githubData.getSummary({});
-		if (summary.syncStatus === "syncing") {
-			return await buildAppSnapshot(c, input.sessionId);
-		}
-
-		// Mark sync started on the organization, then send directly to the
-		// GitHub data actor's own workflow queue.
-		const organizationHandle = await getOrCreateOrganization(c, input.organizationId);
-		await organizationHandle.markOrganizationSyncStarted({
-			label: "Importing repository catalog...",
-		});
-
-		await githubData.send("githubData.command.syncRepos", { label: "Importing repository catalog..." }, { wait: false });
-
-		return await buildAppSnapshot(c, input.sessionId);
-	},
-
-	async beginAppGithubInstall(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> {
-		assertAppOrganization(c);
-		const session = await requireSignedInSession(c, input.sessionId);
-		requireEligibleOrganization(session, input.organizationId);
-		const { appShell } = getActorRuntimeContext();
-		const organizationHandle = await getOrCreateOrganization(c, input.organizationId);
-		const organizationState = await getOrganizationState(organizationHandle);
-		if (organizationState.snapshot.kind !== "organization") {
-			return {
-				url: `${appShell.appUrl}/organizations/${input.organizationId}`,
-			};
-		}
-		return {
-			url: await appShell.github.buildInstallationUrl(organizationState.githubLogin, randomUUID()),
-		};
-	},
-
 	async createAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; planId: FoundryBillingPlanId }): Promise<{ url: string }> {
 		assertAppOrganization(c);
 		const session = await requireSignedInSession(c, input.sessionId);
@@ -1143,7 +700,11 @@ export const organizationAppActions = {
 		const organizationState = await getOrganizationState(organizationHandle);
 
 		if (input.planId === "free") {
-			await organizationHandle.applyOrganizationFreePlan({ clearSubscription: false });
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.free_plan.apply"),
+				{ clearSubscription: false },
+				{ wait: true, timeout: 10_000 },
+			);
 			return {
 				url: `${appShell.appUrl}/organizations/${input.organizationId}/billing`,
 			};
@@ -1162,7 +723,11 @@ export const organizationAppActions = {
 					email: session.currentUserEmail,
 				})
 			).id;
-			await organizationHandle.applyOrganizationStripeCustomer({ customerId });
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"),
+				{ customerId },
+				{ wait: true, timeout: 10_000 },
+			);
 			await upsertStripeLookupEntries(c, input.organizationId, customerId, null);
 		}
 
@@ -1190,7 +755,11 @@ export const organizationAppActions = {
 		const completion = await appShell.stripe.retrieveCheckoutCompletion(input.checkoutSessionId);
 
 		if (completion.customerId) {
-			await organizationHandle.applyOrganizationStripeCustomer({ customerId: completion.customerId });
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"),
+				{ customerId: completion.customerId },
+				{ wait: true, timeout: 10_000 },
+			);
 		}
 
 		await upsertStripeLookupEntries(c, input.organizationId, completion.customerId, completion.subscriptionId);
@@ -1200,9 +769,11 @@ export const organizationAppActions = {
 		}
 
 		if (completion.paymentMethodLabel) {
-			await organizationHandle.setOrganizationBillingPaymentMethod({
-				label: completion.paymentMethodLabel,
-			});
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.payment_method.set"),
+				{ label: completion.paymentMethodLabel },
+				{ wait: true, timeout: 10_000 },
+			);
 		}
 
 		return {
@@ -1240,7 +811,11 @@ export const organizationAppActions = {
 			await applySubscriptionState(organizationHandle, subscription, organizationState.billingPlanId);
 			await upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? organizationState.stripeCustomerId, subscription.id);
 		} else {
-			await organizationHandle.setOrganizationBillingStatus({ status: "scheduled_cancel" });
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.status.set"),
+				{ status: "scheduled_cancel" },
+				{ wait: true, timeout: 10_000 },
+			);
 		}
 
 		return await buildAppSnapshot(c, input.sessionId);
@@ -1259,7 +834,11 @@ export const organizationAppActions = {
 			await applySubscriptionState(organizationHandle, subscription, organizationState.billingPlanId);
 			await upsertStripeLookupEntries(c, input.organizationId, subscription.customerId ?? organizationState.stripeCustomerId, subscription.id);
 		} else {
-			await organizationHandle.setOrganizationBillingStatus({ status: "active" });
+			await organizationHandle.send(
+				organizationWorkflowQueueName("organization.command.billing.status.set"),
+				{ status: "active" },
+				{ wait: true, timeout: 10_000 },
+			);
 		}
 
 		return await buildAppSnapshot(c, input.sessionId);
@@ -1270,9 +849,11 @@ export const organizationAppActions = {
 		const session = await requireSignedInSession(c, input.sessionId);
 		requireEligibleOrganization(session, input.organizationId);
 		const organization = await getOrCreateOrganization(c, input.organizationId);
-		await organization.recordOrganizationSeatUsage({
-			email: session.currentUserEmail,
-		});
+		await organization.send(
+			organizationWorkflowQueueName("organization.command.billing.seat_usage.record"),
+			{ email: session.currentUserEmail },
+			{ wait: true, timeout: 10_000 },
+		);
 		return await buildAppSnapshot(c, input.sessionId);
 	},
 
@@ -1293,7 +874,11 @@ export const organizationAppActions = {
 		if (organizationId) {
 			const organization = await getOrCreateOrganization(c, organizationId);
 			if (typeof object.customer === "string") {
-				await organization.applyOrganizationStripeCustomer({ customerId: object.customer });
+				await organization.send(
+					organizationWorkflowQueueName("organization.command.billing.stripe_customer.apply"),
+					{ customerId: object.customer },
+					{ wait: true, timeout: 10_000 },
+				);
 			}
 			await upsertStripeLookupEntries(
 				c,
@@ -1326,7 +911,11 @@ export const organizationAppActions = {
 			const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id);
 			if (organizationId) {
 				const organization = await getOrCreateOrganization(c, organizationId);
-				await organization.applyOrganizationFreePlan({ clearSubscription: true });
+				await organization.send(
+					organizationWorkflowQueueName("organization.command.billing.free_plan.apply"),
+					{ clearSubscription: true },
+					{ wait: true, timeout: 10_000 },
+				);
 			}
 			return { ok: true };
 		}
@@ -1338,13 +927,17 @@ export const organizationAppActions = {
 				const organization = await getOrCreateOrganization(c, organizationId);
 				const rawAmount = typeof invoice.amount_paid === "number" ? invoice.amount_paid : invoice.amount_due;
 				const amountUsd = Math.round((typeof rawAmount === "number" ? rawAmount : 0) / 100);
-				await organization.upsertOrganizationInvoice({
-					id: String(invoice.id),
-					label: typeof invoice.number === "string" ? `Invoice ${invoice.number}` : "Stripe invoice",
-					issuedAt: formatUnixDate(typeof invoice.created === "number" ? invoice.created : Math.floor(Date.now() / 1000)),
-					amountUsd: Number.isFinite(amountUsd) ? amountUsd : 0,
-					status: event.type === "invoice.paid" ? "paid" : "open",
-				});
+				await organization.send(
+					organizationWorkflowQueueName("organization.command.billing.invoice.upsert"),
+					{
+						id: String(invoice.id),
+						label: typeof invoice.number === "string" ? `Invoice ${invoice.number}` : "Stripe invoice",
+						issuedAt: formatUnixDate(typeof invoice.created === "number" ? invoice.created : Math.floor(Date.now() / 1000)),
+						amountUsd: Number.isFinite(amountUsd) ? amountUsd : 0,
+						status: event.type === "invoice.paid" ? "paid" : "open",
+					},
+					{ wait: true, timeout: 10_000 },
+				);
 			}
 		}
 
@@ -1374,12 +967,11 @@ export const organizationAppActions = {
 		const organizationId = organizationOrganizationId(kind, accountLogin);
 		const receivedAt = Date.now();
 		const organization = await getOrCreateOrganization(c, organizationId);
-		await organization.recordGithubWebhookReceipt({
-			organizationId: organizationId,
-			event,
-			action: body.action ?? null,
-			receivedAt,
-		});
+		await organization.send(
+			organizationWorkflowQueueName("organization.command.github.webhook_receipt.record"),
+			{ organizationId, event, action: body.action ?? null, receivedAt },
+			{ wait: false },
+		);
 		const githubData = await getOrCreateGithubData(c, organizationId);
 
 		if (event === "installation" && (body.action === "created" || body.action === "deleted" || body.action === "suspend" || body.action === "unsuspend")) {
@@ -1393,37 +985,52 @@ export const organizationAppActions = {
 				"installation_event",
 			);
 			if (body.action === "deleted") {
-				await githubData.clearState({
-					connectedAccount: accountLogin,
-					installationStatus: "install_required",
-					installationId: null,
-					label: "GitHub App installation removed",
-				});
+				await githubData.send(
+					githubDataWorkflowQueueName("githubData.command.clearState"),
+					{ connectedAccount: accountLogin, installationStatus: "install_required", installationId: null, label: "GitHub App installation removed" },
+					{ wait: false },
+				);
 			} else if (body.action === "created") {
-				await githubData.fullSync({
-					connectedAccount: accountLogin,
-					installationStatus: "connected",
-					installationId: body.installation?.id ?? null,
-					githubLogin: accountLogin,
-					kind,
-					label: "Syncing GitHub data from installation webhook...",
-				});
+				void githubData
+					.send(
+						githubDataWorkflowQueueName("githubData.command.syncRepos"),
+						{
+							connectedAccount: accountLogin,
+							installationStatus: "connected",
+							installationId: body.installation?.id ?? null,
+							githubLogin: accountLogin,
+							kind,
+							label: "Syncing GitHub data from installation webhook...",
+						},
+						{ wait: false },
+					)
+					.catch(() => {});
 			} else if (body.action === "suspend") {
-				await githubData.clearState({
-					connectedAccount: accountLogin,
-					installationStatus: "reconnect_required",
-					installationId: body.installation?.id ?? null,
-					label: "GitHub App installation suspended",
-				});
+				await githubData.send(
+					githubDataWorkflowQueueName("githubData.command.clearState"),
+					{
+						connectedAccount: accountLogin,
+						installationStatus: "reconnect_required",
+						installationId: body.installation?.id ?? null,
+						label: "GitHub App installation suspended",
+					},
+					{ wait: false },
+				);
 			} else if (body.action === "unsuspend") {
-				await githubData.fullSync({
-					connectedAccount: accountLogin,
-					installationStatus: "connected",
-					installationId: body.installation?.id ?? null,
-					githubLogin: accountLogin,
-					kind,
-					label: "Resyncing GitHub data after unsuspend...",
-				});
+				void githubData
+					.send(
+						githubDataWorkflowQueueName("githubData.command.syncRepos"),
+						{
+							connectedAccount: accountLogin,
+							installationStatus: "connected",
+							installationId: body.installation?.id ?? null,
+							githubLogin: accountLogin,
+							kind,
+							label: "Resyncing GitHub data after unsuspend...",
						},
+						{ wait: false },
+					)
+					.catch(() => {});
 			}
 			return { ok: true };
 		}
@@ -1440,14 +1047,20 @@ export const organizationAppActions = {
 				},
 				"repository_membership_changed",
 			);
-			await githubData.fullSync({
-				connectedAccount: accountLogin,
-				installationStatus: "connected",
-				installationId: body.installation?.id ?? null,
-				githubLogin: accountLogin,
-				kind,
-				label: "Resyncing GitHub data after repository access change...",
-			});
+			void githubData
+				.send(
+					githubDataWorkflowQueueName("githubData.command.syncRepos"),
+					{
+						connectedAccount: accountLogin,
+						installationStatus: "connected",
+						installationId: body.installation?.id ?? null,
+						githubLogin: accountLogin,
+						kind,
+						label: "Resyncing GitHub data after repository access change...",
+					},
+					{ wait: false },
+				)
+				.catch(() => {});
 			return { ok: true };
 		}
 
@@ -1475,35 +1088,33 @@ export const organizationAppActions = {
 				"repository_event",
 			);
 			if (event === "pull_request" && body.repository?.clone_url && body.pull_request) {
-				await githubData.handlePullRequestWebhook({
-					connectedAccount: accountLogin,
-					installationStatus: "connected",
-					installationId: body.installation?.id ?? null,
-					repository: {
-						fullName: body.repository.full_name,
-						cloneUrl: body.repository.clone_url,
-						private: Boolean(body.repository.private),
+				await githubData.send(
+					githubDataWorkflowQueueName("githubData.command.handlePullRequestWebhook"),
+					{
+						connectedAccount: accountLogin,
+						installationStatus: "connected",
+						installationId: body.installation?.id ?? null,
+						repository: {
+							fullName: body.repository.full_name,
+							cloneUrl: body.repository.clone_url,
+							private: Boolean(body.repository.private),
+						},
+						pullRequest: {
+							number: body.pull_request.number,
+							status: body.pull_request.draft ? "draft" : "ready",
+							title: body.pull_request.title ?? "",
+							body: body.pull_request.body ?? null,
+							state: body.pull_request.state ?? "open",
+							url: body.pull_request.html_url ?? `https://github.com/${body.repository.full_name}/pull/${body.pull_request.number}`,
+							headRefName: body.pull_request.head?.ref ?? "",
+							baseRefName: body.pull_request.base?.ref ?? "",
+							authorLogin: body.pull_request.user?.login ?? null,
+							isDraft: Boolean(body.pull_request.draft),
+							merged: Boolean(body.pull_request.merged),
+						},
 					},
-					pullRequest: {
-						number: body.pull_request.number,
-						title: body.pull_request.title ?? "",
-						body: body.pull_request.body ?? null,
-						state: body.pull_request.state ?? "open",
-						url: body.pull_request.html_url ?? `https://github.com/${body.repository.full_name}/pull/${body.pull_request.number}`,
-						headRefName: body.pull_request.head?.ref ?? "",
-						baseRefName: body.pull_request.base?.ref ?? "",
-						authorLogin: body.pull_request.user?.login ?? null,
-						isDraft: Boolean(body.pull_request.draft),
-						merged: Boolean(body.pull_request.merged),
-					},
-				});
-			}
-			if ((event === "push" || event === "create" || event === "delete") && body.repository?.clone_url) {
-				const repoId = repoIdFromRemote(body.repository.clone_url);
-				const knownRepository = await githubData.getRepository({ repoId });
-				if (knownRepository) {
-					await githubData.reloadRepository({ repoId });
-				}
+					{ wait: false },
+				);
 			}
 		}
 		return { ok: true };
@@ -1520,422 +1131,325 @@ export const organizationAppActions = {
 		);
 		return { ok: true };
 	},
+};
 
-	async syncOrganizationShellFromGithub(
-		c: any,
-		input: {
-			userId: string;
-			userName: string;
-			userEmail: string;
-			githubUserLogin: string;
-			githubAccountId: string;
-			githubLogin: string;
-			githubAccountType: string;
-			kind: FoundryOrganization["kind"];
-			displayName: string;
-			installationId: number | null;
-			appConfigured: boolean;
-		},
-	): Promise<{ organizationId: string }> {
-		assertOrganizationShell(c);
-		const now = Date.now();
-		const existing = await readOrganizationProfileRow(c);
-		const slug = existing?.slug ?? slugify(input.githubLogin);
-		const organizationId = organizationOrganizationId(input.kind, input.githubLogin);
-		if (organizationId !== c.state.organizationId) {
-			throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} github=${organizationId}`);
-		}
+export async function syncOrganizationShellFromGithubMutation(
+	c: any,
+	input: {
+		userId: string;
+		userName: string;
+		userEmail: string;
+		githubUserLogin: string;
+		githubAccountId: string;
+		githubLogin: string;
+		githubAccountType: string;
+		kind: FoundryOrganization["kind"];
+		displayName: string;
+		installationId: number | null;
+		appConfigured: boolean;
+	},
+): Promise<{ organizationId: string }> {
+	assertOrganizationShell(c);
+	const now = Date.now();
+	const existing = await readOrganizationProfileRow(c);
+	const slug = existing?.slug ?? slugify(input.githubLogin);
+	const organizationId = organizationOrganizationId(input.kind, input.githubLogin);
+	if (organizationId !== c.state.organizationId) {
+		throw new Error(`Organization actor mismatch: actor=${c.state.organizationId} github=${organizationId}`);
+	}
 
-		const installationStatus =
-			input.kind === "personal" ? "connected" : input.installationId ? "connected" : input.appConfigured ? "install_required" : "reconnect_required";
-		const syncStatus = existing?.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(existing?.repoImportStatus);
-		const lastSyncLabel =
-			syncStatus === "synced"
-				? existing.githubLastSyncLabel
-				: installationStatus === "connected"
-					? "Waiting for first import"
-					: installationStatus === "install_required"
-						? "GitHub App installation required"
-						: "GitHub App configuration incomplete";
-		const hasStripeBillingState = Boolean(existing?.stripeCustomerId || existing?.stripeSubscriptionId || existing?.stripePriceId);
-		const defaultBillingPlanId = input.kind === "personal" || !hasStripeBillingState ? "free" : (existing?.billingPlanId ?? "team");
-		const defaultSeatsIncluded = input.kind === "personal" || !hasStripeBillingState ? 1 : (existing?.billingSeatsIncluded ?? 5);
-		const defaultPaymentMethodLabel =
-			input.kind === "personal"
-				? "No card required"
-				: hasStripeBillingState
-					? (existing?.billingPaymentMethodLabel ?? "Payment method on file")
-					: "No payment method on file";
+	const installationStatus =
+		input.kind === "personal" ? "connected" : input.installationId ? "connected" : input.appConfigured ? "install_required" : "reconnect_required";
+	const syncStatus = existing?.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(existing?.repoImportStatus);
+	const lastSyncLabel =
+		syncStatus === "synced"
+			? existing.githubLastSyncLabel
+			: installationStatus === "connected"
+				? "Waiting for first import"
+				: installationStatus === "install_required"
+					? "GitHub App installation required"
+					: "GitHub App configuration incomplete";
+	const hasStripeBillingState = Boolean(existing?.stripeCustomerId || existing?.stripeSubscriptionId || existing?.stripePriceId);
+	const defaultBillingPlanId = input.kind === "personal" || !hasStripeBillingState ? "free" : (existing?.billingPlanId ?? "team");
+	const defaultSeatsIncluded = input.kind === "personal" || !hasStripeBillingState ? 1 : (existing?.billingSeatsIncluded ?? 5);
+	const defaultPaymentMethodLabel =
+		input.kind === "personal"
+			? "No card required"
+			: hasStripeBillingState
+				? (existing?.billingPaymentMethodLabel ?? "Payment method on file")
				: "No payment method on file";
 
-		await c.db
-			.insert(organizationProfile)
-			.values({
-				id: PROFILE_ROW_ID,
+	await c.db
+		.insert(organizationProfile)
+		.values({
+			id: PROFILE_ROW_ID,
+			kind: input.kind,
+			githubAccountId: input.githubAccountId,
+			githubLogin: input.githubLogin,
+			githubAccountType: input.githubAccountType,
+			displayName: input.displayName,
+			slug,
+			defaultModel: existing?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID,
+			primaryDomain: existing?.primaryDomain ?? (input.kind === "personal" ? "personal" : `${slug}.github`),
+			autoImportRepos: existing?.autoImportRepos ?? 1,
+			repoImportStatus: existing?.repoImportStatus ?? "not_started",
+			githubConnectedAccount: input.githubLogin,
+			githubInstallationStatus: installationStatus,
+			githubSyncStatus: syncStatus,
+			githubInstallationId: input.installationId,
+			githubLastSyncLabel: lastSyncLabel,
+			githubLastSyncAt: existing?.githubLastSyncAt ?? null,
+			githubSyncGeneration: existing?.githubSyncGeneration ?? 0,
+			githubSyncPhase: existing?.githubSyncPhase ?? null,
+			githubProcessedRepositoryCount: existing?.githubProcessedRepositoryCount ?? 0,
+			githubTotalRepositoryCount: existing?.githubTotalRepositoryCount ?? 0,
+			stripeCustomerId: existing?.stripeCustomerId ?? null,
+			stripeSubscriptionId: existing?.stripeSubscriptionId ?? null,
+			stripePriceId: existing?.stripePriceId ?? null,
+			billingPlanId: defaultBillingPlanId,
+			billingStatus: existing?.billingStatus ?? "active",
+			billingSeatsIncluded: defaultSeatsIncluded,
+			billingTrialEndsAt: existing?.billingTrialEndsAt ?? null,
+			billingRenewalAt: existing?.billingRenewalAt ?? null,
+			billingPaymentMethodLabel: defaultPaymentMethodLabel,
+			createdAt: existing?.createdAt ?? now,
+			updatedAt: now,
+		})
+		.onConflictDoUpdate({
+			target: organizationProfile.id,
+			set: {
 				kind: input.kind,
 				githubAccountId: input.githubAccountId,
 				githubLogin: input.githubLogin,
 				githubAccountType: input.githubAccountType,
 				displayName: input.displayName,
-				slug,
-				primaryDomain: existing?.primaryDomain ?? (input.kind === "personal" ? "personal" : `${slug}.github`),
-				defaultModel: existing?.defaultModel ?? "claude-sonnet-4",
-				autoImportRepos: existing?.autoImportRepos ?? 1,
-				repoImportStatus: existing?.repoImportStatus ?? "not_started",
 				githubConnectedAccount: input.githubLogin,
 				githubInstallationStatus: installationStatus,
 				githubSyncStatus: syncStatus,
 				githubInstallationId: input.installationId,
 				githubLastSyncLabel: lastSyncLabel,
 				githubLastSyncAt: existing?.githubLastSyncAt ?? null,
-				stripeCustomerId: existing?.stripeCustomerId ?? null,
-				stripeSubscriptionId: existing?.stripeSubscriptionId ?? null,
-				stripePriceId: existing?.stripePriceId ?? null,
+				githubSyncGeneration: existing?.githubSyncGeneration ?? 0,
+				githubSyncPhase: existing?.githubSyncPhase ?? null,
+				githubProcessedRepositoryCount: existing?.githubProcessedRepositoryCount ?? 0,
+				githubTotalRepositoryCount: existing?.githubTotalRepositoryCount ?? 0,
 				billingPlanId: defaultBillingPlanId,
-				billingStatus: existing?.billingStatus ?? "active",
 				billingSeatsIncluded: defaultSeatsIncluded,
-				billingTrialEndsAt: existing?.billingTrialEndsAt ?? null,
-				billingRenewalAt: existing?.billingRenewalAt ?? null,
 				billingPaymentMethodLabel: defaultPaymentMethodLabel,
-				createdAt: existing?.createdAt ?? now,
 				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: organizationProfile.id,
-				set: {
-					kind: input.kind,
-					githubAccountId: input.githubAccountId,
-					githubLogin: input.githubLogin,
-					githubAccountType: input.githubAccountType,
-					displayName: input.displayName,
-					githubConnectedAccount: input.githubLogin,
-					githubInstallationStatus: installationStatus,
-					githubSyncStatus: syncStatus,
-					githubInstallationId: input.installationId,
-					githubLastSyncLabel: lastSyncLabel,
-					githubLastSyncAt: existing?.githubLastSyncAt ?? null,
-					billingPlanId: defaultBillingPlanId,
-					billingSeatsIncluded: defaultSeatsIncluded,
-					billingPaymentMethodLabel: defaultPaymentMethodLabel,
-					updatedAt: now,
-				},
-			})
-			.run();
+			},
+		})
+		.run();
 
-		await c.db
-			.insert(organizationMembers)
-			.values({
-				id: input.userId,
+	await c.db
+		.insert(organizationMembers)
+		.values({
+			id: input.userId,
+			name: input.userName,
+			email: input.userEmail,
+			role: input.kind === "personal" ? "owner" : "admin",
+			state: "active",
+			updatedAt: now,
+		})
+		.onConflictDoUpdate({
+			target: organizationMembers.id,
+			set: {
 				name: input.userName,
 				email: input.userEmail,
 				role: input.kind === "personal" ? "owner" : "admin",
 				state: "active",
 				updatedAt: now,
-			})
-			.onConflictDoUpdate({
-				target: organizationMembers.id,
-				set: {
-					name: input.userName,
-					email: input.userEmail,
-					role: input.kind === "personal" ? "owner" : "admin",
-					state: "active",
-					updatedAt: now,
+			},
+		})
+		.run();
+
+	// Auto-trigger github-data sync when the org has a connected installation
+	// but hasn't synced yet. This handles the common case where a personal
+	// account or an org with an existing GitHub App installation signs in for
+	// the first time on a fresh DB — the installation webhook already fired
+	// before the org actor existed, so we kick off the sync here instead.
+	const needsInitialSync = installationStatus === "connected" && syncStatus === "pending";
+	if (needsInitialSync) {
+		const githubData = await getOrCreateGithubData(c, organizationId);
+		void githubData
+			.send(
+				githubDataWorkflowQueueName("githubData.command.syncRepos"),
+				{
+					connectedAccount: input.githubLogin,
+					installationStatus: "connected",
+					installationId: input.installationId,
+					githubLogin: input.githubLogin,
+					kind: input.kind,
+					label: "Initial repository sync...",
 				},
-			})
-			.run();
+				{ wait: false },
+			)
+			.catch(() => {});
+	}
 
-		return { organizationId };
-	},
+	return { organizationId };
+}
 
-	async getOrganizationShellState(c: any): Promise {
-		assertOrganizationShell(c);
-		return await buildOrganizationState(c);
-	},
-
-	async getOrganizationShellStateIfInitialized(c: any): Promise {
-		assertOrganizationShell(c);
-		return await buildOrganizationStateIfInitialized(c);
-	},
-
-	async updateOrganizationShellProfile(c: any, input: Pick): Promise {
-		assertOrganizationShell(c);
-		const existing = await requireOrganizationProfileRow(c);
-		await c.db
-			.update(organizationProfile)
-			.set({
-				displayName: input.displayName.trim() || existing.displayName,
-				slug: input.slug.trim() || existing.slug,
-				primaryDomain: input.primaryDomain.trim() || existing.primaryDomain,
-				updatedAt: Date.now(),
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async markOrganizationSyncStarted(c: any, input: { label: string }): Promise {
-		assertOrganizationShell(c);
-		await c.db
-			.update(organizationProfile)
-			.set({
-				githubSyncStatus: "syncing",
-				githubLastSyncLabel: input.label,
-				updatedAt: Date.now(),
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async applyOrganizationSyncCompleted(
-		c: any,
-		input: {
-			repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>;
-			installationStatus: FoundryOrganization["github"]["installationStatus"];
-			lastSyncLabel: string;
-		},
-	): Promise {
-		assertOrganizationShell(c);
-		const now = Date.now();
-		for (const repository of input.repositories) {
-			const remoteUrl = repository.cloneUrl;
-			await c.db
-				.insert(repos)
-				.values({
-					repoId: repoIdFromRemote(remoteUrl),
-					remoteUrl,
-					createdAt: now,
-					updatedAt: now,
-				})
-				.onConflictDoUpdate({
-					target: repos.repoId,
-					set: {
-						remoteUrl,
-						updatedAt: now,
-					},
-				})
-				.run();
-		}
-		await c.db
-			.update(organizationProfile)
-			.set({
-				githubInstallationStatus: input.installationStatus,
-				githubSyncStatus: "synced",
-				githubLastSyncLabel: input.lastSyncLabel,
-				githubLastSyncAt: now,
-				updatedAt: now,
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async markOrganizationSyncFailed(c: any, input: { message: string; installationStatus: FoundryOrganization["github"]["installationStatus"] }): Promise {
-		assertOrganizationShell(c);
-		await c.db
-			.update(organizationProfile)
-			.set({
-				githubInstallationStatus: input.installationStatus,
-				githubSyncStatus: "error",
-				githubLastSyncLabel: input.message,
-				updatedAt: Date.now(),
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async applyOrganizationStripeCustomer(c: any, input: { customerId: string }): Promise {
-		assertOrganizationShell(c);
-		await c.db
-			.update(organizationProfile)
-			.set({
-				stripeCustomerId: input.customerId,
-				updatedAt: Date.now(),
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async applyOrganizationStripeSubscription(
-		c: any,
-		input: {
-			subscription: {
-				id: string;
-				customerId: string;
-				priceId: string | null;
-				status: string;
-				cancelAtPeriodEnd: boolean;
-				currentPeriodEnd: number | null;
-				trialEnd: number | null;
-				defaultPaymentMethodLabel: string;
-			};
-			fallbackPlanId: FoundryBillingPlanId;
-		},
-	): Promise {
-		assertOrganizationShell(c);
-		const { appShell } = getActorRuntimeContext();
-		const planId = appShell.stripe.planIdForPriceId(input.subscription.priceId ?? "") ?? input.fallbackPlanId;
-		await c.db
-			.update(organizationProfile)
-			.set({
-				stripeCustomerId: input.subscription.customerId || null,
-				stripeSubscriptionId: input.subscription.id || null,
-				stripePriceId: input.subscription.priceId,
-				billingPlanId: planId,
-				billingStatus: stripeStatusToBillingStatus(input.subscription.status, input.subscription.cancelAtPeriodEnd),
-				billingSeatsIncluded: seatsIncludedForPlan(planId),
-				billingTrialEndsAt: input.subscription.trialEnd ? new Date(input.subscription.trialEnd * 1000).toISOString() : null,
-				billingRenewalAt: input.subscription.currentPeriodEnd ? new Date(input.subscription.currentPeriodEnd * 1000).toISOString() : null,
-				billingPaymentMethodLabel: input.subscription.defaultPaymentMethodLabel || "Payment method on file",
-				updatedAt: Date.now(),
-			})
-			.where(eq(organizationProfile.id, PROFILE_ROW_ID))
-			.run();
-	},
-
-	async applyOrganizationFreePlan(c: any, input: { clearSubscription: boolean }): Promise {
-		assertOrganizationShell(c);
-		const patch: Record = {
-			billingPlanId: "free",
-			billingStatus: "active",
-			billingSeatsIncluded: 1,
-			billingTrialEndsAt: null,
-			billingRenewalAt: null,
-			billingPaymentMethodLabel: "No card required",
+export async function updateOrganizationShellProfileMutation(
+	c: any,
+	input: Pick,
+): Promise {
+	assertOrganizationShell(c);
+	const existing = await requireOrganizationProfileRow(c);
+	await c.db
+		.update(organizationProfile)
+		.set({
+			displayName: input.displayName.trim() || existing.displayName,
+			slug: input.slug.trim() || existing.slug,
+			primaryDomain: input.primaryDomain.trim() || existing.primaryDomain,
 			updatedAt: Date.now(),
+		})
+		.where(eq(organizationProfile.id, PROFILE_ROW_ID))
+		.run();
+}
+
+export async function markOrganizationSyncStartedMutation(c: any, input: { label: string }): Promise {
+	assertOrganizationShell(c);
+	await c.db
+		.update(organizationProfile)
+		.set({
+			githubSyncStatus: "syncing",
+			githubLastSyncLabel: input.label,
+
githubSyncPhase: "discovering_repositories", + githubProcessedRepositoryCount: 0, + githubTotalRepositoryCount: 0, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); +} + +export async function applyOrganizationStripeCustomerMutation(c: any, input: { customerId: string }): Promise { + assertOrganizationShell(c); + await c.db + .update(organizationProfile) + .set({ + stripeCustomerId: input.customerId, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); +} + +export async function applyOrganizationStripeSubscriptionMutation( + c: any, + input: { + subscription: { + id: string; + customerId: string; + priceId: string | null; + status: string; + cancelAtPeriodEnd: boolean; + currentPeriodEnd: number | null; + trialEnd: number | null; + defaultPaymentMethodLabel: string; }; - if (input.clearSubscription) { - patch.stripeSubscriptionId = null; - patch.stripePriceId = null; - } - await c.db.update(organizationProfile).set(patch).where(eq(organizationProfile.id, PROFILE_ROW_ID)).run(); + fallbackPlanId: FoundryBillingPlanId; }, +): Promise { + assertOrganizationShell(c); + const { appShell } = getActorRuntimeContext(); + const planId = appShell.stripe.planIdForPriceId(input.subscription.priceId ?? "") ?? input.fallbackPlanId; + await c.db + .update(organizationProfile) + .set({ + stripeCustomerId: input.subscription.customerId || null, + stripeSubscriptionId: input.subscription.id || null, + stripePriceId: input.subscription.priceId, + billingPlanId: planId, + billingStatus: stripeStatusToBillingStatus(input.subscription.status, input.subscription.cancelAtPeriodEnd), + billingSeatsIncluded: seatsIncludedForPlan(planId), + billingTrialEndsAt: input.subscription.trialEnd ? new Date(input.subscription.trialEnd * 1000).toISOString() : null, + billingRenewalAt: input.subscription.currentPeriodEnd ? 
new Date(input.subscription.currentPeriodEnd * 1000).toISOString() : null, + billingPaymentMethodLabel: input.subscription.defaultPaymentMethodLabel || "Payment method on file", + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); +} - async setOrganizationBillingPaymentMethod(c: any, input: { label: string }): Promise { - assertOrganizationShell(c); - await c.db - .update(organizationProfile) - .set({ - billingPaymentMethodLabel: input.label, - updatedAt: Date.now(), - }) - .where(eq(organizationProfile.id, PROFILE_ROW_ID)) - .run(); - }, +export async function applyOrganizationFreePlanMutation(c: any, input: { clearSubscription: boolean }): Promise { + assertOrganizationShell(c); + const patch: Record = { + billingPlanId: "free", + billingStatus: "active", + billingSeatsIncluded: 1, + billingTrialEndsAt: null, + billingRenewalAt: null, + billingPaymentMethodLabel: "No card required", + updatedAt: Date.now(), + }; + if (input.clearSubscription) { + patch.stripeSubscriptionId = null; + patch.stripePriceId = null; + } + await c.db.update(organizationProfile).set(patch).where(eq(organizationProfile.id, PROFILE_ROW_ID)).run(); +} - async setOrganizationBillingStatus(c: any, input: { status: FoundryBillingState["status"] }): Promise { - assertOrganizationShell(c); - await c.db - .update(organizationProfile) - .set({ - billingStatus: input.status, - updatedAt: Date.now(), - }) - .where(eq(organizationProfile.id, PROFILE_ROW_ID)) - .run(); - }, +export async function setOrganizationBillingPaymentMethodMutation(c: any, input: { label: string }): Promise { + assertOrganizationShell(c); + await c.db + .update(organizationProfile) + .set({ + billingPaymentMethodLabel: input.label, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); +} - async upsertOrganizationInvoice(c: any, input: { id: string; label: string; issuedAt: string; amountUsd: number; status: "paid" | "open" }): Promise { - 
assertOrganizationShell(c); - await c.db - .insert(invoices) - .values({ - id: input.id, +export async function setOrganizationBillingStatusMutation(c: any, input: { status: FoundryBillingState["status"] }): Promise { + assertOrganizationShell(c); + await c.db + .update(organizationProfile) + .set({ + billingStatus: input.status, + updatedAt: Date.now(), + }) + .where(eq(organizationProfile.id, PROFILE_ROW_ID)) + .run(); +} + +export async function upsertOrganizationInvoiceMutation( + c: any, + input: { id: string; label: string; issuedAt: string; amountUsd: number; status: "paid" | "open" }, +): Promise { + assertOrganizationShell(c); + await c.db + .insert(invoices) + .values({ + id: input.id, + label: input.label, + issuedAt: input.issuedAt, + amountUsd: input.amountUsd, + status: input.status, + createdAt: Date.now(), + }) + .onConflictDoUpdate({ + target: invoices.id, + set: { label: input.label, issuedAt: input.issuedAt, amountUsd: input.amountUsd, status: input.status, - createdAt: Date.now(), - }) - .onConflictDoUpdate({ - target: invoices.id, - set: { - label: input.label, - issuedAt: input.issuedAt, - amountUsd: input.amountUsd, - status: input.status, - }, - }) - .run(); - }, + }, + }) + .run(); +} - async recordOrganizationSeatUsage(c: any, input: { email: string }): Promise { - assertOrganizationShell(c); - await c.db - .insert(seatAssignments) - .values({ - email: input.email, - createdAt: Date.now(), - }) - .onConflictDoNothing() - .run(); - }, - - async applyGithubInstallationCreated(c: any, input: { installationId: number }): Promise { - assertOrganizationShell(c); - await c.db - .update(organizationProfile) - .set({ - githubInstallationId: input.installationId, - githubInstallationStatus: "connected", - updatedAt: Date.now(), - }) - .where(eq(organizationProfile.id, PROFILE_ROW_ID)) - .run(); - }, - - async applyGithubInstallationRemoved(c: any, _input: {}): Promise { - assertOrganizationShell(c); - await c.db - .update(organizationProfile) - 
.set({ - githubInstallationId: null, - githubInstallationStatus: "install_required", - githubSyncStatus: "pending", - githubLastSyncLabel: "GitHub App installation removed", - updatedAt: Date.now(), - }) - .where(eq(organizationProfile.id, PROFILE_ROW_ID)) - .run(); - }, - - async applyGithubRepositoryChanges(c: any, input: { added: Array<{ fullName: string; private: boolean }>; removed: string[] }): Promise { - assertOrganizationShell(c); - const now = Date.now(); - - for (const repo of input.added) { - const remoteUrl = `https://github.com/${repo.fullName}.git`; - const repoId = repoIdFromRemote(remoteUrl); - await c.db - .insert(repos) - .values({ - repoId, - remoteUrl, - createdAt: now, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: repos.repoId, - set: { - remoteUrl, - updatedAt: now, - }, - }) - .run(); - } - - for (const fullName of input.removed) { - const remoteUrl = `https://github.com/${fullName}.git`; - const repoId = repoIdFromRemote(remoteUrl); - await c.db.delete(repos).where(eq(repos.repoId, repoId)).run(); - } - - const repoCount = (await c.db.select().from(repos).all()).length; - await c.db - .update(organizationProfile) - .set({ - githubSyncStatus: "synced", - githubLastSyncLabel: `${repoCount} repositories synced`, - githubLastSyncAt: now, - updatedAt: now, - }) - .where(eq(organizationProfile.id, PROFILE_ROW_ID)) - .run(); - }, -}; +export async function recordOrganizationSeatUsageMutation(c: any, input: { email: string }): Promise { + assertOrganizationShell(c); + await c.db + .insert(seatAssignments) + .values({ + email: input.email, + createdAt: Date.now(), + }) + .onConflictDoNothing() + .run(); +} diff --git a/foundry/packages/backend/src/actors/organization/constants.ts b/foundry/packages/backend/src/actors/organization/constants.ts new file mode 100644 index 0000000..0b8e3c0 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/constants.ts @@ -0,0 +1 @@ +export const APP_SHELL_ORGANIZATION_ID = "app"; diff --git 
a/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql b/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql index 09b77f9..80be04f 100644 --- a/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql +++ b/foundry/packages/backend/src/actors/organization/db/drizzle/0000_melted_viper.sql @@ -56,6 +56,10 @@ CREATE TABLE `organization_profile` ( `github_last_sync_at` integer, `github_last_webhook_at` integer, `github_last_webhook_event` text, + `github_sync_generation` integer NOT NULL, + `github_sync_phase` text, + `github_processed_repository_count` integer NOT NULL, + `github_total_repository_count` integer NOT NULL, `stripe_customer_id` text, `stripe_subscription_id` text, `stripe_price_id` text, @@ -86,8 +90,3 @@ CREATE TABLE `stripe_lookup` ( `organization_id` text NOT NULL, `updated_at` integer NOT NULL ); ---> statement-breakpoint -CREATE TABLE `task_lookup` ( - `task_id` text PRIMARY KEY NOT NULL, - `repo_id` text NOT NULL -); diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql b/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql new file mode 100644 index 0000000..fcd1b60 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/db/drizzle/0001_add_auth_and_task_tables.sql @@ -0,0 +1,50 @@ +CREATE TABLE IF NOT EXISTS `auth_session_index` ( + `session_id` text PRIMARY KEY NOT NULL, + `session_token` text NOT NULL, + `user_id` text NOT NULL, + `created_at` integer NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS `auth_email_index` ( + `email` text PRIMARY KEY NOT NULL, + `user_id` text NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS `auth_account_index` ( + `id` text PRIMARY KEY NOT NULL, + `provider_id` text NOT NULL, + `account_id` text NOT NULL, + `user_id` text 
NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS `auth_verification` ( + `id` text PRIMARY KEY NOT NULL, + `identifier` text NOT NULL, + `value` text NOT NULL, + `expires_at` integer NOT NULL, + `created_at` integer NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS `task_index` ( + `task_id` text PRIMARY KEY NOT NULL, + `repo_id` text NOT NULL, + `branch_name` text, + `created_at` integer NOT NULL, + `updated_at` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS `task_summaries` ( + `task_id` text PRIMARY KEY NOT NULL, + `repo_id` text NOT NULL, + `title` text NOT NULL, + `status` text NOT NULL, + `repo_name` text NOT NULL, + `updated_at_ms` integer NOT NULL, + `branch` text, + `pull_request_json` text, + `sessions_summary_json` text DEFAULT '[]' NOT NULL +); diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json index cdcc44c..a29c546 100644 --- a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json +++ b/foundry/packages/backend/src/actors/organization/db/drizzle/meta/0000_snapshot.json @@ -373,6 +373,34 @@ "notNull": false, "autoincrement": false }, + "github_sync_generation": { + "name": "github_sync_generation", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "github_sync_phase": { + "name": "github_sync_phase", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "github_processed_repository_count": { + "name": "github_processed_repository_count", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "github_total_repository_count": { + "name": "github_total_repository_count", + "type": "integer", + "primaryKey": false, + "notNull": true, + 
"autoincrement": false + }, "stripe_customer_id": { "name": "stripe_customer_id", "type": "text", @@ -549,30 +577,6 @@ "compositePrimaryKeys": {}, "uniqueConstraints": {}, "checkConstraints": {} - }, - "task_lookup": { - "name": "task_lookup", - "columns": { - "task_id": { - "name": "task_id", - "type": "text", - "primaryKey": true, - "notNull": true, - "autoincrement": false - }, - "repo_id": { - "name": "repo_id", - "type": "text", - "primaryKey": false, - "notNull": true, - "autoincrement": false - } - }, - "indexes": {}, - "foreignKeys": {}, - "compositePrimaryKeys": {}, - "uniqueConstraints": {}, - "checkConstraints": {} } }, "views": {}, diff --git a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json index e3668a1..41ea23b 100644 --- a/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json +++ b/foundry/packages/backend/src/actors/organization/db/drizzle/meta/_journal.json @@ -8,6 +8,13 @@ "when": 1773376221152, "tag": "0000_melted_viper", "breakpoints": true + }, + { + "idx": 1, + "version": "6", + "when": 1773840000000, + "tag": "0001_add_auth_and_task_tables", + "breakpoints": true } ] } diff --git a/foundry/packages/backend/src/actors/organization/db/migrations.ts b/foundry/packages/backend/src/actors/organization/db/migrations.ts index b3e09f1..2e8570b 100644 --- a/foundry/packages/backend/src/actors/organization/db/migrations.ts +++ b/foundry/packages/backend/src/actors/organization/db/migrations.ts @@ -12,20 +12,14 @@ const journal = { }, { idx: 1, - when: 1773638400000, - tag: "0001_auth_index_tables", + when: 1773840000000, + tag: "0001_add_auth_and_task_tables", breakpoints: true, }, { idx: 2, - when: 1773720000000, - tag: "0002_task_summaries", - breakpoints: true, - }, - { - idx: 3, - when: 1773810001000, - tag: "0003_drop_provider_profiles", + when: 1773984000000, + tag: "0002_add_task_owner_columns", breakpoints: 
true, }, ], @@ -92,6 +86,10 @@ CREATE TABLE \`organization_profile\` ( \`github_last_sync_at\` integer, \`github_last_webhook_at\` integer, \`github_last_webhook_event\` text, + \`github_sync_generation\` integer NOT NULL, + \`github_sync_phase\` text, + \`github_processed_repository_count\` integer NOT NULL, + \`github_total_repository_count\` integer NOT NULL, \`stripe_customer_id\` text, \`stripe_subscription_id\` text, \`stripe_price_id\` text, @@ -122,11 +120,6 @@ CREATE TABLE \`stripe_lookup\` ( \`organization_id\` text NOT NULL, \`updated_at\` integer NOT NULL ); ---> statement-breakpoint -CREATE TABLE \`task_lookup\` ( - \`task_id\` text PRIMARY KEY NOT NULL, - \`repo_id\` text NOT NULL -); `, m0001: `CREATE TABLE IF NOT EXISTS \`auth_session_index\` ( \`session_id\` text PRIMARY KEY NOT NULL, @@ -158,8 +151,16 @@ CREATE TABLE IF NOT EXISTS \`auth_verification\` ( \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); -`, - m0002: `CREATE TABLE IF NOT EXISTS \`task_summaries\` ( +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS \`task_index\` ( + \`task_id\` text PRIMARY KEY NOT NULL, + \`repo_id\` text NOT NULL, + \`branch_name\` text, + \`created_at\` integer NOT NULL, + \`updated_at\` integer NOT NULL +); +--> statement-breakpoint +CREATE TABLE IF NOT EXISTS \`task_summaries\` ( \`task_id\` text PRIMARY KEY NOT NULL, \`repo_id\` text NOT NULL, \`title\` text NOT NULL, @@ -171,7 +172,9 @@ CREATE TABLE IF NOT EXISTS \`auth_verification\` ( \`sessions_summary_json\` text DEFAULT '[]' NOT NULL ); `, - m0003: `DROP TABLE IF EXISTS \`provider_profiles\`; + m0002: `ALTER TABLE \`task_summaries\` ADD COLUMN \`primary_user_login\` text; +--> statement-breakpoint +ALTER TABLE \`task_summaries\` ADD COLUMN \`primary_user_avatar_url\` text; `, } as const, }; diff --git a/foundry/packages/backend/src/actors/organization/db/schema.ts b/foundry/packages/backend/src/actors/organization/db/schema.ts index dd4fa40..3978a5f 100644 --- 
a/foundry/packages/backend/src/actors/organization/db/schema.ts +++ b/foundry/packages/backend/src/actors/organization/db/schema.ts @@ -1,34 +1,34 @@ -import { integer, sqliteTable, text } from "rivetkit/db/drizzle"; +import { check, integer, sqliteTable, text } from "rivetkit/db/drizzle"; +import { sql } from "drizzle-orm"; +import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared"; // SQLite is per organization actor instance, so no organizationId column needed. /** - * Coordinator index of RepositoryActor instances. - * The organization actor is the coordinator for repositories. - * Rows are created/removed when repos are added/removed from the organization. + * Coordinator index of TaskActor instances. + * The organization actor is the direct coordinator for tasks (not a per-repo + * actor) because the sidebar needs to query all tasks across all repos on + * every snapshot. With many repos, fanning out to N repo actors on the hot + * read path is too expensive — owning the index here keeps that a single + * local table scan. Each row maps a taskId to its repo and immutable branch + * name. Used for branch conflict checking (scoped by repoId) and + * task-by-branch lookups. */ -export const repos = sqliteTable("repos", { - repoId: text("repo_id").notNull().primaryKey(), - remoteUrl: text("remote_url").notNull(), +export const taskIndex = sqliteTable("task_index", { + taskId: text("task_id").notNull().primaryKey(), + repoId: text("repo_id").notNull(), + branchName: text("branch_name"), createdAt: integer("created_at").notNull(), updatedAt: integer("updated_at").notNull(), }); /** - * Coordinator index of TaskActor instances. - * Fast taskId → repoId lookup so the organization can route requests - * to the correct RepositoryActor without scanning all repos. 
- */ -export const taskLookup = sqliteTable("task_lookup", { - taskId: text("task_id").notNull().primaryKey(), - repoId: text("repo_id").notNull(), -}); - -/** - * Coordinator index of TaskActor instances — materialized sidebar projection. - * Task actors push summary updates to the organization actor via - * applyTaskSummaryUpdate(). Source of truth lives on each TaskActor; - * this table exists so organization reads stay local without fan-out. + * Organization-owned materialized task summary projection. + * Task actors push summary updates directly to the organization coordinator, + * which keeps this table local for fast list/lookups without fan-out. + * Same rationale as taskIndex: the sidebar repeatedly reads all tasks across + * all repos, so the org must own the materialized view to avoid O(repos) + * actor fan-out on the hot read path. */ export const taskSummaries = sqliteTable("task_summaries", { taskId: text("task_id").notNull().primaryKey(), @@ -40,40 +40,50 @@ export const taskSummaries = sqliteTable("task_summaries", { branch: text("branch"), pullRequestJson: text("pull_request_json"), sessionsSummaryJson: text("sessions_summary_json").notNull().default("[]"), + primaryUserLogin: text("primary_user_login"), + primaryUserAvatarUrl: text("primary_user_avatar_url"), }); -export const organizationProfile = sqliteTable("organization_profile", { - id: text("id").notNull().primaryKey(), - kind: text("kind").notNull(), - githubAccountId: text("github_account_id").notNull(), - githubLogin: text("github_login").notNull(), - githubAccountType: text("github_account_type").notNull(), - displayName: text("display_name").notNull(), - slug: text("slug").notNull(), - primaryDomain: text("primary_domain").notNull(), - defaultModel: text("default_model").notNull(), - autoImportRepos: integer("auto_import_repos").notNull(), - repoImportStatus: text("repo_import_status").notNull(), - githubConnectedAccount: text("github_connected_account").notNull(), - 
githubInstallationStatus: text("github_installation_status").notNull(), - githubSyncStatus: text("github_sync_status").notNull(), - githubInstallationId: integer("github_installation_id"), - githubLastSyncLabel: text("github_last_sync_label").notNull(), - githubLastSyncAt: integer("github_last_sync_at"), - githubLastWebhookAt: integer("github_last_webhook_at"), - githubLastWebhookEvent: text("github_last_webhook_event"), - stripeCustomerId: text("stripe_customer_id"), - stripeSubscriptionId: text("stripe_subscription_id"), - stripePriceId: text("stripe_price_id"), - billingPlanId: text("billing_plan_id").notNull(), - billingStatus: text("billing_status").notNull(), - billingSeatsIncluded: integer("billing_seats_included").notNull(), - billingTrialEndsAt: text("billing_trial_ends_at"), - billingRenewalAt: text("billing_renewal_at"), - billingPaymentMethodLabel: text("billing_payment_method_label").notNull(), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), -}); +export const organizationProfile = sqliteTable( + "organization_profile", + { + id: integer("id").primaryKey(), + kind: text("kind").notNull(), + githubAccountId: text("github_account_id").notNull(), + githubLogin: text("github_login").notNull(), + githubAccountType: text("github_account_type").notNull(), + displayName: text("display_name").notNull(), + slug: text("slug").notNull(), + defaultModel: text("default_model").notNull().default(DEFAULT_WORKSPACE_MODEL_ID), + primaryDomain: text("primary_domain").notNull(), + autoImportRepos: integer("auto_import_repos").notNull(), + repoImportStatus: text("repo_import_status").notNull(), + githubConnectedAccount: text("github_connected_account").notNull(), + githubInstallationStatus: text("github_installation_status").notNull(), + githubSyncStatus: text("github_sync_status").notNull(), + githubInstallationId: integer("github_installation_id"), + githubLastSyncLabel: text("github_last_sync_label").notNull(), + 
githubLastSyncAt: integer("github_last_sync_at"), + githubLastWebhookAt: integer("github_last_webhook_at"), + githubLastWebhookEvent: text("github_last_webhook_event"), + githubSyncGeneration: integer("github_sync_generation").notNull(), + githubSyncPhase: text("github_sync_phase"), + githubProcessedRepositoryCount: integer("github_processed_repository_count").notNull(), + githubTotalRepositoryCount: integer("github_total_repository_count").notNull(), + stripeCustomerId: text("stripe_customer_id"), + stripeSubscriptionId: text("stripe_subscription_id"), + stripePriceId: text("stripe_price_id"), + billingPlanId: text("billing_plan_id").notNull(), + billingStatus: text("billing_status").notNull(), + billingSeatsIncluded: integer("billing_seats_included").notNull(), + billingTrialEndsAt: text("billing_trial_ends_at"), + billingRenewalAt: text("billing_renewal_at"), + billingPaymentMethodLabel: text("billing_payment_method_label").notNull(), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + }, + (table) => [check("organization_profile_singleton_id_check", sql`${table.id} = 1`)], +); export const organizationMembers = sqliteTable("organization_members", { id: text("id").notNull().primaryKey(), @@ -133,6 +143,7 @@ export const authAccountIndex = sqliteTable("auth_account_index", { updatedAt: integer("updated_at").notNull(), }); +/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */ export const authVerification = sqliteTable("auth_verification", { id: text("id").notNull().primaryKey(), identifier: text("identifier").notNull(), diff --git a/foundry/packages/backend/src/actors/organization/index.ts b/foundry/packages/backend/src/actors/organization/index.ts index 1ea0196..9ceb27f 100644 --- a/foundry/packages/backend/src/actors/organization/index.ts +++ b/foundry/packages/backend/src/actors/organization/index.ts @@ -1,7 +1,9 @@ import { actor, queue } from "rivetkit"; import { workflow } 
from "rivetkit/workflow"; import { organizationDb } from "./db/db.js"; -import { runOrganizationWorkflow, ORGANIZATION_QUEUE_NAMES, organizationActions } from "./actions.js"; +import { organizationActions } from "./actions.js"; +import { runOrganizationWorkflow } from "./workflow.js"; +import { ORGANIZATION_QUEUE_NAMES } from "./queues.js"; export const organization = actor({ db: organizationDb, @@ -14,6 +16,8 @@ export const organization = actor({ createState: (_c, organizationId: string) => ({ organizationId, }), - actions: organizationActions, + actions: { + ...organizationActions, + }, run: workflow(runOrganizationWorkflow), }); diff --git a/foundry/packages/backend/src/actors/organization/queues.ts b/foundry/packages/backend/src/actors/organization/queues.ts new file mode 100644 index 0000000..2e67dc5 --- /dev/null +++ b/foundry/packages/backend/src/actors/organization/queues.ts @@ -0,0 +1,26 @@ +export const ORGANIZATION_QUEUE_NAMES = [ + "organization.command.createTask", + "organization.command.materializeTask", + "organization.command.applyTaskSummaryUpdate", + "organization.command.removeTaskSummary", + "organization.command.refreshTaskSummaryForBranch", + "organization.command.snapshot.broadcast", + "organization.command.syncGithubSession", + "organization.command.github.organization_shell.sync_from_github", + "organization.command.github.sync_progress.apply", + "organization.command.github.webhook_receipt.record", + "organization.command.shell.sync_started.mark", + "organization.command.billing.stripe_customer.apply", + "organization.command.billing.stripe_subscription.apply", + "organization.command.billing.free_plan.apply", + "organization.command.billing.payment_method.set", + "organization.command.billing.status.set", + "organization.command.billing.invoice.upsert", + "organization.command.billing.seat_usage.record", +] as const; + +export type OrganizationQueueName = (typeof ORGANIZATION_QUEUE_NAMES)[number]; + +export function 
organizationWorkflowQueueName(name: OrganizationQueueName): OrganizationQueueName {
+	return name;
+}
diff --git a/foundry/packages/backend/src/actors/organization/workflow.ts b/foundry/packages/backend/src/actors/organization/workflow.ts
new file mode 100644
index 0000000..e62e80d
--- /dev/null
+++ b/foundry/packages/backend/src/actors/organization/workflow.ts
@@ -0,0 +1,164 @@
+// @ts-nocheck
+/**
+ * Organization workflow — queue-based command loop.
+ *
+ * Mutations are dispatched through named queues and processed inside workflow
+ * steps so that every command appears in the RivetKit inspector's workflow
+ * history. Read actions remain direct (no queue).
+ *
+ * Callers send commands directly via `.send()` to the appropriate queue name.
+ */
+import { Loop } from "rivetkit/workflow";
+import { logActorWarning, resolveErrorMessage } from "../logging.js";
+import { ORGANIZATION_QUEUE_NAMES, type OrganizationQueueName } from "./queues.js";
+
+import { applyGithubSyncProgressMutation, recordGithubWebhookReceiptMutation, refreshOrganizationSnapshotMutation } from "./actions.js";
+import {
+	applyTaskSummaryUpdateMutation,
+	createTaskMutation,
+	refreshTaskSummaryForBranchMutation,
+	removeTaskSummaryMutation,
+} from "./actions/task-mutations.js";
+import {
+	applyOrganizationFreePlanMutation,
+	applyOrganizationStripeCustomerMutation,
+	applyOrganizationStripeSubscriptionMutation,
+	markOrganizationSyncStartedMutation,
+	recordOrganizationSeatUsageMutation,
+	setOrganizationBillingPaymentMethodMutation,
+	setOrganizationBillingStatusMutation,
+	syncOrganizationShellFromGithubMutation,
+	upsertOrganizationInvoiceMutation,
+} from "./app-shell.js";
+
+// ---------------------------------------------------------------------------
+// Workflow command loop — runs inside `run: workflow(runOrganizationWorkflow)`
+// ---------------------------------------------------------------------------
+
+type WorkflowHandler = (loopCtx: any, body: any) => Promise<any>;
+
+/**
+ * Maps queue names to their mutation handlers.
+ * Each handler receives the workflow loop context and the message body,
+ * executes the mutation, and returns the result (which is sent back via
+ * msg.complete).
+ */
+const COMMAND_HANDLERS: Record<OrganizationQueueName, WorkflowHandler> = {
+	// Task mutations
+	"organization.command.createTask": async (c, body) => createTaskMutation(c, body),
+	"organization.command.materializeTask": async (c, body) => createTaskMutation(c, body),
+	"organization.command.applyTaskSummaryUpdate": async (c, body) => {
+		await applyTaskSummaryUpdateMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.removeTaskSummary": async (c, body) => {
+		await removeTaskSummaryMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.refreshTaskSummaryForBranch": async (c, body) => {
+		await refreshTaskSummaryForBranchMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.snapshot.broadcast": async (c, _body) => {
+		await refreshOrganizationSnapshotMutation(c);
+		return { ok: true };
+	},
+	"organization.command.syncGithubSession": async (c, body) => {
+		const { syncGithubOrganizations } = await import("./app-shell.js");
+		await syncGithubOrganizations(c, body);
+		return { ok: true };
+	},
+
+	// GitHub organization shell sync (stays on queue)
+	"organization.command.github.organization_shell.sync_from_github": async (c, body) => syncOrganizationShellFromGithubMutation(c, body),
+
+	// GitHub sync progress + webhook receipt
+	"organization.command.github.sync_progress.apply": async (c, body) => {
+		await applyGithubSyncProgressMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.github.webhook_receipt.record": async (c, body) => {
+		await recordGithubWebhookReceiptMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.shell.sync_started.mark": async (c, body) => {
+		await markOrganizationSyncStartedMutation(c, body);
+		return { ok: true };
+	},
+
+	// Billing mutations
+	"organization.command.billing.stripe_customer.apply": async (c, body) => {
+		await applyOrganizationStripeCustomerMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.stripe_subscription.apply": async (c, body) => {
+		await applyOrganizationStripeSubscriptionMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.free_plan.apply": async (c, body) => {
+		await applyOrganizationFreePlanMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.payment_method.set": async (c, body) => {
+		await setOrganizationBillingPaymentMethodMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.status.set": async (c, body) => {
+		await setOrganizationBillingStatusMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.invoice.upsert": async (c, body) => {
+		await upsertOrganizationInvoiceMutation(c, body);
+		return { ok: true };
+	},
+	"organization.command.billing.seat_usage.record": async (c, body) => {
+		await recordOrganizationSeatUsageMutation(c, body);
+		return { ok: true };
+	},
+};
+
+export async function runOrganizationWorkflow(ctx: any): Promise<void> {
+	await ctx.loop("organization-command-loop", async (loopCtx: any) => {
+		const msg = await loopCtx.queue.next("next-organization-command", {
+			names: [...ORGANIZATION_QUEUE_NAMES],
+			completable: true,
+		});
+
+		if (!msg) {
+			return Loop.continue(undefined);
+		}
+
+		const handler = COMMAND_HANDLERS[msg.name as OrganizationQueueName];
+		if (!handler) {
+			logActorWarning("organization", "unknown organization command", { command: msg.name });
+			await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {});
+			return Loop.continue(undefined);
+		}
+
+		try {
+			// Wrap in a step so c.state and c.db are accessible inside mutation functions.
+			const result = await loopCtx.step({
+				name: msg.name,
+				timeout: 10 * 60_000,
+				run: async () => handler(loopCtx, msg.body),
+			});
+			try {
+				await msg.complete(result);
+			} catch (completeError) {
+				logActorWarning("organization", "organization workflow failed completing response", {
+					command: msg.name,
+					error: resolveErrorMessage(completeError),
+				});
+			}
+		} catch (error) {
+			const message = resolveErrorMessage(error);
+			logActorWarning("organization", "organization workflow command failed", {
+				command: msg.name,
+				error: message,
+			});
+			await msg.complete({ error: message }).catch(() => {});
+		}
+
+		return Loop.continue(undefined);
+	});
+}
diff --git a/foundry/packages/backend/src/actors/repository/actions.ts b/foundry/packages/backend/src/actors/repository/actions.ts
deleted file mode 100644
index 9ef8e75..0000000
--- a/foundry/packages/backend/src/actors/repository/actions.ts
+++ /dev/null
@@ -1,557 +0,0 @@
-// @ts-nocheck
-import { randomUUID } from "node:crypto";
-import { and, desc, eq, isNotNull, ne } from "drizzle-orm";
-import { Loop } from "rivetkit/workflow";
-import type { AgentType, RepoOverview, SandboxProviderId, TaskRecord, TaskSummary } from "@sandbox-agent/foundry-shared";
-import { getGithubData, getOrCreateHistory, getOrCreateTask, getTask, selfRepository } from "../handles.js";
-import { deriveFallbackTitle, resolveCreateFlowDecision } from "../../services/create-flow.js";
-import { expectQueueResponse } from "../../services/queue.js";
-import { isActorNotFoundError, logActorWarning, resolveErrorMessage } from "../logging.js";
-import { repoMeta, taskIndex } from "./db/schema.js";
-
-interface CreateTaskCommand {
-	task: string;
-	sandboxProviderId: SandboxProviderId;
-	agentType: AgentType | null;
-	explicitTitle: string | null;
-	explicitBranchName: string | null;
-	initialPrompt: string | null;
-	onBranch: string | null;
-}
-
-interface RegisterTaskBranchCommand {
-	taskId: string;
-	branchName: string;
-	requireExistingRemote?: boolean;
-}
-
-interface ListTaskSummariesCommand {
-	includeArchived?: boolean;
-}
-
-interface GetTaskEnrichedCommand {
-	taskId: string;
-}
-
-interface GetPullRequestForBranchCommand {
-	branchName: string;
-}
-
-const REPOSITORY_QUEUE_NAMES = ["repository.command.createTask", "repository.command.registerTaskBranch"] as const;
-
-type RepositoryQueueName = (typeof REPOSITORY_QUEUE_NAMES)[number];
-
-export { REPOSITORY_QUEUE_NAMES };
-
-export function repositoryWorkflowQueueName(name: RepositoryQueueName): RepositoryQueueName {
-	return name;
-}
-
-function isStaleTaskReferenceError(error: unknown): boolean {
-	const message = resolveErrorMessage(error);
-	return isActorNotFoundError(error) || message.startsWith("Task not found:");
-}
-
-async function persistRemoteUrl(c: any, remoteUrl: string): Promise<void> {
-	c.state.remoteUrl = remoteUrl;
-	await c.db
-		.insert(repoMeta)
-		.values({
-			id: 1,
-			remoteUrl,
-			updatedAt: Date.now(),
-		})
-		.onConflictDoUpdate({
-			target: repoMeta.id,
-			set: {
-				remoteUrl,
-				updatedAt: Date.now(),
-			},
-		})
-		.run();
-}
-
-async function deleteStaleTaskIndexRow(c: any, taskId: string): Promise<void> {
-	try {
-		await c.db.delete(taskIndex).where(eq(taskIndex.taskId, taskId)).run();
-	} catch {
-		// Best effort cleanup only.
-	}
-}
-
-async function reinsertTaskIndexRow(c: any, taskId: string, branchName: string | null, updatedAt: number): Promise<void> {
-	const now = Date.now();
-	await c.db
-		.insert(taskIndex)
-		.values({
-			taskId,
-			branchName,
-			createdAt: updatedAt || now,
-			updatedAt: now,
-		})
-		.onConflictDoUpdate({
-			target: taskIndex.taskId,
-			set: {
-				branchName,
-				updatedAt: now,
-			},
-		})
-		.run();
-}
-
-async function listKnownTaskBranches(c: any): Promise<string[]> {
-	const rows = await c.db.select({ branchName: taskIndex.branchName }).from(taskIndex).where(isNotNull(taskIndex.branchName)).all();
-	return rows.map((row) => row.branchName).filter((value): value is string => typeof value === "string" && value.trim().length > 0);
-}
-
-async function resolveGitHubRepository(c: any) {
-	const githubData = getGithubData(c, c.state.organizationId);
-	return await githubData.getRepository({ repoId: c.state.repoId }).catch(() => null);
-}
-
-async function listGitHubBranches(c: any): Promise<Array<{ branchName: string; commitSha: string }>> {
-	const githubData = getGithubData(c, c.state.organizationId);
-	return await githubData.listBranchesForRepository({ repoId: c.state.repoId }).catch(() => []);
-}
-
-async function enrichTaskRecord(c: any, record: TaskRecord): Promise<TaskRecord> {
-	const branchName = record.branchName?.trim() || null;
-	if (!branchName) {
-		return record;
-	}
-
-	const pr =
-		branchName != null
-			? await getGithubData(c, c.state.organizationId)
-					.listPullRequestsForRepository({ repoId: c.state.repoId })
-					.then((rows: any[]) => rows.find((row) => row.headRefName === branchName) ?? null)
-					.catch(() => null)
-			: null;
-
-	return {
-		...record,
-		prUrl: pr?.url ?? null,
-		prAuthor: pr?.authorLogin ?? null,
-		ciStatus: null,
-		reviewStatus: null,
-		reviewer: pr?.authorLogin ?? null,
-		diffStat: record.diffStat ?? null,
-		hasUnpushed: record.hasUnpushed ?? null,
-		conflictsWithMain: record.conflictsWithMain ?? null,
-		parentBranch: record.parentBranch ?? null,
-	};
-}
-
-async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskRecord> {
-	const organizationId = c.state.organizationId;
-	const repoId = c.state.repoId;
-	const repoRemote = c.state.remoteUrl;
-	const onBranch = cmd.onBranch?.trim() || null;
-	const taskId = randomUUID();
-	let initialBranchName: string | null = null;
-	let initialTitle: string | null = null;
-
-	await persistRemoteUrl(c, repoRemote);
-
-	if (onBranch) {
-		initialBranchName = onBranch;
-		initialTitle = deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? undefined);
-
-		await registerTaskBranchMutation(c, {
-			taskId,
-			branchName: onBranch,
-			requireExistingRemote: true,
-		});
-	} else {
-		const reservedBranches = await listKnownTaskBranches(c);
-		const resolved = resolveCreateFlowDecision({
-			task: cmd.task,
-			explicitTitle: cmd.explicitTitle ?? undefined,
-			explicitBranchName: cmd.explicitBranchName ?? undefined,
-			localBranches: [],
-			taskBranches: reservedBranches,
-		});
-
-		initialBranchName = resolved.branchName;
-		initialTitle = resolved.title;
-
-		const now = Date.now();
-		await c.db
-			.insert(taskIndex)
-			.values({
-				taskId,
-				branchName: resolved.branchName,
-				createdAt: now,
-				updatedAt: now,
-			})
-			.onConflictDoNothing()
-			.run();
-	}
-
-	let taskHandle: Awaited<ReturnType<typeof getOrCreateTask>>;
-	try {
-		taskHandle = await getOrCreateTask(c, organizationId, repoId, taskId, {
-			organizationId,
-			repoId,
-			taskId,
-			repoRemote,
-			branchName: initialBranchName,
-			title: initialTitle,
-			task: cmd.task,
-			sandboxProviderId: cmd.sandboxProviderId,
-			agentType: cmd.agentType,
-			explicitTitle: null,
-			explicitBranchName: null,
-			initialPrompt: cmd.initialPrompt,
-		});
-	} catch (error) {
-		if (initialBranchName) {
-			await deleteStaleTaskIndexRow(c, taskId);
-		}
-		throw error;
-	}
-
-	const created = await taskHandle.initialize({ sandboxProviderId: cmd.sandboxProviderId });
-
-	const history = await getOrCreateHistory(c, organizationId, repoId);
-	await history.append({
-		kind: "task.created",
-		taskId,
-		payload: {
-			repoId,
-			sandboxProviderId: cmd.sandboxProviderId,
-		},
-	});
-
-	return created;
-}
-
-async function registerTaskBranchMutation(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> {
-	const branchName = cmd.branchName.trim();
-	if (!branchName) {
-		throw new Error("branchName is required");
-	}
-
-	await persistRemoteUrl(c, c.state.remoteUrl);
-
-	const existingOwner = await c.db
-		.select({ taskId: taskIndex.taskId })
-		.from(taskIndex)
-		.where(and(eq(taskIndex.branchName, branchName), ne(taskIndex.taskId, cmd.taskId)))
-		.get();
-
-	if (existingOwner) {
-		let ownerMissing = false;
-		try {
-			await getTask(c, c.state.organizationId, c.state.repoId, existingOwner.taskId).get();
-		} catch (error) {
-			if (isStaleTaskReferenceError(error)) {
-				ownerMissing = true;
-				await deleteStaleTaskIndexRow(c, existingOwner.taskId);
-			} else {
-				throw error;
-			}
-		}
-		if (!ownerMissing) {
-			throw new Error(`branch is already assigned to a different task: ${branchName}`);
-		}
-	}
-
-	const branches = await listGitHubBranches(c);
-	const branchMatch = branches.find((branch) => branch.branchName === branchName) ?? null;
-	if (cmd.requireExistingRemote && !branchMatch) {
-		throw new Error(`Remote branch not found: ${branchName}`);
-	}
-
-	const repository = await resolveGitHubRepository(c);
-	const defaultBranch = repository?.defaultBranch ?? "main";
-	const headSha = branchMatch?.commitSha ?? branches.find((branch) => branch.branchName === defaultBranch)?.commitSha ?? "";
-
-	const now = Date.now();
-	await c.db
-		.insert(taskIndex)
-		.values({
-			taskId: cmd.taskId,
-			branchName,
-			createdAt: now,
-			updatedAt: now,
-		})
-		.onConflictDoUpdate({
-			target: taskIndex.taskId,
-			set: {
-				branchName,
-				updatedAt: now,
-			},
-		})
-		.run();
-
-	return { branchName, headSha };
-}
-
-async function listTaskSummaries(c: any, includeArchived = false): Promise<TaskSummary[]> {
-	const taskRows = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).orderBy(desc(taskIndex.updatedAt)).all();
-	const records: TaskSummary[] = [];
-
-	for (const row of taskRows) {
-		try {
-			const record = await getTask(c, c.state.organizationId, c.state.repoId, row.taskId).get();
-			if (!includeArchived && record.status === "archived") {
-				continue;
-			}
-			records.push({
-				organizationId: record.organizationId,
-				repoId: record.repoId,
-				taskId: record.taskId,
-				branchName: record.branchName,
-				title: record.title,
-				status: record.status,
-				updatedAt: record.updatedAt,
-			});
-		} catch (error) {
-			if (isStaleTaskReferenceError(error)) {
-				await deleteStaleTaskIndexRow(c, row.taskId);
-				continue;
-			}
-			logActorWarning("repository", "failed loading task summary row", {
-				organizationId: c.state.organizationId,
-				repoId: c.state.repoId,
-				taskId: row.taskId,
-				error: resolveErrorMessage(error),
-			});
-		}
-	}
-
-	records.sort((a, b) => b.updatedAt - a.updatedAt);
-	return records;
-}
-
-function sortOverviewBranches(
-	branches: Array<{
-		branchName: string;
-		commitSha: string;
-		taskId: string | null;
-		taskTitle: string | null;
-		taskStatus: TaskRecord["status"] | null;
-		prNumber: number | null;
-		prState: string | null;
-		prUrl: string | null;
-		ciStatus: string | null;
-		reviewStatus: string | null;
-		reviewer: string | null;
-		updatedAt: number;
-	}>,
-	defaultBranch: string | null,
-) {
-	return [...branches].sort((left, right) => {
-		if (defaultBranch) {
-			if (left.branchName === defaultBranch && right.branchName !== defaultBranch) return -1;
-			if (right.branchName === defaultBranch && left.branchName !== defaultBranch) return 1;
-		}
-		if (Boolean(left.taskId) !== Boolean(right.taskId)) {
-			return left.taskId ? -1 : 1;
-		}
-		if (left.updatedAt !== right.updatedAt) {
-			return right.updatedAt - left.updatedAt;
-		}
-		return left.branchName.localeCompare(right.branchName);
-	});
-}
-
-export async function runRepositoryWorkflow(ctx: any): Promise<void> {
-	await ctx.loop("repository-command-loop", async (loopCtx: any) => {
-		const msg = await loopCtx.queue.next("next-repository-command", {
-			names: [...REPOSITORY_QUEUE_NAMES],
-			completable: true,
-		});
-		if (!msg) {
-			return Loop.continue(undefined);
-		}
-
-		try {
-			if (msg.name === "repository.command.createTask") {
-				const result = await loopCtx.step({
-					name: "repository-create-task",
-					timeout: 5 * 60_000,
-					run: async () => createTaskMutation(loopCtx, msg.body as CreateTaskCommand),
-				});
-				await msg.complete(result);
-				return Loop.continue(undefined);
-			}
-
-			if (msg.name === "repository.command.registerTaskBranch") {
-				const result = await loopCtx.step({
-					name: "repository-register-task-branch",
-					timeout: 60_000,
-					run: async () => registerTaskBranchMutation(loopCtx, msg.body as RegisterTaskBranchCommand),
-				});
-				await msg.complete(result);
-				return Loop.continue(undefined);
-			}
-		} catch (error) {
-			const message = resolveErrorMessage(error);
-			logActorWarning("repository", "repository workflow command failed", {
-				queueName: msg.name,
-				error: message,
-			});
-			await msg.complete({ error: message }).catch(() => {});
-		}
-
-		return Loop.continue(undefined);
-	});
-}
-
-export const repositoryActions = {
-	async createTask(c: any, cmd: CreateTaskCommand): Promise<TaskRecord> {
-		const self = selfRepository(c);
-		return expectQueueResponse(
-			await self.send(repositoryWorkflowQueueName("repository.command.createTask"), cmd, {
-				wait: true,
-				timeout: 10_000,
-			}),
-		);
-	},
-
-	async listReservedBranches(c: any): Promise<string[]> {
-		return await listKnownTaskBranches(c);
-	},
-
-	async registerTaskBranch(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> {
-		const self = selfRepository(c);
-		return expectQueueResponse<{ branchName: string; headSha: string }>(
-			await self.send(repositoryWorkflowQueueName("repository.command.registerTaskBranch"), cmd, {
-				wait: true,
-				timeout: 10_000,
-			}),
-		);
-	},
-
-	async listTaskSummaries(c: any, cmd?: ListTaskSummariesCommand): Promise<TaskSummary[]> {
-		return await listTaskSummaries(c, cmd?.includeArchived === true);
-	},
-
-	async getTaskEnriched(c: any, cmd: GetTaskEnrichedCommand): Promise<TaskRecord> {
-		const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.taskId, cmd.taskId)).get();
-		if (!row) {
-			const record = await getTask(c, c.state.organizationId, c.state.repoId, cmd.taskId).get();
-			await reinsertTaskIndexRow(c, cmd.taskId, record.branchName ?? null, record.updatedAt ?? Date.now());
-			return await enrichTaskRecord(c, record);
-		}
-
-		try {
-			const record = await getTask(c, c.state.organizationId, c.state.repoId, cmd.taskId).get();
-			return await enrichTaskRecord(c, record);
-		} catch (error) {
-			if (isStaleTaskReferenceError(error)) {
-				await deleteStaleTaskIndexRow(c, cmd.taskId);
-				throw new Error(`Unknown task in repo ${c.state.repoId}: ${cmd.taskId}`);
-			}
-			throw error;
-		}
-	},
-
-	async getRepositoryMetadata(c: any): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> {
-		const repository = await resolveGitHubRepository(c);
-		return {
-			defaultBranch: repository?.defaultBranch ?? null,
-			fullName: repository?.fullName ?? null,
-			remoteUrl: c.state.remoteUrl,
-		};
-	},
-
-	async getRepoOverview(c: any): Promise<RepoOverview> {
-		await persistRemoteUrl(c, c.state.remoteUrl);
-
-		const now = Date.now();
-		const repository = await resolveGitHubRepository(c);
-		const githubBranches = await listGitHubBranches(c).catch(() => []);
-		const githubData = getGithubData(c, c.state.organizationId);
-		const prRows = await githubData.listPullRequestsForRepository({ repoId: c.state.repoId }).catch(() => []);
-		const prByBranch = new Map(prRows.map((row) => [row.headRefName, row]));
-
-		const taskRows = await c.db
-			.select({
-				taskId: taskIndex.taskId,
-				branchName: taskIndex.branchName,
-				updatedAt: taskIndex.updatedAt,
-			})
-			.from(taskIndex)
-			.all();
-
-		const taskMetaByBranch = new Map();
-		for (const row of taskRows) {
-			if (!row.branchName) {
-				continue;
-			}
-			try {
-				const record = await getTask(c, c.state.organizationId, c.state.repoId, row.taskId).get();
-				taskMetaByBranch.set(row.branchName, {
-					taskId: row.taskId,
-					title: record.title ?? null,
-					status: record.status,
-					updatedAt: record.updatedAt,
-				});
-			} catch (error) {
-				if (isStaleTaskReferenceError(error)) {
-					await deleteStaleTaskIndexRow(c, row.taskId);
-					continue;
-				}
-			}
-		}
-
-		const branchMap = new Map();
-		for (const branch of githubBranches) {
-			branchMap.set(branch.branchName, branch);
-		}
-		for (const branchName of taskMetaByBranch.keys()) {
-			if (!branchMap.has(branchName)) {
-				branchMap.set(branchName, { branchName, commitSha: "" });
-			}
-		}
-		if (repository?.defaultBranch && !branchMap.has(repository.defaultBranch)) {
-			branchMap.set(repository.defaultBranch, { branchName: repository.defaultBranch, commitSha: "" });
-		}
-
-		const branches = sortOverviewBranches(
-			[...branchMap.values()].map((branch) => {
-				const taskMeta = taskMetaByBranch.get(branch.branchName);
-				const pr = prByBranch.get(branch.branchName);
-				return {
-					branchName: branch.branchName,
-					commitSha: branch.commitSha,
-					taskId: taskMeta?.taskId ?? null,
-					taskTitle: taskMeta?.title ?? null,
-					taskStatus: taskMeta?.status ?? null,
-					prNumber: pr?.number ?? null,
-					prState: pr?.state ?? null,
-					prUrl: pr?.url ?? null,
-					ciStatus: null,
-					reviewStatus: null,
-					reviewer: pr?.authorLogin ?? null,
-					updatedAt: Math.max(taskMeta?.updatedAt ?? 0, pr?.updatedAtMs ?? 0, now),
-				};
-			}),
-			repository?.defaultBranch ?? null,
-		);
-
-		return {
-			organizationId: c.state.organizationId,
-			repoId: c.state.repoId,
-			remoteUrl: c.state.remoteUrl,
-			baseRef: repository?.defaultBranch ?? null,
-			fetchedAt: now,
-			branches,
-		};
-	},
-
-	async getPullRequestForBranch(c: any, cmd: GetPullRequestForBranchCommand): Promise<{ number: number; status: "draft" | "ready" } | null> {
-		const branchName = cmd.branchName?.trim();
-		if (!branchName) {
-			return null;
-		}
-		const githubData = getGithubData(c, c.state.organizationId);
-		return await githubData.getPullRequestForBranch({
-			repoId: c.state.repoId,
-			branchName,
-		});
-	},
-};
diff --git a/foundry/packages/backend/src/actors/repository/db/db.ts b/foundry/packages/backend/src/actors/repository/db/db.ts
deleted file mode 100644
index 79bed8e..0000000
--- a/foundry/packages/backend/src/actors/repository/db/db.ts
+++ /dev/null
@@ -1,5 +0,0 @@
-import { db } from "rivetkit/db/drizzle";
-import * as schema from "./schema.js";
-import migrations from "./migrations.js";
-
-export const repositoryDb = db({ schema, migrations });
diff --git a/foundry/packages/backend/src/actors/repository/db/drizzle.config.ts b/foundry/packages/backend/src/actors/repository/db/drizzle.config.ts
deleted file mode 100644
index 8b9a1b9..0000000
--- a/foundry/packages/backend/src/actors/repository/db/drizzle.config.ts
+++ /dev/null
@@ -1,6 +0,0 @@
-import { defineConfig } from "rivetkit/db/drizzle";
-
-export default defineConfig({
-	out: "./src/actors/repository/db/drizzle",
-	schema: "./src/actors/repository/db/schema.ts",
-});
diff --git a/foundry/packages/backend/src/actors/repository/db/drizzle/0000_useful_la_nuit.sql b/foundry/packages/backend/src/actors/repository/db/drizzle/0000_useful_la_nuit.sql
deleted file mode 100644
index 14bc071..0000000
--- a/foundry/packages/backend/src/actors/repository/db/drizzle/0000_useful_la_nuit.sql
+++ /dev/null
@@ -1,12 +0,0 @@
-CREATE TABLE `repo_meta` (
-	`id` integer PRIMARY KEY NOT NULL,
-	`remote_url` text NOT NULL,
-	`updated_at` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE `task_index` (
-	`task_id` text PRIMARY KEY NOT NULL,
-	`branch_name` text,
-	`created_at` integer NOT NULL,
-	`updated_at` integer NOT NULL
-);
diff --git a/foundry/packages/backend/src/actors/repository/db/drizzle/meta/_journal.json b/foundry/packages/backend/src/actors/repository/db/drizzle/meta/_journal.json
deleted file mode 100644
index deebd86..0000000
--- a/foundry/packages/backend/src/actors/repository/db/drizzle/meta/_journal.json
+++ /dev/null
@@ -1,13 +0,0 @@
-{
-  "version": "7",
-  "dialect": "sqlite",
-  "entries": [
-    {
-      "idx": 0,
-      "version": "6",
-      "when": 1773376221848,
-      "tag": "0000_useful_la_nuit",
-      "breakpoints": true
-    }
-  ]
-}
diff --git a/foundry/packages/backend/src/actors/repository/db/migrations.ts b/foundry/packages/backend/src/actors/repository/db/migrations.ts
deleted file mode 100644
index ebdb167..0000000
--- a/foundry/packages/backend/src/actors/repository/db/migrations.ts
+++ /dev/null
@@ -1,43 +0,0 @@
-// This file is generated by src/actors/_scripts/generate-actor-migrations.ts.
-// Source of truth is drizzle-kit output under ./drizzle (meta/_journal.json + *.sql).
-// Do not hand-edit this file.
-
-const journal = {
-	entries: [
-		{
-			idx: 0,
-			when: 1773376221848,
-			tag: "0000_useful_la_nuit",
-			breakpoints: true,
-		},
-		{
-			idx: 1,
-			when: 1778900000000,
-			tag: "0001_remove_local_git_state",
-			breakpoints: true,
-		},
-	],
-} as const;
-
-export default {
-	journal,
-	migrations: {
-		m0000: `CREATE TABLE \`repo_meta\` (
-\t\`id\` integer PRIMARY KEY NOT NULL,
-\t\`remote_url\` text NOT NULL,
-\t\`updated_at\` integer NOT NULL
-);
---> statement-breakpoint
-CREATE TABLE \`task_index\` (
-\t\`task_id\` text PRIMARY KEY NOT NULL,
-\t\`branch_name\` text,
-\t\`created_at\` integer NOT NULL,
-\t\`updated_at\` integer NOT NULL
-);
-`,
-		m0001: `DROP TABLE IF EXISTS \`branches\`;
---> statement-breakpoint
-DROP TABLE IF EXISTS \`repo_action_jobs\`;
-`,
-	} as const,
-};
diff --git a/foundry/packages/backend/src/actors/repository/db/schema.ts b/foundry/packages/backend/src/actors/repository/db/schema.ts
deleted file mode 100644
index 2f597e8..0000000
--- a/foundry/packages/backend/src/actors/repository/db/schema.ts
+++ /dev/null
@@ -1,23 +0,0 @@
-import { integer, sqliteTable, text } from "rivetkit/db/drizzle";
-
-// SQLite is per repository actor instance (organizationId+repoId).
-
-export const repoMeta = sqliteTable("repo_meta", {
-	id: integer("id").primaryKey(),
-	remoteUrl: text("remote_url").notNull(),
-	updatedAt: integer("updated_at").notNull(),
-});
-
-/**
- * Coordinator index of TaskActor instances.
- * The repository actor is the coordinator for tasks. Each row maps a
- * taskId to its branch name. Used for branch conflict checking and
- * task-by-branch lookups. Rows are inserted at task creation and
- * updated on branch rename.
- */ -export const taskIndex = sqliteTable("task_index", { - taskId: text("task_id").notNull().primaryKey(), - branchName: text("branch_name"), - createdAt: integer("created_at").notNull(), - updatedAt: integer("updated_at").notNull(), -}); diff --git a/foundry/packages/backend/src/actors/repository/index.ts b/foundry/packages/backend/src/actors/repository/index.ts deleted file mode 100644 index 4253a90..0000000 --- a/foundry/packages/backend/src/actors/repository/index.ts +++ /dev/null @@ -1,27 +0,0 @@ -import { actor, queue } from "rivetkit"; -import { workflow } from "rivetkit/workflow"; -import { repositoryDb } from "./db/db.js"; -import { REPOSITORY_QUEUE_NAMES, repositoryActions, runRepositoryWorkflow } from "./actions.js"; - -export interface RepositoryInput { - organizationId: string; - repoId: string; - remoteUrl: string; -} - -export const repository = actor({ - db: repositoryDb, - queues: Object.fromEntries(REPOSITORY_QUEUE_NAMES.map((name) => [name, queue()])), - options: { - name: "Repository", - icon: "folder", - actionTimeout: 5 * 60_000, - }, - createState: (_c, input: RepositoryInput) => ({ - organizationId: input.organizationId, - repoId: input.repoId, - remoteUrl: input.remoteUrl, - }), - actions: repositoryActions, - run: workflow(runRepositoryWorkflow), -}); diff --git a/foundry/packages/backend/src/actors/sandbox/index.ts b/foundry/packages/backend/src/actors/sandbox/index.ts index 2e2087b..0444d9b 100644 --- a/foundry/packages/backend/src/actors/sandbox/index.ts +++ b/foundry/packages/backend/src/actors/sandbox/index.ts @@ -1,14 +1,25 @@ -import { actor } from "rivetkit"; +// @ts-nocheck +import { actor, queue } from "rivetkit"; +import { workflow, Loop } from "rivetkit/workflow"; import { e2b, sandboxActor } from "rivetkit/sandbox"; import { existsSync } from "node:fs"; import Dockerode from "dockerode"; +import { DEFAULT_WORKSPACE_MODEL_GROUPS, workspaceModelGroupsFromSandboxAgents, type WorkspaceModelGroup } from 
"@sandbox-agent/foundry-shared"; import { SandboxAgent } from "sandbox-agent"; import { getActorRuntimeContext } from "../context.js"; import { organizationKey } from "../keys.js"; +import { selfTaskSandbox } from "../handles.js"; +import { logActorWarning, resolveErrorMessage } from "../logging.js"; +import { expectQueueResponse } from "../../services/queue.js"; import { resolveSandboxProviderId } from "../../sandbox-config.js"; -const SANDBOX_REPO_CWD = "/home/sandbox/organization/repo"; -const DEFAULT_LOCAL_SANDBOX_IMAGE = "rivetdev/sandbox-agent:full"; +/** + * Default repo CWD inside the sandbox. The actual path is resolved dynamically + * via `$HOME/repo` because different sandbox providers run as different users + * (e.g. E2B uses `/home/user`, local Docker uses `/home/sandbox`). + */ +const DEFAULT_SANDBOX_REPO_CWD = "/home/user/repo"; +const DEFAULT_LOCAL_SANDBOX_IMAGE = "rivetdev/sandbox-agent:foundry-base-latest"; const DEFAULT_LOCAL_SANDBOX_PORT = 2468; const dockerClient = new Dockerode({ socketPath: "/var/run/docker.sock" }); @@ -201,8 +212,15 @@ const baseTaskSandbox = sandboxActor({ if (sandboxProviderId === "e2b") { return e2b({ create: () => ({ - template: config.sandboxProviders.e2b.template ?? "sandbox-agent-full-0.3.x", + template: config.sandboxProviders.e2b.template ?? "sandbox-agent-full-0.5.x", envs: sandboxEnvObject(), + // TEMPORARY: Default E2B timeout is 5 minutes which is too short. + // Set to 1 hour as a stopgap. Remove this once the E2B provider in + // sandbox-agent uses betaCreate + autoPause (see + // .context/proposal-rivetkit-sandbox-resilience.md). At that point + // the provider handles timeout/pause lifecycle and this override is + // unnecessary. + timeoutMs: 60 * 60 * 1000, }), installAgents: ["claude", "codex"], }); @@ -219,8 +237,12 @@ async function broadcastProcesses(c: any, actions: Record { sandboxProviderId === "e2b" ? e2b({ create: () => ({ - template: config.sandboxProviders.e2b.template ?? 
"sandbox-agent-full-0.3.x", + template: config.sandboxProviders.e2b.template ?? "sandbox-agent-full-0.5.x", envs: sandboxEnvObject(), }), installAgents: ["claude", "codex"], @@ -258,38 +280,224 @@ async function providerForConnection(c: any): Promise { return provider; } +async function listWorkspaceModelGroupsForSandbox(c: any): Promise { + const provider = await providerForConnection(c); + if (!provider || !c.state.sandboxId || typeof provider.connectAgent !== "function") { + return DEFAULT_WORKSPACE_MODEL_GROUPS; + } + + try { + const client = await provider.connectAgent(c.state.sandboxId, { + waitForHealth: { + timeoutMs: 15_000, + }, + }); + const listed = await client.listAgents({ config: true }); + const groups = workspaceModelGroupsFromSandboxAgents(Array.isArray(listed?.agents) ? listed.agents : []); + return groups.length > 0 ? groups : DEFAULT_WORKSPACE_MODEL_GROUPS; + } catch { + return DEFAULT_WORKSPACE_MODEL_GROUPS; + } +} + const baseActions = baseTaskSandbox.config.actions as Record Promise>; +// --------------------------------------------------------------------------- +// Dynamic repo CWD resolution +// --------------------------------------------------------------------------- + +let cachedRepoCwd: string | null = null; + +/** + * Resolve the repo CWD inside the sandbox by querying `$HOME`. + * Different providers run as different users (E2B: `/home/user`, local Docker: + * `/home/sandbox`), so the path must be resolved dynamically. The result is + * cached for the lifetime of this sandbox actor instance. + */ +async function resolveRepoCwd(c: any): Promise { + if (cachedRepoCwd) return cachedRepoCwd; + + try { + const result = await baseActions.runProcess(c, { + command: "bash", + args: ["-lc", "echo $HOME"], + cwd: "/", + timeoutMs: 10_000, + }); + const home = (result.stdout ?? result.result ?? 
"").trim(); + if (home && home.startsWith("/")) { + cachedRepoCwd = `${home}/repo`; + return cachedRepoCwd; + } + } catch (error) { + logActorWarning("taskSandbox", "failed to resolve $HOME, using default", { + error: resolveErrorMessage(error), + }); + } + + cachedRepoCwd = DEFAULT_SANDBOX_REPO_CWD; + return cachedRepoCwd; +} + +// --------------------------------------------------------------------------- +// Queue names for sandbox actor +// --------------------------------------------------------------------------- + +const SANDBOX_QUEUE_NAMES = [ + "sandbox.command.createSession", + "sandbox.command.resumeOrCreateSession", + "sandbox.command.destroySession", + "sandbox.command.createProcess", + "sandbox.command.stopProcess", + "sandbox.command.killProcess", + "sandbox.command.deleteProcess", +] as const; + +type SandboxQueueName = (typeof SANDBOX_QUEUE_NAMES)[number]; + +function sandboxWorkflowQueueName(name: SandboxQueueName): SandboxQueueName { + return name; +} + +// --------------------------------------------------------------------------- +// Mutation handlers — executed inside the workflow command loop +// --------------------------------------------------------------------------- + +async function createSessionMutation(c: any, request: any): Promise { + const session = await baseActions.createSession(c, request); + const sessionId = typeof request?.id === "string" && request.id.length > 0 ? request.id : session?.id; + const modeId = modeIdForAgent(request?.agent); + if (sessionId && modeId) { + try { + await baseActions.rawSendSessionMethod(c, sessionId, "session/set_mode", { modeId }); + } catch { + // Session mode updates are best-effort. 
+ } + } + return sanitizeActorResult(session); +} + +async function resumeOrCreateSessionMutation(c: any, request: any): Promise { + return sanitizeActorResult(await baseActions.resumeOrCreateSession(c, request)); +} + +async function destroySessionMutation(c: any, sessionId: string): Promise { + return sanitizeActorResult(await baseActions.destroySession(c, sessionId)); +} + +async function createProcessMutation(c: any, request: any): Promise { + const created = await baseActions.createProcess(c, request); + await broadcastProcesses(c, baseActions); + return created; +} + +async function runProcessMutation(c: any, request: any): Promise { + const result = await baseActions.runProcess(c, request); + await broadcastProcesses(c, baseActions); + return result; +} + +async function stopProcessMutation(c: any, processId: string, query?: any): Promise { + const stopped = await baseActions.stopProcess(c, processId, query); + await broadcastProcesses(c, baseActions); + return stopped; +} + +async function killProcessMutation(c: any, processId: string, query?: any): Promise { + const killed = await baseActions.killProcess(c, processId, query); + await broadcastProcesses(c, baseActions); + return killed; +} + +async function deleteProcessMutation(c: any, processId: string): Promise { + await baseActions.deleteProcess(c, processId); + await broadcastProcesses(c, baseActions); +} + +// --------------------------------------------------------------------------- +// Workflow command loop +// --------------------------------------------------------------------------- + +type SandboxWorkflowHandler = (loopCtx: any, body: any) => Promise; + +const SANDBOX_COMMAND_HANDLERS: Record = { + "sandbox.command.createSession": async (c, body) => createSessionMutation(c, body), + "sandbox.command.resumeOrCreateSession": async (c, body) => resumeOrCreateSessionMutation(c, body), + "sandbox.command.destroySession": async (c, body) => destroySessionMutation(c, body?.sessionId), + 
"sandbox.command.createProcess": async (c, body) => createProcessMutation(c, body), + "sandbox.command.stopProcess": async (c, body) => stopProcessMutation(c, body?.processId, body?.query), + "sandbox.command.killProcess": async (c, body) => killProcessMutation(c, body?.processId, body?.query), + "sandbox.command.deleteProcess": async (c, body) => { + await deleteProcessMutation(c, body?.processId); + return { ok: true }; + }, +}; + +async function runSandboxWorkflow(ctx: any): Promise<void> { + await ctx.loop("sandbox-command-loop", async (loopCtx: any) => { + const msg = await loopCtx.queue.next("next-sandbox-command", { + names: [...SANDBOX_QUEUE_NAMES], + completable: true, + }); + + if (!msg) { + return Loop.continue(undefined); + } + + const handler = SANDBOX_COMMAND_HANDLERS[msg.name as SandboxQueueName]; + if (!handler) { + logActorWarning("taskSandbox", "unknown sandbox command", { command: msg.name }); + await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {}); + return Loop.continue(undefined); + } + + try { + // Wrap in a step so c.state and c.db are accessible inside mutation functions.
+ const result = await loopCtx.step({ + name: msg.name, + timeout: 10 * 60_000, + run: async () => handler(loopCtx, msg.body), + }); + try { + await msg.complete(result); + } catch (completeError) { + logActorWarning("taskSandbox", "sandbox workflow failed completing response", { + command: msg.name, + error: resolveErrorMessage(completeError), + }); + } + } catch (error) { + const message = resolveErrorMessage(error); + logActorWarning("taskSandbox", "sandbox workflow command failed", { + command: msg.name, + error: message, + }); + await msg.complete({ error: message }).catch(() => {}); + } + + return Loop.continue(undefined); + }); +} + +// --------------------------------------------------------------------------- +// Actor definition +// --------------------------------------------------------------------------- + export const taskSandbox = actor({ ...baseTaskSandbox.config, + queues: Object.fromEntries(SANDBOX_QUEUE_NAMES.map((name) => [name, queue()])), options: { ...baseTaskSandbox.config.options, actionTimeout: 10 * 60_000, }, actions: { ...baseActions, - async createSession(c: any, request: any): Promise<any> { - const session = await baseActions.createSession(c, request); - const sessionId = typeof request?.id === "string" && request.id.length > 0 ? request.id : session?.id; - const modeId = modeIdForAgent(request?.agent); - if (sessionId && modeId) { - try { - await baseActions.rawSendSessionMethod(c, sessionId, "session/set_mode", { modeId }); - } catch { - // Session mode updates are best-effort.
- } - } - return sanitizeActorResult(session); - }, + // Read actions — direct (no queue) async resumeSession(c: any, sessionId: string): Promise<any> { return sanitizeActorResult(await baseActions.resumeSession(c, sessionId)); }, - async resumeOrCreateSession(c: any, request: any): Promise<any> { - return sanitizeActorResult(await baseActions.resumeOrCreateSession(c, request)); - }, - async getSession(c: any, sessionId: string): Promise<any> { return sanitizeActorResult(await baseActions.getSession(c, sessionId)); }, @@ -298,51 +506,17 @@ export const taskSandbox = actor({ return sanitizeActorResult(await baseActions.listSessions(c, query)); }, - async destroySession(c: any, sessionId: string): Promise<any> { - return sanitizeActorResult(await baseActions.destroySession(c, sessionId)); - }, - - async sendPrompt(c: any, request: { sessionId: string; prompt: string }): Promise<any> { - const text = typeof request?.prompt === "string" ? request.prompt.trim() : ""; - if (!text) { - return null; + async listProcesses(c: any): Promise<any> { + try { + return await baseActions.listProcesses(c); + } catch (error) { + // Sandbox may be gone (E2B timeout, destroyed, etc.)
— degrade to empty + logActorWarning("taskSandbox", "listProcesses failed, sandbox may be expired", { + sandboxId: c.state.sandboxId, + error: resolveErrorMessage(error), + }); + return { processes: [] }; } - - const session = await baseActions.resumeSession(c, request.sessionId); - if (!session || typeof session.prompt !== "function") { - throw new Error(`session '${request.sessionId}' not found`); - } - - return sanitizeActorResult(await session.prompt([{ type: "text", text }])); - }, - - async createProcess(c: any, request: any): Promise<any> { - const created = await baseActions.createProcess(c, request); - await broadcastProcesses(c, baseActions); - return created; - }, - - async runProcess(c: any, request: any): Promise<any> { - const result = await baseActions.runProcess(c, request); - await broadcastProcesses(c, baseActions); - return result; - }, - - async stopProcess(c: any, processId: string, query?: any): Promise<any> { - const stopped = await baseActions.stopProcess(c, processId, query); - await broadcastProcesses(c, baseActions); - return stopped; - }, - - async killProcess(c: any, processId: string, query?: any): Promise<any> { - const killed = await baseActions.killProcess(c, processId, query); - await broadcastProcesses(c, baseActions); - return killed; - }, - - async deleteProcess(c: any, processId: string): Promise<void> { - await baseActions.deleteProcess(c, processId); - await broadcastProcesses(c, baseActions); }, async sandboxAgentConnection(c: any): Promise<{ endpoint: string; token?: string }> { @@ -360,6 +534,10 @@ export const taskSandbox = actor({ } }, + async listWorkspaceModelGroups(c: any): Promise<any> { + return await listWorkspaceModelGroupsForSandbox(c); + }, + async providerState(c: any): Promise<{ sandboxProviderId: "e2b" | "local"; sandboxId: string; state: string; at: number }> { const { config } = getActorRuntimeContext(); const { taskId } = parseTaskSandboxKey(c.key); @@ -392,10 +570,77 @@ export const taskSandbox = actor({ } }, - async repoCwd():
Promise<{ cwd: string }> { - return { cwd: SANDBOX_REPO_CWD }; + async repoCwd(c: any): Promise<{ cwd: string }> { + const resolved = await resolveRepoCwd(c); + return { cwd: resolved }; + }, + + // Long-running action — kept as direct action to avoid blocking the + // workflow loop (prompt responses can take minutes). + async sendPrompt(c: any, request: { sessionId: string; prompt: string }): Promise<any> { + const text = typeof request?.prompt === "string" ? request.prompt.trim() : ""; + if (!text) { + return null; + } + + const session = await baseActions.resumeSession(c, request.sessionId); + if (!session || typeof session.prompt !== "function") { + throw new Error(`session '${request.sessionId}' not found`); + } + + return sanitizeActorResult(await session.prompt([{ type: "text", text }])); + }, + + // Mutation actions — self-send to queue for workflow history + async createSession(c: any, request: any): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.createSession"), request ?? {}, { wait: true, timeout: 10_000 })); + }, + + async resumeOrCreateSession(c: any, request: any): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse( + await self.send(sandboxWorkflowQueueName("sandbox.command.resumeOrCreateSession"), request ?? {}, { wait: true, timeout: 10_000 }), + ); + }, + + async destroySession(c: any, sessionId: string): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.destroySession"), { sessionId }, { wait: true, timeout: 10_000 })); + }, + + async createProcess(c: any, request: any): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse(await self.send(sandboxWorkflowQueueName("sandbox.command.createProcess"), request ??
{}, { wait: true, timeout: 10_000 })); + }, + + // runProcess kept as direct action — response can exceed 128KB queue limit + async runProcess(c: any, request: any): Promise<any> { + const result = await baseActions.runProcess(c, request); + await broadcastProcesses(c, baseActions); + return result; + }, + + async stopProcess(c: any, processId: string, query?: any): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse( + await self.send(sandboxWorkflowQueueName("sandbox.command.stopProcess"), { processId, query }, { wait: true, timeout: 10_000 }), + ); + }, + + async killProcess(c: any, processId: string, query?: any): Promise<any> { + const self = selfTaskSandbox(c); + return expectQueueResponse( + await self.send(sandboxWorkflowQueueName("sandbox.command.killProcess"), { processId, query }, { wait: true, timeout: 10_000 }), + ); + }, + + async deleteProcess(c: any, processId: string): Promise<void> { + const self = selfTaskSandbox(c); + await self.send(sandboxWorkflowQueueName("sandbox.command.deleteProcess"), { processId }, { wait: false }); }, }, + run: workflow(runSandboxWorkflow), }); -export { SANDBOX_REPO_CWD }; +export { DEFAULT_SANDBOX_REPO_CWD, resolveRepoCwd }; diff --git a/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql b/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql index b9ef95a..c6a346a 100644 --- a/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql +++ b/foundry/packages/backend/src/actors/task/db/drizzle/0000_charming_maestro.sql @@ -3,10 +3,9 @@ CREATE TABLE `task` ( `branch_name` text, `title` text, `task` text NOT NULL, - `provider_id` text NOT NULL, + `sandbox_provider_id` text NOT NULL, `status` text NOT NULL, - `agent_type` text DEFAULT 'claude', - `pr_submitted` integer DEFAULT 0, + `pull_request_json` text, `created_at` integer NOT NULL, `updated_at` integer NOT NULL, CONSTRAINT "task_singleton_id_check" CHECK("task"."id" = 1) @@ -15,33 +14,33 @@
CREATE TABLE `task` ( CREATE TABLE `task_runtime` ( `id` integer PRIMARY KEY NOT NULL, `active_sandbox_id` text, - `active_session_id` text, `active_switch_target` text, `active_cwd` text, - `status_message` text, + `git_state_json` text, + `git_state_updated_at` integer, `updated_at` integer NOT NULL, CONSTRAINT "task_runtime_singleton_id_check" CHECK("task_runtime"."id" = 1) ); --> statement-breakpoint CREATE TABLE `task_sandboxes` ( `sandbox_id` text PRIMARY KEY NOT NULL, - `provider_id` text NOT NULL, + `sandbox_provider_id` text NOT NULL, `sandbox_actor_id` text, `switch_target` text NOT NULL, `cwd` text, - `status_message` text, `created_at` integer NOT NULL, `updated_at` integer NOT NULL ); --> statement-breakpoint -CREATE TABLE `task_workbench_sessions` ( +CREATE TABLE `task_workspace_sessions` ( `session_id` text PRIMARY KEY NOT NULL, + `sandbox_session_id` text, `session_name` text NOT NULL, `model` text NOT NULL, - `unread` integer DEFAULT 0 NOT NULL, - `draft_text` text DEFAULT '' NOT NULL, - `draft_attachments_json` text DEFAULT '[]' NOT NULL, - `draft_updated_at` integer, + `status` text DEFAULT 'ready' NOT NULL, + `error_message` text, + `transcript_json` text DEFAULT '[]' NOT NULL, + `transcript_updated_at` integer, `created` integer DEFAULT 1 NOT NULL, `closed` integer DEFAULT 0 NOT NULL, `thinking_since_ms` integer, diff --git a/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json b/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json index b8a5879..7397b89 100644 --- a/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json +++ b/foundry/packages/backend/src/actors/task/db/drizzle/meta/0000_snapshot.json @@ -35,8 +35,8 @@ "notNull": true, "autoincrement": false }, - "provider_id": { - "name": "provider_id", + "sandbox_provider_id": { + "name": "sandbox_provider_id", "type": "text", "primaryKey": false, "notNull": true, @@ -49,21 +49,12 @@ "notNull": true, "autoincrement": false }, 
- "agent_type": { - "name": "agent_type", + "pull_request_json": { + "name": "pull_request_json", "type": "text", "primaryKey": false, "notNull": false, - "autoincrement": false, - "default": "'claude'" - }, - "pr_submitted": { - "name": "pr_submitted", - "type": "integer", - "primaryKey": false, - "notNull": false, - "autoincrement": false, - "default": 0 + "autoincrement": false }, "created_at": { "name": "created_at", @@ -108,13 +99,6 @@ "notNull": false, "autoincrement": false }, - "active_session_id": { - "name": "active_session_id", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, "active_switch_target": { "name": "active_switch_target", "type": "text", @@ -129,13 +113,20 @@ "notNull": false, "autoincrement": false }, - "status_message": { - "name": "status_message", + "git_state_json": { + "name": "git_state_json", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false }, + "git_state_updated_at": { + "name": "git_state_updated_at", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, "updated_at": { "name": "updated_at", "type": "integer", @@ -165,8 +156,8 @@ "notNull": true, "autoincrement": false }, - "provider_id": { - "name": "provider_id", + "sandbox_provider_id": { + "name": "sandbox_provider_id", "type": "text", "primaryKey": false, "notNull": true, @@ -193,13 +184,6 @@ "notNull": false, "autoincrement": false }, - "status_message": { - "name": "status_message", - "type": "text", - "primaryKey": false, - "notNull": false, - "autoincrement": false - }, "created_at": { "name": "created_at", "type": "integer", @@ -221,8 +205,8 @@ "uniqueConstraints": {}, "checkConstraints": {} }, - "task_workbench_sessions": { - "name": "task_workbench_sessions", + "task_workspace_sessions": { + "name": "task_workspace_sessions", "columns": { "session_id": { "name": "session_id", @@ -231,6 +215,13 @@ "notNull": true, "autoincrement": false }, + "sandbox_session_id": 
{ + "name": "sandbox_session_id", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, "session_name": { "name": "session_name", "type": "text", @@ -245,32 +236,31 @@ "notNull": true, "autoincrement": false }, - "unread": { - "name": "unread", - "type": "integer", - "primaryKey": false, - "notNull": true, - "autoincrement": false, - "default": 0 - }, - "draft_text": { - "name": "draft_text", + "status": { + "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, - "default": "''" + "default": "'ready'" }, - "draft_attachments_json": { - "name": "draft_attachments_json", + "error_message": { + "name": "error_message", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "transcript_json": { + "name": "transcript_json", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'[]'" }, - "draft_updated_at": { - "name": "draft_updated_at", + "transcript_updated_at": { + "name": "transcript_updated_at", "type": "integer", "primaryKey": false, "notNull": false, diff --git a/foundry/packages/backend/src/actors/task/db/migrations.ts b/foundry/packages/backend/src/actors/task/db/migrations.ts index dc3193e..61b0dff 100644 --- a/foundry/packages/backend/src/actors/task/db/migrations.ts +++ b/foundry/packages/backend/src/actors/task/db/migrations.ts @@ -12,8 +12,8 @@ const journal = { }, { idx: 1, - when: 1773810000000, - tag: "0001_sandbox_provider_columns", + when: 1773984000000, + tag: "0001_add_task_owner", breakpoints: true, }, ], @@ -27,10 +27,9 @@ export default { \`branch_name\` text, \`title\` text, \`task\` text NOT NULL, - \`provider_id\` text NOT NULL, + \`sandbox_provider_id\` text NOT NULL, \`status\` text NOT NULL, - \`agent_type\` text DEFAULT 'claude', - \`pr_submitted\` integer DEFAULT 0, + \`pull_request_json\` text, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL, CONSTRAINT 
"task_singleton_id_check" CHECK("task"."id" = 1) @@ -39,43 +38,49 @@ export default { CREATE TABLE \`task_runtime\` ( \`id\` integer PRIMARY KEY NOT NULL, \`active_sandbox_id\` text, - \`active_session_id\` text, \`active_switch_target\` text, \`active_cwd\` text, - \`status_message\` text, + \`git_state_json\` text, + \`git_state_updated_at\` integer, \`updated_at\` integer NOT NULL, CONSTRAINT "task_runtime_singleton_id_check" CHECK("task_runtime"."id" = 1) ); --> statement-breakpoint CREATE TABLE \`task_sandboxes\` ( \`sandbox_id\` text PRIMARY KEY NOT NULL, - \`provider_id\` text NOT NULL, + \`sandbox_provider_id\` text NOT NULL, \`sandbox_actor_id\` text, \`switch_target\` text NOT NULL, \`cwd\` text, - \`status_message\` text, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); --> statement-breakpoint -CREATE TABLE \`task_workbench_sessions\` ( +CREATE TABLE \`task_workspace_sessions\` ( \`session_id\` text PRIMARY KEY NOT NULL, + \`sandbox_session_id\` text, \`session_name\` text NOT NULL, \`model\` text NOT NULL, - \`unread\` integer DEFAULT 0 NOT NULL, - \`draft_text\` text DEFAULT '' NOT NULL, - \`draft_attachments_json\` text DEFAULT '[]' NOT NULL, - \`draft_updated_at\` integer, + \`status\` text DEFAULT 'ready' NOT NULL, + \`error_message\` text, + \`transcript_json\` text DEFAULT '[]' NOT NULL, + \`transcript_updated_at\` integer, \`created\` integer DEFAULT 1 NOT NULL, \`closed\` integer DEFAULT 0 NOT NULL, \`thinking_since_ms\` integer, -\`created_at\` integer NOT NULL, + \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL ); `, - m0001: `ALTER TABLE \`task\` RENAME COLUMN \`provider_id\` TO \`sandbox_provider_id\`; ---> statement-breakpoint -ALTER TABLE \`task_sandboxes\` RENAME COLUMN \`provider_id\` TO \`sandbox_provider_id\`; + m0001: `CREATE TABLE \`task_owner\` ( + \`id\` integer PRIMARY KEY NOT NULL, + \`primary_user_id\` text, + \`primary_github_login\` text, + \`primary_github_email\` text, + 
\`primary_github_avatar_url\` text, + \`updated_at\` integer NOT NULL, + CONSTRAINT "task_owner_singleton_id_check" CHECK("task_owner"."id" = 1) +); `, } as const, }; diff --git a/foundry/packages/backend/src/actors/task/db/schema.ts b/foundry/packages/backend/src/actors/task/db/schema.ts index 889aa31..bdb7cf7 100644 --- a/foundry/packages/backend/src/actors/task/db/schema.ts +++ b/foundry/packages/backend/src/actors/task/db/schema.ts @@ -11,8 +11,7 @@ export const task = sqliteTable( task: text("task").notNull(), sandboxProviderId: text("sandbox_provider_id").notNull(), status: text("status").notNull(), - agentType: text("agent_type").default("claude"), - prSubmitted: integer("pr_submitted").default(0), + pullRequestJson: text("pull_request_json"), createdAt: integer("created_at").notNull(), updatedAt: integer("updated_at").notNull(), }, @@ -24,14 +23,10 @@ export const taskRuntime = sqliteTable( { id: integer("id").primaryKey(), activeSandboxId: text("active_sandbox_id"), - activeSessionId: text("active_session_id"), activeSwitchTarget: text("active_switch_target"), activeCwd: text("active_cwd"), - statusMessage: text("status_message"), gitStateJson: text("git_state_json"), gitStateUpdatedAt: integer("git_state_updated_at"), - provisionStage: text("provision_stage"), - provisionStageUpdatedAt: integer("provision_stage_updated_at"), updatedAt: integer("updated_at").notNull(), }, (table) => [check("task_runtime_singleton_id_check", sql`${table.id} = 1`)], @@ -48,18 +43,35 @@ export const taskSandboxes = sqliteTable("task_sandboxes", { sandboxActorId: text("sandbox_actor_id"), switchTarget: text("switch_target").notNull(), cwd: text("cwd"), - statusMessage: text("status_message"), createdAt: integer("created_at").notNull(), updatedAt: integer("updated_at").notNull(), }); /** - * Coordinator index of workbench sessions within this task. + * Single-row table tracking the primary user (owner) of this task. 
+ * The owner's GitHub OAuth credentials are injected into the sandbox + * for git operations. Updated when a different user sends a message. + */ +export const taskOwner = sqliteTable( + "task_owner", + { + id: integer("id").primaryKey(), + primaryUserId: text("primary_user_id"), + primaryGithubLogin: text("primary_github_login"), + primaryGithubEmail: text("primary_github_email"), + primaryGithubAvatarUrl: text("primary_github_avatar_url"), + updatedAt: integer("updated_at").notNull(), + }, + (table) => [check("task_owner_singleton_id_check", sql`${table.id} = 1`)], +); + +/** + * Coordinator index of workspace sessions within this task. * The task actor is the coordinator for sessions. Each row holds session * metadata, model, status, transcript, and draft state. Sessions are * sub-entities of the task — no separate session actor in the DB. */ -export const taskWorkbenchSessions = sqliteTable("task_workbench_sessions", { +export const taskWorkspaceSessions = sqliteTable("task_workspace_sessions", { sessionId: text("session_id").notNull().primaryKey(), sandboxSessionId: text("sandbox_session_id"), sessionName: text("session_name").notNull(), @@ -68,11 +80,6 @@ export const taskWorkbenchSessions = sqliteTable("task_workbench_sessions", { errorMessage: text("error_message"), transcriptJson: text("transcript_json").notNull().default("[]"), transcriptUpdatedAt: integer("transcript_updated_at"), - unread: integer("unread").notNull().default(0), - draftText: text("draft_text").notNull().default(""), - // Structured by the workbench composer attachment payload format. 
- draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"), - draftUpdatedAt: integer("draft_updated_at"), created: integer("created").notNull().default(1), closed: integer("closed").notNull().default(0), thinkingSinceMs: integer("thinking_since_ms"), diff --git a/foundry/packages/backend/src/actors/task/index.ts b/foundry/packages/backend/src/actors/task/index.ts index f2b9e51..68bee1c 100644 --- a/foundry/packages/backend/src/actors/task/index.ts +++ b/foundry/packages/backend/src/actors/task/index.ts @@ -1,116 +1,31 @@ import { actor, queue } from "rivetkit"; import { workflow } from "rivetkit/workflow"; -import type { - AgentType, - TaskRecord, - TaskWorkbenchChangeModelInput, - TaskWorkbenchRenameInput, - TaskWorkbenchRenameSessionInput, - TaskWorkbenchSetSessionUnreadInput, - TaskWorkbenchSendMessageInput, - TaskWorkbenchUpdateDraftInput, - SandboxProviderId, -} from "@sandbox-agent/foundry-shared"; -import { expectQueueResponse } from "../../services/queue.js"; -import { selfTask } from "../handles.js"; +import type { TaskRecord } from "@sandbox-agent/foundry-shared"; import { taskDb } from "./db/db.js"; import { getCurrentRecord } from "./workflow/common.js"; import { - changeWorkbenchModel, - closeWorkbenchSession, - createWorkbenchSession, + changeWorkspaceModel, getSessionDetail, getTaskDetail, getTaskSummary, - markWorkbenchUnread, - publishWorkbenchPr, - renameWorkbenchBranch, - renameWorkbenchTask, - renameWorkbenchSession, - revertWorkbenchFile, - sendWorkbenchMessage, - syncWorkbenchSessionStatus, - setWorkbenchSessionUnread, - stopWorkbenchSession, - updateWorkbenchDraft, -} from "./workbench.js"; -import { TASK_QUEUE_NAMES, taskWorkflowQueueName, runTaskWorkflow } from "./workflow/index.js"; + markWorkspaceUnread, + refreshWorkspaceDerivedState, + refreshWorkspaceSessionTranscript, + renameWorkspaceSession, + renameWorkspaceTask, + selectWorkspaceSession, + setWorkspaceSessionUnread, + syncTaskPullRequest, + 
syncWorkspaceSessionStatus, + updateWorkspaceDraft, +} from "./workspace.js"; +import { runTaskWorkflow } from "./workflow/index.js"; +import { TASK_QUEUE_NAMES } from "./workflow/queue.js"; export interface TaskInput { organizationId: string; repoId: string; taskId: string; - repoRemote: string; - branchName: string | null; - title: string | null; - task: string; - sandboxProviderId: SandboxProviderId; - agentType: AgentType | null; - explicitTitle: string | null; - explicitBranchName: string | null; - initialPrompt: string | null; -} - -interface InitializeCommand { - sandboxProviderId?: SandboxProviderId; -} - -interface TaskActionCommand { - reason?: string; -} - -interface TaskSessionCommand { - sessionId: string; -} - -interface TaskStatusSyncCommand { - sessionId: string; - status: "running" | "idle" | "error"; - at: number; -} - -interface TaskWorkbenchValueCommand { - value: string; -} - -interface TaskWorkbenchSessionTitleCommand { - sessionId: string; - title: string; -} - -interface TaskWorkbenchSessionUnreadCommand { - sessionId: string; - unread: boolean; -} - -interface TaskWorkbenchUpdateDraftCommand { - sessionId: string; - text: string; - attachments: Array<unknown>; -} - -interface TaskWorkbenchChangeModelCommand { - sessionId: string; - model: string; -} - -interface TaskWorkbenchSendMessageCommand { - sessionId: string; - text: string; - attachments: Array<unknown>; -} - -interface TaskWorkbenchCreateSessionCommand { - model?: string; -} - -interface TaskWorkbenchCreateSessionAndSendCommand { - model?: string; - text: string; -} - -interface TaskWorkbenchSessionCommand { - sessionId: string; } export const task = actor({ @@ -119,275 +34,66 @@ options: { name: "Task", icon: "wrench", - actionTimeout: 5 * 60_000, + actionTimeout: 10 * 60_000, }, createState: (_c, input: TaskInput) => ({ organizationId: input.organizationId, repoId: input.repoId, taskId: input.taskId, - repoRemote: input.repoRemote, - branchName: input.branchName, -
title: input.title, - task: input.task, - sandboxProviderId: input.sandboxProviderId, - agentType: input.agentType, - explicitTitle: input.explicitTitle, - explicitBranchName: input.explicitBranchName, - initialPrompt: input.initialPrompt, - initialized: false, - previousStatus: null as string | null, }), actions: { - async initialize(c, cmd: InitializeCommand): Promise<TaskRecord> { - const self = selfTask(c); - const result = await self.send(taskWorkflowQueueName("task.command.initialize"), cmd ?? {}, { - wait: true, - timeout: 10_000, - }); - return expectQueueResponse<TaskRecord>(result); - }, - - async provision(c, cmd: InitializeCommand): Promise<{ ok: true }> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.provision"), cmd ?? {}, { - wait: false, - }); - return { ok: true }; - }, - - async attach(c, cmd?: TaskActionCommand): Promise<{ target: string; sessionId: string | null }> { - const self = selfTask(c); - const result = await self.send(taskWorkflowQueueName("task.command.attach"), cmd ?? {}, { - wait: true, - timeout: 10_000, - }); - return expectQueueResponse<{ target: string; sessionId: string | null }>(result); - }, - - async switch(c): Promise<{ switchTarget: string }> { - const self = selfTask(c); - const result = await self.send( - taskWorkflowQueueName("task.command.switch"), - {}, - { - wait: true, - timeout: 10_000, - }, - ); - return expectQueueResponse<{ switchTarget: string }>(result); - }, - - async push(c, cmd?: TaskActionCommand): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.push"), cmd ?? {}, { - wait: false, - }); - }, - - async sync(c, cmd?: TaskActionCommand): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.sync"), cmd ?? {}, { - wait: false, - }); - }, - - async merge(c, cmd?: TaskActionCommand): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.merge"), cmd ??
{}, { - wait: false, - }); - }, - - async archive(c, cmd?: TaskActionCommand): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.archive"), cmd ?? {}, { - wait: false, - }); - }, - - async kill(c, cmd?: TaskActionCommand): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.kill"), cmd ?? {}, { - wait: false, - }); - }, - async get(c): Promise<TaskRecord> { - return await getCurrentRecord({ db: c.db, state: c.state }); + return await getCurrentRecord(c); }, async getTaskSummary(c) { return await getTaskSummary(c); }, - async getTaskDetail(c) { - return await getTaskDetail(c); + async getTaskDetail(c, input?: { authSessionId?: string }) { + return await getTaskDetail(c, input?.authSessionId); }, - async getSessionDetail(c, input: { sessionId: string }) { - return await getSessionDetail(c, input.sessionId); + async getSessionDetail(c, input: { sessionId: string; authSessionId?: string }) { + return await getSessionDetail(c, input.sessionId, input.authSessionId); }, - async markWorkbenchUnread(c): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.mark_unread"), - {}, - { - wait: true, - timeout: 10_000, - }, - ); + // Direct actions migrated from queue: + async markUnread(c, input: { authSessionId?: string }) { + await markWorkspaceUnread(c, input?.authSessionId); }, - - async renameWorkbenchTask(c, input: TaskWorkbenchRenameInput): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.rename_task"), { value: input.value } satisfies TaskWorkbenchValueCommand, { - wait: true, - timeout: 20_000, - }); + async renameTask(c, input: { value: string }) { + await renameWorkspaceTask(c, input.value); }, - - async renameWorkbenchBranch(c, input: TaskWorkbenchRenameInput): Promise<void> { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.rename_branch"), { value:
input.value } satisfies TaskWorkbenchValueCommand, { - wait: false, - }); + async renameSession(c, input: { sessionId: string; title: string }) { + await renameWorkspaceSession(c, input.sessionId, input.title); }, - - async createWorkbenchSession(c, input?: { model?: string }): Promise<{ sessionId: string }> { - const self = selfTask(c); - const result = await self.send( - taskWorkflowQueueName("task.command.workbench.create_session"), - { ...(input?.model ? { model: input.model } : {}) } satisfies TaskWorkbenchCreateSessionCommand, - { - wait: true, - timeout: 10_000, - }, - ); - return expectQueueResponse<{ sessionId: string }>(result); + async selectSession(c, input: { sessionId: string; authSessionId?: string }) { + await selectWorkspaceSession(c, input.sessionId, input?.authSessionId); }, - - /** - * Fire-and-forget: creates a workbench session and sends the initial message. - * Used by createWorkbenchTask so the caller doesn't block on session creation. - */ - async createWorkbenchSessionAndSend(c, input: { model?: string; text: string }): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.create_session_and_send"), - { model: input.model, text: input.text } satisfies TaskWorkbenchCreateSessionAndSendCommand, - { wait: false }, - ); + async setSessionUnread(c, input: { sessionId: string; unread: boolean; authSessionId?: string }) { + await setWorkspaceSessionUnread(c, input.sessionId, input.unread, input?.authSessionId); }, - - async renameWorkbenchSession(c, input: TaskWorkbenchRenameSessionInput): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.rename_session"), - { sessionId: input.sessionId, title: input.title } satisfies TaskWorkbenchSessionTitleCommand, - { - wait: true, - timeout: 10_000, - }, - ); + async updateDraft(c, input: { sessionId: string; text: string; attachments: any[]; authSessionId?: string }) { + await updateWorkspaceDraft(c,
input.sessionId, input.text, input.attachments, input?.authSessionId); }, - - async setWorkbenchSessionUnread(c, input: TaskWorkbenchSetSessionUnreadInput): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.set_session_unread"), - { sessionId: input.sessionId, unread: input.unread } satisfies TaskWorkbenchSessionUnreadCommand, - { - wait: true, - timeout: 10_000, - }, - ); + async changeModel(c, input: { sessionId: string; model: string; authSessionId?: string }) { + await changeWorkspaceModel(c, input.sessionId, input.model, input?.authSessionId); }, - - async updateWorkbenchDraft(c, input: TaskWorkbenchUpdateDraftInput): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.update_draft"), - { - sessionId: input.sessionId, - text: input.text, - attachments: input.attachments, - } satisfies TaskWorkbenchUpdateDraftCommand, - { - wait: false, - }, - ); + async refreshSessionTranscript(c, input: { sessionId: string }) { + await refreshWorkspaceSessionTranscript(c, input.sessionId); }, - - async changeWorkbenchModel(c, input: TaskWorkbenchChangeModelInput): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.change_model"), - { sessionId: input.sessionId, model: input.model } satisfies TaskWorkbenchChangeModelCommand, - { - wait: true, - timeout: 10_000, - }, - ); + async refreshDerived(c) { + await refreshWorkspaceDerivedState(c); }, - - async sendWorkbenchMessage(c, input: TaskWorkbenchSendMessageInput): Promise<void> { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.send_message"), - { - sessionId: input.sessionId, - text: input.text, - attachments: input.attachments, - } satisfies TaskWorkbenchSendMessageCommand, - { - wait: false, - }, - ); + async syncSessionStatus(c, input: { sessionId: string; status: "running" | "idle" | "error"; at: number }) { + await
syncWorkspaceSessionStatus(c, input.sessionId, input.status, input.at); }, - - async stopWorkbenchSession(c, input: TaskSessionCommand): Promise { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.stop_session"), { sessionId: input.sessionId } satisfies TaskWorkbenchSessionCommand, { - wait: false, - }); - }, - - async syncWorkbenchSessionStatus(c, input: TaskStatusSyncCommand): Promise { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.sync_session_status"), input, { - wait: true, - timeout: 20_000, - }); - }, - - async closeWorkbenchSession(c, input: TaskSessionCommand): Promise { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.close_session"), { sessionId: input.sessionId } satisfies TaskWorkbenchSessionCommand, { - wait: false, - }); - }, - - async publishWorkbenchPr(c): Promise { - const self = selfTask(c); - await self.send( - taskWorkflowQueueName("task.command.workbench.publish_pr"), - {}, - { - wait: false, - }, - ); - }, - - async revertWorkbenchFile(c, input: { path: string }): Promise { - const self = selfTask(c); - await self.send(taskWorkflowQueueName("task.command.workbench.revert_file"), input, { - wait: false, - }); + async syncPullRequest(c, input: { pullRequest: any }) { + await syncTaskPullRequest(c, input?.pullRequest ?? 
null); }, }, run: workflow(runTaskWorkflow), }); -export { TASK_QUEUE_NAMES }; +export { taskWorkflowQueueName } from "./workflow/index.js"; diff --git a/foundry/packages/backend/src/actors/task/workflow/commands.ts b/foundry/packages/backend/src/actors/task/workflow/commands.ts index d03ade1..7ba2d2b 100644 --- a/foundry/packages/backend/src/actors/task/workflow/commands.ts +++ b/foundry/packages/backend/src/actors/task/workflow/commands.ts @@ -2,8 +2,8 @@ import { eq } from "drizzle-orm"; import { getTaskSandbox } from "../../handles.js"; import { logActorWarning, resolveErrorMessage } from "../../logging.js"; -import { task as taskTable, taskRuntime } from "../db/schema.js"; -import { TASK_ROW_ID, appendHistory, getCurrentRecord, setTaskState } from "./common.js"; +import { task as taskTable } from "../db/schema.js"; +import { TASK_ROW_ID, appendAuditLog, getCurrentRecord, setTaskState } from "./common.js"; import { pushActiveBranchActivity } from "./push.js"; async function withTimeout(promise: Promise, timeoutMs: number, label: string): Promise { @@ -25,6 +25,7 @@ async function withTimeout(promise: Promise, timeoutMs: number, label: str export async function handleAttachActivity(loopCtx: any, msg: any): Promise { const record = await getCurrentRecord(loopCtx); let target = record.sandboxes.find((sandbox: any) => sandbox.sandboxId === record.activeSandboxId)?.switchTarget ?? ""; + const sessionId = msg.body?.sessionId ?? null; if (record.activeSandboxId) { try { @@ -38,14 +39,14 @@ export async function handleAttachActivity(loopCtx: any, msg: any): Promise await msg.complete({ ok: true }); } -export async function handleSimpleCommandActivity(loopCtx: any, msg: any, statusMessage: string, historyKind: string): Promise { - const db = loopCtx.db; - await db.update(taskRuntime).set({ statusMessage, updatedAt: Date.now() }).where(eq(taskRuntime.id, TASK_ROW_ID)).run(); - - await appendHistory(loopCtx, historyKind, { reason: msg.body?.reason ?? 
null }); +export async function handleSimpleCommandActivity(loopCtx: any, msg: any, historyKind: string): Promise { + await appendAuditLog(loopCtx, historyKind, { reason: msg.body?.reason ?? null }); await msg.complete({ ok: true }); } export async function handleArchiveActivity(loopCtx: any, msg: any): Promise { - await setTaskState(loopCtx, "archive_stop_status_sync", "stopping status sync"); + await setTaskState(loopCtx, "archive_stop_status_sync"); const record = await getCurrentRecord(loopCtx); if (record.activeSandboxId) { - await setTaskState(loopCtx, "archive_release_sandbox", "releasing sandbox"); + await setTaskState(loopCtx, "archive_release_sandbox"); void withTimeout(getTaskSandbox(loopCtx, loopCtx.state.organizationId, record.activeSandboxId).destroy(), 45_000, "sandbox destroy").catch((error) => { logActorWarning("task.commands", "failed to release sandbox during archive", { organizationId: loopCtx.state.organizationId, @@ -90,17 +88,15 @@ export async function handleArchiveActivity(loopCtx: any, msg: any): Promise { - await setTaskState(loopCtx, "kill_destroy_sandbox", "destroying sandbox"); + await setTaskState(loopCtx, "kill_destroy_sandbox"); const record = await getCurrentRecord(loopCtx); if (!record.activeSandboxId) { return; @@ -110,13 +106,11 @@ export async function killDestroySandboxActivity(loopCtx: any): Promise { } export async function killWriteDbActivity(loopCtx: any, msg: any): Promise { - await setTaskState(loopCtx, "kill_finalize", "finalizing kill"); + await setTaskState(loopCtx, "kill_finalize"); const db = loopCtx.db; await db.update(taskTable).set({ status: "killed", updatedAt: Date.now() }).where(eq(taskTable.id, TASK_ROW_ID)).run(); - await db.update(taskRuntime).set({ statusMessage: "killed", updatedAt: Date.now() }).where(eq(taskRuntime.id, TASK_ROW_ID)).run(); - - await appendHistory(loopCtx, "task.kill", { reason: msg.body?.reason ?? null }); + await appendAuditLog(loopCtx, "task.kill", { reason: msg.body?.reason ?? 
null }); await msg.complete({ ok: true }); } diff --git a/foundry/packages/backend/src/actors/task/workflow/common.ts b/foundry/packages/backend/src/actors/task/workflow/common.ts index ae1e8dd..cbe63e6 100644 --- a/foundry/packages/backend/src/actors/task/workflow/common.ts +++ b/foundry/packages/backend/src/actors/task/workflow/common.ts @@ -2,8 +2,10 @@ import { eq } from "drizzle-orm"; import type { TaskRecord, TaskStatus } from "@sandbox-agent/foundry-shared"; import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js"; -import { historyKey } from "../../keys.js"; -import { broadcastTaskUpdate } from "../workbench.js"; +import { getOrCreateAuditLog, getOrCreateOrganization } from "../../handles.js"; +import { broadcastTaskUpdate } from "../workspace.js"; +import { getActorRuntimeContext } from "../../context.js"; +import { defaultSandboxProviderId } from "../../../sandbox-config.js"; export const TASK_ROW_ID = 1; @@ -56,50 +58,32 @@ export function buildAgentPrompt(task: string): string { return task.trim(); } -export async function setTaskState(ctx: any, status: TaskStatus, statusMessage?: string): Promise { +export async function setTaskState(ctx: any, status: TaskStatus): Promise { const now = Date.now(); const db = ctx.db; await db.update(taskTable).set({ status, updatedAt: now }).where(eq(taskTable.id, TASK_ROW_ID)).run(); - if (statusMessage != null) { - await db - .insert(taskRuntime) - .values({ - id: TASK_ROW_ID, - activeSandboxId: null, - activeSessionId: null, - activeSwitchTarget: null, - activeCwd: null, - statusMessage, - updatedAt: now, - }) - .onConflictDoUpdate({ - target: taskRuntime.id, - set: { - statusMessage, - updatedAt: now, - }, - }) - .run(); - } - await broadcastTaskUpdate(ctx); } +/** + * Read the task's current record from its local SQLite DB. 
+ * If the task actor was lazily created (virtual task from PR sync) and has no + * DB rows yet, auto-initializes by reading branch/title from the org actor's + * getTaskIndexEntry. This is the self-initialization path for lazy task actors. + */ export async function getCurrentRecord(ctx: any): Promise { const db = ctx.db; - const row = await db + const organization = await getOrCreateOrganization(ctx, ctx.state.organizationId); + let row = await db .select({ branchName: taskTable.branchName, title: taskTable.title, task: taskTable.task, sandboxProviderId: taskTable.sandboxProviderId, status: taskTable.status, - statusMessage: taskRuntime.statusMessage, + pullRequestJson: taskTable.pullRequestJson, activeSandboxId: taskRuntime.activeSandboxId, - activeSessionId: taskRuntime.activeSessionId, - agentType: taskTable.agentType, - prSubmitted: taskTable.prSubmitted, createdAt: taskTable.createdAt, updatedAt: taskTable.updatedAt, }) @@ -109,7 +93,58 @@ export async function getCurrentRecord(ctx: any): Promise { .get(); if (!row) { - throw new Error(`Task not found: ${ctx.state.taskId}`); + // Virtual task — auto-initialize from org actor's task index data + let branchName: string | null = null; + let title = "Untitled"; + try { + const entry = await organization.getTaskIndexEntry({ taskId: ctx.state.taskId }); + branchName = entry?.branchName ?? null; + title = entry?.title ?? 
title; + } catch {} + + const { config } = getActorRuntimeContext(); + const { initBootstrapDbActivity, initCompleteActivity } = await import("./init.js"); + await initBootstrapDbActivity(ctx, { + sandboxProviderId: defaultSandboxProviderId(config), + branchName, + title, + task: title, + }); + await initCompleteActivity(ctx, { sandboxProviderId: defaultSandboxProviderId(config) }); + + // Re-read the row after initialization + const initialized = await db + .select({ + branchName: taskTable.branchName, + title: taskTable.title, + task: taskTable.task, + sandboxProviderId: taskTable.sandboxProviderId, + status: taskTable.status, + pullRequestJson: taskTable.pullRequestJson, + activeSandboxId: taskRuntime.activeSandboxId, + createdAt: taskTable.createdAt, + updatedAt: taskTable.updatedAt, + }) + .from(taskTable) + .leftJoin(taskRuntime, eq(taskTable.id, taskRuntime.id)) + .where(eq(taskTable.id, TASK_ROW_ID)) + .get(); + + if (!initialized) { + throw new Error(`Task not found after initialization: ${ctx.state.taskId}`); + } + + row = initialized; + } + + const repositoryMetadata = await organization.getRepositoryMetadata({ repoId: ctx.state.repoId }); + let pullRequest = null; + if (row.pullRequestJson) { + try { + pullRequest = JSON.parse(row.pullRequestJson); + } catch { + pullRequest = null; + } } const sandboxes = await db @@ -128,16 +163,15 @@ export async function getCurrentRecord(ctx: any): Promise { return { organizationId: ctx.state.organizationId, repoId: ctx.state.repoId, - repoRemote: ctx.state.repoRemote, + repoRemote: repositoryMetadata.remoteUrl, taskId: ctx.state.taskId, branchName: row.branchName, title: row.title, task: row.task, sandboxProviderId: row.sandboxProviderId, status: row.status, - statusMessage: row.statusMessage ?? null, activeSandboxId: row.activeSandboxId ?? null, - activeSessionId: row.activeSessionId ?? 
null, + pullRequest, sandboxes: sandboxes.map((sb) => ({ sandboxId: sb.sandboxId, sandboxProviderId: sb.sandboxProviderId, @@ -147,31 +181,19 @@ export async function getCurrentRecord(ctx: any): Promise { createdAt: sb.createdAt, updatedAt: sb.updatedAt, })), - agentType: row.agentType ?? null, - prSubmitted: Boolean(row.prSubmitted), - diffStat: null, - hasUnpushed: null, - conflictsWithMain: null, - parentBranch: null, - prUrl: null, - prAuthor: null, - ciStatus: null, - reviewStatus: null, - reviewer: null, createdAt: row.createdAt, updatedAt: row.updatedAt, } as TaskRecord; } -export async function appendHistory(ctx: any, kind: string, payload: Record): Promise { - const client = ctx.client(); - const history = await client.history.getOrCreate(historyKey(ctx.state.organizationId, ctx.state.repoId), { - createWithInput: { organizationId: ctx.state.organizationId, repoId: ctx.state.repoId }, - }); - await history.append({ +export async function appendAuditLog(ctx: any, kind: string, payload: Record): Promise { + const row = await ctx.db.select({ branchName: taskTable.branchName }).from(taskTable).where(eq(taskTable.id, TASK_ROW_ID)).get(); + const auditLog = await getOrCreateAuditLog(ctx, ctx.state.organizationId); + void auditLog.append({ kind, + repoId: ctx.state.repoId, taskId: ctx.state.taskId, - branchName: ctx.state.branchName, + branchName: row?.branchName ?? null, payload, }); diff --git a/foundry/packages/backend/src/actors/task/workflow/index.ts b/foundry/packages/backend/src/actors/task/workflow/index.ts index f6ffd10..75b2da3 100644 --- a/foundry/packages/backend/src/actors/task/workflow/index.ts +++ b/foundry/packages/backend/src/actors/task/workflow/index.ts @@ -1,162 +1,102 @@ +// @ts-nocheck +/** + * Task workflow — queue-based command loop. + * + * Mutations are dispatched through named queues and processed inside the + * workflow command loop so that every command appears in the RivetKit + * inspector's workflow history. 
Read actions remain direct (no queue). + * + * Callers send commands directly via `.send(taskWorkflowQueueName(...), ...)`. + */ import { Loop } from "rivetkit/workflow"; import { logActorWarning, resolveErrorMessage } from "../../logging.js"; +import { TASK_QUEUE_NAMES, type TaskQueueName, taskWorkflowQueueName } from "./queue.js"; import { getCurrentRecord } from "./common.js"; import { initBootstrapDbActivity, initCompleteActivity, initEnqueueProvisionActivity, initFailedActivity } from "./init.js"; import { handleArchiveActivity, handleAttachActivity, - handleGetActivity, handlePushActivity, handleSimpleCommandActivity, handleSwitchActivity, killDestroySandboxActivity, killWriteDbActivity, } from "./commands.js"; -import { TASK_QUEUE_NAMES } from "./queue.js"; import { - changeWorkbenchModel, - closeWorkbenchSession, - createWorkbenchSession, - ensureWorkbenchSession, - refreshWorkbenchDerivedState, - refreshWorkbenchSessionTranscript, - markWorkbenchUnread, - publishWorkbenchPr, - renameWorkbenchBranch, - renameWorkbenchTask, - renameWorkbenchSession, - revertWorkbenchFile, - sendWorkbenchMessage, - setWorkbenchSessionUnread, - stopWorkbenchSession, - syncWorkbenchSessionStatus, - updateWorkbenchDraft, -} from "../workbench.js"; + changeTaskOwnerManually, + closeWorkspaceSession, + createWorkspaceSession, + ensureWorkspaceSession, + publishWorkspacePr, + revertWorkspaceFile, + sendWorkspaceMessage, + stopWorkspaceSession, +} from "../workspace.js"; -export { TASK_QUEUE_NAMES, taskWorkflowQueueName } from "./queue.js"; +export { taskWorkflowQueueName } from "./queue.js"; -type TaskQueueName = (typeof TASK_QUEUE_NAMES)[number]; +// --------------------------------------------------------------------------- +// Workflow command loop — runs inside `run: workflow(runTaskWorkflow)` +// --------------------------------------------------------------------------- -type WorkflowHandler = (loopCtx: any, msg: { name: TaskQueueName; body: any; complete: (response: unknown) 
=> Promise }) => Promise; +type WorkflowHandler = (loopCtx: any, msg: any) => Promise; -const commandHandlers: Record = { +const COMMAND_HANDLERS: Record = { "task.command.initialize": async (loopCtx, msg) => { - const body = msg.body; - - await loopCtx.step("init-bootstrap-db", async () => initBootstrapDbActivity(loopCtx, body)); - await loopCtx.step("init-enqueue-provision", async () => initEnqueueProvisionActivity(loopCtx, body)); - await loopCtx.removed("init-dispatch-provision-v2", "step"); - const currentRecord = await loopCtx.step("init-read-current-record", async () => getCurrentRecord(loopCtx)); - try { - await msg.complete(currentRecord); - } catch (error) { - logActorWarning("task.workflow", "initialize completion failed", { - error: resolveErrorMessage(error), - }); - } + await initBootstrapDbActivity(loopCtx, msg.body); + await initEnqueueProvisionActivity(loopCtx, msg.body); + const record = await getCurrentRecord(loopCtx); + await msg.complete(record); }, "task.command.provision": async (loopCtx, msg) => { - await loopCtx.removed("init-failed", "step"); - await loopCtx.removed("init-failed-v2", "step"); try { - await loopCtx.removed("init-ensure-name", "step"); - await loopCtx.removed("init-assert-name", "step"); - await loopCtx.removed("init-create-sandbox", "step"); - await loopCtx.removed("init-ensure-agent", "step"); - await loopCtx.removed("init-start-sandbox-instance", "step"); - await loopCtx.removed("init-expose-sandbox", "step"); - await loopCtx.removed("init-create-session", "step"); - await loopCtx.removed("init-write-db", "step"); - await loopCtx.removed("init-start-status-sync", "step"); - await loopCtx.step("init-complete", async () => initCompleteActivity(loopCtx, msg.body)); + await initCompleteActivity(loopCtx, msg.body); await msg.complete({ ok: true }); } catch (error) { - await loopCtx.step("init-failed-v3", async () => initFailedActivity(loopCtx, error)); - await msg.complete({ - ok: false, - error: resolveErrorMessage(error), - 
}); + await initFailedActivity(loopCtx, error, msg.body); + await msg.complete({ ok: false, error: resolveErrorMessage(error) }); } }, "task.command.attach": async (loopCtx, msg) => { - await loopCtx.step("handle-attach", async () => handleAttachActivity(loopCtx, msg)); + await handleAttachActivity(loopCtx, msg); }, "task.command.switch": async (loopCtx, msg) => { - await loopCtx.step("handle-switch", async () => handleSwitchActivity(loopCtx, msg)); + await handleSwitchActivity(loopCtx, msg); }, "task.command.push": async (loopCtx, msg) => { - await loopCtx.step("handle-push", async () => handlePushActivity(loopCtx, msg)); + await handlePushActivity(loopCtx, msg); }, "task.command.sync": async (loopCtx, msg) => { - await loopCtx.step("handle-sync", async () => handleSimpleCommandActivity(loopCtx, msg, "sync requested", "task.sync")); + await handleSimpleCommandActivity(loopCtx, msg, "task.sync"); }, "task.command.merge": async (loopCtx, msg) => { - await loopCtx.step("handle-merge", async () => handleSimpleCommandActivity(loopCtx, msg, "merge requested", "task.merge")); + await handleSimpleCommandActivity(loopCtx, msg, "task.merge"); }, "task.command.archive": async (loopCtx, msg) => { - await loopCtx.step("handle-archive", async () => handleArchiveActivity(loopCtx, msg)); + await handleArchiveActivity(loopCtx, msg); }, "task.command.kill": async (loopCtx, msg) => { - await loopCtx.step("kill-destroy-sandbox", async () => killDestroySandboxActivity(loopCtx)); - await loopCtx.step("kill-write-db", async () => killWriteDbActivity(loopCtx, msg)); + await killDestroySandboxActivity(loopCtx); + await killWriteDbActivity(loopCtx, msg); }, - "task.command.get": async (loopCtx, msg) => { - await loopCtx.step("handle-get", async () => handleGetActivity(loopCtx, msg)); + "task.command.workspace.create_session": async (loopCtx, msg) => { + const result = await createWorkspaceSession(loopCtx, msg.body?.model, msg.body?.authSessionId); + await msg.complete(result); }, - 
"task.command.workbench.mark_unread": async (loopCtx, msg) => { - await loopCtx.step("workbench-mark-unread", async () => markWorkbenchUnread(loopCtx)); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.rename_task": async (loopCtx, msg) => { - await loopCtx.step("workbench-rename-task", async () => renameWorkbenchTask(loopCtx, msg.body.value)); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.rename_branch": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-rename-branch", - timeout: 5 * 60_000, - run: async () => renameWorkbenchBranch(loopCtx, msg.body.value), - }); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.create_session": async (loopCtx, msg) => { + "task.command.workspace.create_session_and_send": async (loopCtx, msg) => { try { - const created = await loopCtx.step({ - name: "workbench-create-session", - timeout: 5 * 60_000, - run: async () => createWorkbenchSession(loopCtx, msg.body?.model), - }); - await msg.complete(created); - } catch (error) { - await msg.complete({ error: resolveErrorMessage(error) }); - } - }, - - "task.command.workbench.create_session_and_send": async (loopCtx, msg) => { - try { - const created = await loopCtx.step({ - name: "workbench-create-session-for-send", - timeout: 5 * 60_000, - run: async () => createWorkbenchSession(loopCtx, msg.body?.model), - }); - await loopCtx.step({ - name: "workbench-send-initial-message", - timeout: 5 * 60_000, - run: async () => sendWorkbenchMessage(loopCtx, created.sessionId, msg.body.text, []), - }); + const created = await createWorkspaceSession(loopCtx, msg.body?.model, msg.body?.authSessionId); + await sendWorkspaceMessage(loopCtx, created.sessionId, msg.body.text, [], msg.body?.authSessionId); } catch (error) { logActorWarning("task.workflow", "create_session_and_send failed", { error: resolveErrorMessage(error), @@ -165,103 +105,42 @@ const commandHandlers: Record = { await msg.complete({ ok: true }); }, - 
"task.command.workbench.ensure_session": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-ensure-session", - timeout: 5 * 60_000, - run: async () => ensureWorkbenchSession(loopCtx, msg.body.sessionId, msg.body?.model), - }); + "task.command.workspace.ensure_session": async (loopCtx, msg) => { + await ensureWorkspaceSession(loopCtx, msg.body.sessionId, msg.body?.model, msg.body?.authSessionId); await msg.complete({ ok: true }); }, - "task.command.workbench.rename_session": async (loopCtx, msg) => { - await loopCtx.step("workbench-rename-session", async () => renameWorkbenchSession(loopCtx, msg.body.sessionId, msg.body.title)); + "task.command.workspace.send_message": async (loopCtx, msg) => { + await sendWorkspaceMessage(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments, msg.body?.authSessionId); await msg.complete({ ok: true }); }, - "task.command.workbench.set_session_unread": async (loopCtx, msg) => { - await loopCtx.step("workbench-set-session-unread", async () => setWorkbenchSessionUnread(loopCtx, msg.body.sessionId, msg.body.unread)); + "task.command.workspace.stop_session": async (loopCtx, msg) => { + await stopWorkspaceSession(loopCtx, msg.body.sessionId); await msg.complete({ ok: true }); }, - "task.command.workbench.update_draft": async (loopCtx, msg) => { - await loopCtx.step("workbench-update-draft", async () => updateWorkbenchDraft(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments)); + "task.command.workspace.close_session": async (loopCtx, msg) => { + await closeWorkspaceSession(loopCtx, msg.body.sessionId, msg.body?.authSessionId); await msg.complete({ ok: true }); }, - "task.command.workbench.change_model": async (loopCtx, msg) => { - await loopCtx.step("workbench-change-model", async () => changeWorkbenchModel(loopCtx, msg.body.sessionId, msg.body.model)); + "task.command.workspace.publish_pr": async (loopCtx, msg) => { + await publishWorkspacePr(loopCtx); await msg.complete({ ok: true }); }, - 
"task.command.workbench.send_message": async (loopCtx, msg) => { - try { - await loopCtx.step({ - name: "workbench-send-message", - timeout: 10 * 60_000, - run: async () => sendWorkbenchMessage(loopCtx, msg.body.sessionId, msg.body.text, msg.body.attachments), - }); - await msg.complete({ ok: true }); - } catch (error) { - await msg.complete({ error: resolveErrorMessage(error) }); - } - }, - - "task.command.workbench.stop_session": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-stop-session", - timeout: 5 * 60_000, - run: async () => stopWorkbenchSession(loopCtx, msg.body.sessionId), - }); + "task.command.workspace.revert_file": async (loopCtx, msg) => { + await revertWorkspaceFile(loopCtx, msg.body.path); await msg.complete({ ok: true }); }, - "task.command.workbench.sync_session_status": async (loopCtx, msg) => { - await loopCtx.step("workbench-sync-session-status", async () => syncWorkbenchSessionStatus(loopCtx, msg.body.sessionId, msg.body.status, msg.body.at)); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.refresh_derived": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-refresh-derived", - timeout: 5 * 60_000, - run: async () => refreshWorkbenchDerivedState(loopCtx), - }); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.refresh_session_transcript": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-refresh-session-transcript", - timeout: 60_000, - run: async () => refreshWorkbenchSessionTranscript(loopCtx, msg.body.sessionId), - }); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.close_session": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-close-session", - timeout: 5 * 60_000, - run: async () => closeWorkbenchSession(loopCtx, msg.body.sessionId), - }); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.publish_pr": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-publish-pr", - 
timeout: 10 * 60_000, - run: async () => publishWorkbenchPr(loopCtx), - }); - await msg.complete({ ok: true }); - }, - - "task.command.workbench.revert_file": async (loopCtx, msg) => { - await loopCtx.step({ - name: "workbench-revert-file", - timeout: 5 * 60_000, - run: async () => revertWorkbenchFile(loopCtx, msg.body.path), + "task.command.workspace.change_owner": async (loopCtx, msg) => { + await changeTaskOwnerManually(loopCtx, { + primaryUserId: msg.body.primaryUserId, + primaryGithubLogin: msg.body.primaryGithubLogin, + primaryGithubEmail: msg.body.primaryGithubEmail, + primaryGithubAvatarUrl: msg.body.primaryGithubAvatarUrl ?? null, }); await msg.complete({ ok: true }); }, @@ -269,26 +148,38 @@ const commandHandlers: Record = { export async function runTaskWorkflow(ctx: any): Promise { await ctx.loop("task-command-loop", async (loopCtx: any) => { - const msg = await loopCtx.queue.next("next-command", { + const msg = await loopCtx.queue.next("next-task-command", { names: [...TASK_QUEUE_NAMES], completable: true, }); + if (!msg) { return Loop.continue(undefined); } - const handler = commandHandlers[msg.name as TaskQueueName]; - if (handler) { - try { - await handler(loopCtx, msg); - } catch (error) { - const message = resolveErrorMessage(error); - logActorWarning("task.workflow", "task workflow command failed", { - queueName: msg.name, - error: message, - }); - await msg.complete({ error: message }).catch(() => {}); - } + + const handler = COMMAND_HANDLERS[msg.name as TaskQueueName]; + if (!handler) { + logActorWarning("task.workflow", "unknown task command", { command: msg.name }); + await msg.complete({ error: `Unknown command: ${msg.name}` }).catch(() => {}); + return Loop.continue(undefined); } + + try { + // Wrap in a step so c.state and c.db are accessible inside mutation functions. 
+ await loopCtx.step({ + name: msg.name, + timeout: 10 * 60_000, + run: async () => handler(loopCtx, msg), + }); + } catch (error) { + const message = resolveErrorMessage(error); + logActorWarning("task.workflow", "task workflow command failed", { + command: msg.name, + error: message, + }); + await msg.complete({ error: message }).catch(() => {}); + } + return Loop.continue(undefined); }); } diff --git a/foundry/packages/backend/src/actors/task/workflow/init.ts b/foundry/packages/backend/src/actors/task/workflow/init.ts index 8a9962d..ffdf1d4 100644 --- a/foundry/packages/backend/src/actors/task/workflow/init.ts +++ b/foundry/packages/backend/src/actors/task/workflow/init.ts @@ -1,49 +1,45 @@ // @ts-nocheck import { eq } from "drizzle-orm"; import { getActorRuntimeContext } from "../../context.js"; -import { getOrCreateHistory, selfTask } from "../../handles.js"; +import { selfTask } from "../../handles.js"; import { resolveErrorMessage } from "../../logging.js"; +import { taskWorkflowQueueName } from "./queue.js"; import { defaultSandboxProviderId } from "../../../sandbox-config.js"; import { task as taskTable, taskRuntime } from "../db/schema.js"; -import { TASK_ROW_ID, appendHistory, collectErrorMessages, resolveErrorDetail, setTaskState } from "./common.js"; -import { taskWorkflowQueueName } from "./queue.js"; - -async function ensureTaskRuntimeCacheColumns(db: any): Promise { - await db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_json text`).catch(() => {}); - await db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_updated_at integer`).catch(() => {}); - await db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage text`).catch(() => {}); - await db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage_updated_at integer`).catch(() => {}); -} +import { TASK_ROW_ID, appendAuditLog, collectErrorMessages, resolveErrorDetail, setTaskState } from "./common.js"; +// task actions called directly (no queue) export async function 
initBootstrapDbActivity(loopCtx: any, body: any): Promise { const { config } = getActorRuntimeContext(); - const sandboxProviderId = body?.sandboxProviderId ?? loopCtx.state.sandboxProviderId ?? defaultSandboxProviderId(config); + const sandboxProviderId = body?.sandboxProviderId ?? defaultSandboxProviderId(config); + const task = body?.task; + if (typeof task !== "string" || task.trim().length === 0) { + throw new Error("task initialize requires the task prompt"); + } const now = Date.now(); - await ensureTaskRuntimeCacheColumns(loopCtx.db); - await loopCtx.db .insert(taskTable) .values({ id: TASK_ROW_ID, - branchName: loopCtx.state.branchName, - title: loopCtx.state.title, - task: loopCtx.state.task, + branchName: body?.branchName ?? null, + title: body?.title ?? null, + task, sandboxProviderId, status: "init_bootstrap_db", - agentType: loopCtx.state.agentType ?? config.default_agent, + pullRequestJson: null, createdAt: now, updatedAt: now, }) .onConflictDoUpdate({ target: taskTable.id, set: { - branchName: loopCtx.state.branchName, - title: loopCtx.state.title, - task: loopCtx.state.task, + branchName: body?.branchName ?? null, + title: body?.title ?? null, + task, sandboxProviderId, status: "init_bootstrap_db", - agentType: loopCtx.state.agentType ?? 
config.default_agent,
+      pullRequestJson: null,
       updatedAt: now,
     },
   })
@@ -54,26 +50,18 @@ export async function initBootstrapDbActivity(loopCtx: any, body: any): Promise<
     .values({
       id: TASK_ROW_ID,
       activeSandboxId: null,
-      activeSessionId: null,
       activeSwitchTarget: null,
       activeCwd: null,
-      statusMessage: "provisioning",
       gitStateJson: null,
       gitStateUpdatedAt: null,
-      provisionStage: "queued",
-      provisionStageUpdatedAt: now,
       updatedAt: now,
     })
     .onConflictDoUpdate({
       target: taskRuntime.id,
       set: {
         activeSandboxId: null,
-        activeSessionId: null,
         activeSwitchTarget: null,
         activeCwd: null,
-        statusMessage: "provisioning",
-        provisionStage: "queued",
-        provisionStageUpdatedAt: now,
         updatedAt: now,
       },
     })
@@ -81,22 +69,11 @@ export async function initBootstrapDbActivity(loopCtx: any, body: any): Promise<
 }

 export async function initEnqueueProvisionActivity(loopCtx: any, body: any): Promise {
-  await setTaskState(loopCtx, "init_enqueue_provision", "provision queued");
-  await loopCtx.db
-    .update(taskRuntime)
-    .set({
-      provisionStage: "queued",
-      provisionStageUpdatedAt: Date.now(),
-      updatedAt: Date.now(),
-    })
-    .where(eq(taskRuntime.id, TASK_ROW_ID))
-    .run();
+  await setTaskState(loopCtx, "init_enqueue_provision");

   const self = selfTask(loopCtx);
   try {
-    await self.send(taskWorkflowQueueName("task.command.provision"), body, {
-      wait: false,
-    });
+    void self.send(taskWorkflowQueueName("task.command.provision"), body ?? {}, { wait: false }).catch(() => {});
   } catch (error) {
     logActorWarning("task.init", "background provision command failed", {
       organizationId: loopCtx.state.organizationId,
@@ -111,60 +88,52 @@
 export async function initCompleteActivity(loopCtx: any, body: any): Promise {
   const now = Date.now();
   const { config } = getActorRuntimeContext();
-  const sandboxProviderId = body?.sandboxProviderId ?? loopCtx.state.sandboxProviderId ?? defaultSandboxProviderId(config);
+  const sandboxProviderId = body?.sandboxProviderId ?? defaultSandboxProviderId(config);

-  await setTaskState(loopCtx, "init_complete", "task initialized");
+  await setTaskState(loopCtx, "init_complete");

   await loopCtx.db
     .update(taskRuntime)
     .set({
-      statusMessage: "ready",
-      provisionStage: "ready",
-      provisionStageUpdatedAt: now,
       updatedAt: now,
     })
     .where(eq(taskRuntime.id, TASK_ROW_ID))
     .run();

-  const history = await getOrCreateHistory(loopCtx, loopCtx.state.organizationId, loopCtx.state.repoId);
-  await history.append({
-    kind: "task.initialized",
-    taskId: loopCtx.state.taskId,
-    branchName: loopCtx.state.branchName,
+  await appendAuditLog(loopCtx, "task.initialized", {
     payload: { sandboxProviderId },
   });
-
-  loopCtx.state.initialized = true;
 }

-export async function initFailedActivity(loopCtx: any, error: unknown): Promise {
+export async function initFailedActivity(loopCtx: any, error: unknown, body?: any): Promise {
   const now = Date.now();
   const detail = resolveErrorDetail(error);
   const messages = collectErrorMessages(error);
   const { config } = getActorRuntimeContext();
-  const sandboxProviderId = loopCtx.state.sandboxProviderId ?? defaultSandboxProviderId(config);
+  const sandboxProviderId = defaultSandboxProviderId(config);
+  const task = typeof body?.task === "string" ? body.task : null;

   await loopCtx.db
     .insert(taskTable)
     .values({
       id: TASK_ROW_ID,
-      branchName: loopCtx.state.branchName ?? null,
-      title: loopCtx.state.title ?? null,
-      task: loopCtx.state.task,
+      branchName: body?.branchName ?? null,
+      title: body?.title ?? null,
+      task: task ?? detail,
       sandboxProviderId,
       status: "error",
-      agentType: loopCtx.state.agentType ?? config.default_agent,
+      pullRequestJson: null,
       createdAt: now,
       updatedAt: now,
     })
     .onConflictDoUpdate({
       target: taskTable.id,
       set: {
-        branchName: loopCtx.state.branchName ?? null,
-        title: loopCtx.state.title ?? null,
-        task: loopCtx.state.task,
+        branchName: body?.branchName ?? null,
+        title: body?.title ?? null,
+        task: task ?? detail,
        sandboxProviderId,
         status: "error",
-        agentType: loopCtx.state.agentType ?? config.default_agent,
+        pullRequestJson: null,
         updatedAt: now,
       },
     })
@@ -175,30 +144,22 @@ export async function initFailedActivity(loopCtx: any, error: unknown): Promise<
     .values({
       id: TASK_ROW_ID,
       activeSandboxId: null,
-      activeSessionId: null,
       activeSwitchTarget: null,
       activeCwd: null,
-      statusMessage: detail,
-      provisionStage: "error",
-      provisionStageUpdatedAt: now,
       updatedAt: now,
     })
     .onConflictDoUpdate({
       target: taskRuntime.id,
       set: {
         activeSandboxId: null,
-        activeSessionId: null,
         activeSwitchTarget: null,
         activeCwd: null,
-        statusMessage: detail,
-        provisionStage: "error",
-        provisionStageUpdatedAt: now,
         updatedAt: now,
       },
     })
     .run();

-  await appendHistory(loopCtx, "task.error", {
+  await appendAuditLog(loopCtx, "task.error", {
     detail,
     messages,
   });
diff --git a/foundry/packages/backend/src/actors/task/workflow/push.ts b/foundry/packages/backend/src/actors/task/workflow/push.ts
index c525ebe..f15ab0b 100644
--- a/foundry/packages/backend/src/actors/task/workflow/push.ts
+++ b/foundry/packages/backend/src/actors/task/workflow/push.ts
@@ -1,9 +1,7 @@
 // @ts-nocheck
-import { eq } from "drizzle-orm";
 import { getTaskSandbox } from "../../handles.js";
 import { resolveOrganizationGithubAuth } from "../../../services/github-auth.js";
-import { taskRuntime, taskSandboxes } from "../db/schema.js";
-import { TASK_ROW_ID, appendHistory, getCurrentRecord } from "./common.js";
+import { appendAuditLog, getCurrentRecord } from "./common.js";

 export interface PushActiveBranchOptions {
   reason?: string | null;
@@ -13,7 +11,7 @@
 export async function pushActiveBranchActivity(loopCtx: any, options: PushActiveBranchOptions = {}): Promise {
   const record = await getCurrentRecord(loopCtx);
   const activeSandboxId = record.activeSandboxId;
-  const branchName = loopCtx.state.branchName ?? record.branchName;
+  const branchName = record.branchName;

   if (!activeSandboxId) {
     throw new Error("cannot push: no active sandbox");
@@ -28,19 +26,6 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive
     throw new Error("cannot push: active sandbox cwd is not set");
   }

-  const now = Date.now();
-  await loopCtx.db
-    .update(taskRuntime)
-    .set({ statusMessage: `pushing branch ${branchName}`, updatedAt: now })
-    .where(eq(taskRuntime.id, TASK_ROW_ID))
-    .run();
-
-  await loopCtx.db
-    .update(taskSandboxes)
-    .set({ statusMessage: `pushing branch ${branchName}`, updatedAt: now })
-    .where(eq(taskSandboxes.sandboxId, activeSandboxId))
-    .run();
-
   const script = [
     "set -euo pipefail",
     `cd ${JSON.stringify(cwd)}`,
@@ -68,20 +53,7 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive
     throw new Error(`git push failed (${result.exitCode ?? 1}): ${[result.stdout, result.stderr].filter(Boolean).join("")}`);
   }

-  const updatedAt = Date.now();
-  await loopCtx.db
-    .update(taskRuntime)
-    .set({ statusMessage: `push complete for ${branchName}`, updatedAt })
-    .where(eq(taskRuntime.id, TASK_ROW_ID))
-    .run();
-
-  await loopCtx.db
-    .update(taskSandboxes)
-    .set({ statusMessage: `push complete for ${branchName}`, updatedAt })
-    .where(eq(taskSandboxes.sandboxId, activeSandboxId))
-    .run();
-
-  await appendHistory(loopCtx, options.historyKind ?? "task.push", {
+  await appendAuditLog(loopCtx, options.historyKind ?? "task.push", {
     reason: options.reason ?? null,
     branchName,
     sandboxId: activeSandboxId,
diff --git a/foundry/packages/backend/src/actors/task/workflow/queue.ts b/foundry/packages/backend/src/actors/task/workflow/queue.ts
index 3e613e2..a49c39a 100644
--- a/foundry/packages/backend/src/actors/task/workflow/queue.ts
+++ b/foundry/packages/backend/src/actors/task/workflow/queue.ts
@@ -8,27 +8,19 @@ export const TASK_QUEUE_NAMES = [
   "task.command.merge",
   "task.command.archive",
   "task.command.kill",
-  "task.command.get",
-  "task.command.workbench.mark_unread",
-  "task.command.workbench.rename_task",
-  "task.command.workbench.rename_branch",
-  "task.command.workbench.create_session",
-  "task.command.workbench.create_session_and_send",
-  "task.command.workbench.ensure_session",
-  "task.command.workbench.rename_session",
-  "task.command.workbench.set_session_unread",
-  "task.command.workbench.update_draft",
-  "task.command.workbench.change_model",
-  "task.command.workbench.send_message",
-  "task.command.workbench.stop_session",
-  "task.command.workbench.sync_session_status",
-  "task.command.workbench.refresh_derived",
-  "task.command.workbench.refresh_session_transcript",
-  "task.command.workbench.close_session",
-  "task.command.workbench.publish_pr",
-  "task.command.workbench.revert_file",
+  "task.command.workspace.create_session",
+  "task.command.workspace.create_session_and_send",
+  "task.command.workspace.ensure_session",
+  "task.command.workspace.send_message",
+  "task.command.workspace.stop_session",
+  "task.command.workspace.close_session",
+  "task.command.workspace.publish_pr",
+  "task.command.workspace.revert_file",
+  "task.command.workspace.change_owner",
 ] as const;

+export type TaskQueueName = (typeof TASK_QUEUE_NAMES)[number];
+
 export function taskWorkflowQueueName(name: string): string {
   return name;
 }
diff --git a/foundry/packages/backend/src/actors/task/workbench.ts b/foundry/packages/backend/src/actors/task/workspace.ts
similarity index 54%
rename from foundry/packages/backend/src/actors/task/workbench.ts
rename to foundry/packages/backend/src/actors/task/workspace.ts
index d6698ca..0856947 100644
--- a/foundry/packages/backend/src/actors/task/workbench.ts
+++ b/foundry/packages/backend/src/actors/task/workspace.ts
@@ -1,14 +1,24 @@
 // @ts-nocheck
 import { randomUUID } from "node:crypto";
-import { basename, dirname } from "node:path";
+import { basename } from "node:path";
 import { asc, eq } from "drizzle-orm";
+import {
+  DEFAULT_WORKSPACE_MODEL_GROUPS,
+  DEFAULT_WORKSPACE_MODEL_ID,
+  workspaceAgentForModel,
+  workspaceSandboxAgentIdForModel,
+} from "@sandbox-agent/foundry-shared";
 import { getActorRuntimeContext } from "../context.js";
-import { getOrCreateRepository, getOrCreateTaskSandbox, getOrCreateOrganization, getTaskSandbox, selfTask } from "../handles.js";
-import { SANDBOX_REPO_CWD } from "../sandbox/index.js";
+import { getOrCreateOrganization, getOrCreateTaskSandbox, getOrCreateUser, getTaskSandbox, selfTask } from "../handles.js";
+import { logActorInfo, logActorWarning, resolveErrorMessage } from "../logging.js";
 import { resolveSandboxProviderId } from "../../sandbox-config.js";
+import { getBetterAuthService } from "../../services/better-auth.js";
 import { resolveOrganizationGithubAuth } from "../../services/github-auth.js";
 import { githubRepoFullNameFromRemote } from "../../services/repo.js";
-import { task as taskTable, taskRuntime, taskSandboxes, taskWorkbenchSessions } from "./db/schema.js";
+import { taskWorkflowQueueName } from "./workflow/queue.js";
+import { organizationWorkflowQueueName } from "../organization/queues.js";
+
+import { task as taskTable, taskOwner, taskRuntime, taskSandboxes, taskWorkspaceSessions } from "./db/schema.js";
 import { getCurrentRecord } from "./workflow/common.js";

 function emptyGitState() {
@@ -20,62 +30,29 @@ function emptyGitState() {
   };
 }

-async function ensureWorkbenchSessionTable(c: any): Promise {
-  await c.db.execute(`
-    CREATE TABLE IF NOT EXISTS task_workbench_sessions (
-      session_id text PRIMARY KEY NOT NULL,
-      sandbox_session_id text,
-      session_name text NOT NULL,
-      model text NOT NULL,
-      status text DEFAULT 'ready' NOT NULL,
-      error_message text,
-      transcript_json text DEFAULT '[]' NOT NULL,
-      transcript_updated_at integer,
-      unread integer DEFAULT 0 NOT NULL,
-      draft_text text DEFAULT '' NOT NULL,
-      draft_attachments_json text DEFAULT '[]' NOT NULL,
-      draft_updated_at integer,
-      created integer DEFAULT 1 NOT NULL,
-      closed integer DEFAULT 0 NOT NULL,
-      thinking_since_ms integer,
-      created_at integer NOT NULL,
-      updated_at integer NOT NULL
-    )
-  `);
-  await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN sandbox_session_id text`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN status text DEFAULT 'ready' NOT NULL`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN error_message text`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN transcript_json text DEFAULT '[]' NOT NULL`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_workbench_sessions ADD COLUMN transcript_updated_at integer`).catch(() => {});
-}
-
-async function ensureTaskRuntimeCacheColumns(c: any): Promise {
-  await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_json text`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN git_state_updated_at integer`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage text`).catch(() => {});
-  await c.db.execute(`ALTER TABLE task_runtime ADD COLUMN provision_stage_updated_at integer`).catch(() => {});
-}
-
-function defaultModelForAgent(agentType: string | null | undefined) {
-  return agentType === "codex" ? "gpt-5.3-codex" : "claude-sonnet-4";
-}
-
-function isCodexModel(model: string) {
-  return model.startsWith("gpt-") || model.startsWith("o");
-}
+const FALLBACK_MODEL = DEFAULT_WORKSPACE_MODEL_ID;

 function agentKindForModel(model: string) {
-  if (isCodexModel(model)) {
-    return "Codex";
-  }
-  return "Claude";
+  return workspaceAgentForModel(model);
 }

-export function agentTypeForModel(model: string) {
-  if (isCodexModel(model)) {
-    return "codex";
+export function sandboxAgentIdForModel(model: string) {
+  return workspaceSandboxAgentIdForModel(model);
+}
+
+async function resolveWorkspaceModelGroups(c: any): Promise {
+  try {
+    const sandbox = await getOrCreateTaskSandbox(c, c.state.organizationId, stableSandboxId(c));
+    const groups = await sandbox.listWorkspaceModelGroups();
+    return Array.isArray(groups) && groups.length > 0 ? groups : DEFAULT_WORKSPACE_MODEL_GROUPS;
+  } catch {
+    return DEFAULT_WORKSPACE_MODEL_GROUPS;
   }
-  return "claude";
+}
+
+async function resolveSandboxAgentForModel(c: any, model: string): Promise {
+  const groups = await resolveWorkspaceModelGroups(c);
+  return workspaceSandboxAgentIdForModel(model, groups);
 }

 function repoLabelFromRemote(remoteUrl: string): string {
@@ -93,6 +70,11 @@
   return basename(trimmed.replace(/\.git$/, ""));
 }

+async function getRepositoryMetadata(c: any): Promise<{ defaultBranch: string | null; fullName: string | null; remoteUrl: string }> {
+  const organization = await getOrCreateOrganization(c, c.state.organizationId);
+  return await organization.getRepositoryMetadata({ repoId: c.state.repoId });
+}
+
 function parseDraftAttachments(value: string | null | undefined): Array {
   if (!value) {
     return [];
@@ -140,6 +122,191 @@ function parseGitState(value: string | null | undefined): { fileChanges: Array
+async function readTaskOwner(c: any): Promise {
+  const row = await c.db.select().from(taskOwner).where(eq(taskOwner.id, 1)).get();
+  if (!row) {
+    return null;
+  }
+  return {
+    primaryUserId: row.primaryUserId ?? null,
+    primaryGithubLogin: row.primaryGithubLogin ?? null,
+    primaryGithubEmail: row.primaryGithubEmail ?? null,
+    primaryGithubAvatarUrl: row.primaryGithubAvatarUrl ?? null,
+  };
+}
+
+async function upsertTaskOwner(
+  c: any,
+  owner: { primaryUserId: string; primaryGithubLogin: string; primaryGithubEmail: string; primaryGithubAvatarUrl: string | null },
+): Promise {
+  const now = Date.now();
+  await c.db
+    .insert(taskOwner)
+    .values({
+      id: 1,
+      primaryUserId: owner.primaryUserId,
+      primaryGithubLogin: owner.primaryGithubLogin,
+      primaryGithubEmail: owner.primaryGithubEmail,
+      primaryGithubAvatarUrl: owner.primaryGithubAvatarUrl,
+      updatedAt: now,
+    })
+    .onConflictDoUpdate({
+      target: taskOwner.id,
+      set: {
+        primaryUserId: owner.primaryUserId,
+        primaryGithubLogin: owner.primaryGithubLogin,
+        primaryGithubEmail: owner.primaryGithubEmail,
+        primaryGithubAvatarUrl: owner.primaryGithubAvatarUrl,
+        updatedAt: now,
+      },
+    })
+    .run();
+}
+
+/**
+ * Inject the user's GitHub OAuth token into the sandbox as a git credential store file.
+ * Also configures git user.name and user.email so commits are attributed correctly.
+ * The credential file is overwritten on each owner swap.
+ *
+ * Race condition note: if User A sends a message and the agent starts a long git operation,
+ * and User B then triggers an owner swap, the in-flight git process still has User A's
+ * credentials (already read from the credential store). The next git operation uses User B's credentials.
+ */
+async function injectGitCredentials(sandbox: any, login: string, email: string, token: string): Promise {
+  const script = [
+    "set -euo pipefail",
+    `git config --global user.name ${JSON.stringify(login)}`,
+    `git config --global user.email ${JSON.stringify(email)}`,
+    `git config --global credential.helper 'store --file=$HOME/.git-token'`,
+    `printf '%s\\n' ${JSON.stringify(`https://${login}:${token}@github.com`)} > $HOME/.git-token`,
+    `chmod 600 $HOME/.git-token`,
+  ];
+  const result = await sandbox.runProcess({
+    command: "bash",
+    args: ["-lc", script.join("; ")],
+    cwd: "/",
+    timeoutMs: 30_000,
+  });
+  if ((result.exitCode ?? 0) !== 0) {
+    logActorWarning("task", "git credential injection failed", {
+      exitCode: result.exitCode,
+      output: [result.stdout, result.stderr].filter(Boolean).join(""),
+    });
+  }
+}
+
+/**
+ * Resolves the current user's GitHub identity from their auth session.
+ * Returns null if the session is invalid or the user has no GitHub account.
+ */
+async function resolveGithubIdentity(authSessionId: string): Promise<{
+  userId: string;
+  login: string;
+  email: string;
+  avatarUrl: string | null;
+  accessToken: string;
+} | null> {
+  const authService = getBetterAuthService();
+  const authState = await authService.getAuthState(authSessionId);
+  if (!authState?.user?.id) {
+    return null;
+  }
+
+  const tokenResult = await authService.getAccessTokenForSession(authSessionId);
+  if (!tokenResult?.accessToken) {
+    return null;
+  }
+
+  const githubAccount = authState.accounts?.find((account: any) => account.providerId === "github");
+  if (!githubAccount) {
+    return null;
+  }
+
+  // Resolve the GitHub login from the API since Better Auth only stores the
+  // numeric account ID, not the login username.
+  let login = authState.user.name ?? "unknown";
+  let avatarUrl = authState.user.image ?? null;
+  try {
+    const resp = await fetch("https://api.github.com/user", {
+      headers: {
+        Authorization: `Bearer ${tokenResult.accessToken}`,
+        Accept: "application/vnd.github+json",
+      },
+    });
+    if (resp.ok) {
+      const ghUser = (await resp.json()) as { login?: string; avatar_url?: string };
+      if (ghUser.login) {
+        login = ghUser.login;
+      }
+      if (ghUser.avatar_url) {
+        avatarUrl = ghUser.avatar_url;
+      }
+    }
+  } catch (error) {
+    console.warn("resolveGithubIdentity: failed to fetch GitHub user", error);
+  }
+
+  return {
+    userId: authState.user.id,
+    login,
+    email: authState.user.email ?? `${githubAccount.accountId}@users.noreply.github.com`,
+    avatarUrl,
+    accessToken: tokenResult.accessToken,
+  };
+}
+
+/**
+ * Check if the task owner needs to swap, and if so, update the owner record
+ * and inject new git credentials into the sandbox.
+ * Returns true if an owner swap occurred.
+ */
+async function maybeSwapTaskOwner(c: any, authSessionId: string | null | undefined, sandbox: any | null): Promise {
+  if (!authSessionId) {
+    return false;
+  }
+
+  const identity = await resolveGithubIdentity(authSessionId);
+  if (!identity) {
+    return false;
+  }
+
+  const currentOwner = await readTaskOwner(c);
+  if (currentOwner?.primaryUserId === identity.userId) {
+    return false;
+  }
+
+  await upsertTaskOwner(c, {
+    primaryUserId: identity.userId,
+    primaryGithubLogin: identity.login,
+    primaryGithubEmail: identity.email,
+    primaryGithubAvatarUrl: identity.avatarUrl,
+  });
+
+  if (sandbox) {
+    await injectGitCredentials(sandbox, identity.login, identity.email, identity.accessToken);
+  }
+
+  return true;
+}
+
+/**
+ * Manually change the task owner. Updates the owner record and broadcasts the
+ * change to subscribers. Git credentials are NOT injected here — they will be
+ * injected the next time the target user sends a message (auto-swap path).
+ */
+export async function changeTaskOwnerManually(
+  c: any,
+  input: { primaryUserId: string; primaryGithubLogin: string; primaryGithubEmail: string; primaryGithubAvatarUrl: string | null },
+): Promise {
+  await upsertTaskOwner(c, input);
+  await broadcastTaskUpdate(c);
+}
+
 export function shouldMarkSessionUnreadForStatus(meta: { thinkingSinceMs?: number | null }, status: "running" | "idle" | "error"): boolean {
   if (status === "running") {
     return false;
@@ -168,8 +335,7 @@ export function shouldRecreateSessionForModelChange(meta: {
 }

 async function listSessionMetaRows(c: any, options?: { includeClosed?: boolean }): Promise> {
-  await ensureWorkbenchSessionTable(c);
-  const rows = await c.db.select().from(taskWorkbenchSessions).orderBy(asc(taskWorkbenchSessions.createdAt)).all();
+  const rows = await c.db.select().from(taskWorkspaceSessions).orderBy(asc(taskWorkspaceSessions.createdAt)).all();
   const mapped = rows.map((row: any) => ({
     ...row,
     id: row.sessionId,
@@ -179,9 +345,6 @@ async function listSessionMetaRows(c: any, options?: { includeClosed?: boolean }
     errorMessage: row.errorMessage ?? null,
     transcript: parseTranscript(row.transcriptJson),
     transcriptUpdatedAt: row.transcriptUpdatedAt ?? null,
-    draftAttachments: parseDraftAttachments(row.draftAttachmentsJson),
-    draftUpdatedAtMs: row.draftUpdatedAt ?? null,
-    unread: row.unread === 1,
     created: row.created === 1,
     closed: row.closed === 1,
   }));
@@ -199,8 +362,7 @@ async function nextSessionName(c: any): Promise {
 }

 async function readSessionMeta(c: any, sessionId: string): Promise {
-  await ensureWorkbenchSessionTable(c);
-  const row = await c.db.select().from(taskWorkbenchSessions).where(eq(taskWorkbenchSessions.sessionId, sessionId)).get();
+  const row = await c.db.select().from(taskWorkspaceSessions).where(eq(taskWorkspaceSessions.sessionId, sessionId)).get();

   if (!row) {
     return null;
@@ -215,28 +377,107 @@ async function readSessionMeta(c: any, sessionId: string): Promise {
     errorMessage: row.errorMessage ?? null,
     transcript: parseTranscript(row.transcriptJson),
     transcriptUpdatedAt: row.transcriptUpdatedAt ?? null,
-    draftAttachments: parseDraftAttachments(row.draftAttachmentsJson),
-    draftUpdatedAtMs: row.draftUpdatedAt ?? null,
-    unread: row.unread === 1,
     created: row.created === 1,
     closed: row.closed === 1,
   };
 }

+async function getUserTaskState(c: any, authSessionId?: string | null): Promise<{ activeSessionId: string | null; bySessionId: Map }> {
+  if (!authSessionId) {
+    return { activeSessionId: null, bySessionId: new Map() };
+  }
+
+  const authState = await getBetterAuthService().getAuthState(authSessionId);
+  const userId = authState?.user?.id;
+  if (typeof userId !== "string" || userId.length === 0) {
+    return { activeSessionId: null, bySessionId: new Map() };
+  }
+
+  const user = await getOrCreateUser(c, userId);
+  const state = await user.getTaskState({ taskId: c.state.taskId });
+  const bySessionId = new Map(
+    (state?.sessions ?? []).map((row: any) => [
+      row.sessionId,
+      {
+        unread: Boolean(row.unread),
+        draftText: row.draftText ?? "",
+        draftAttachments: parseDraftAttachments(row.draftAttachmentsJson),
+        draftUpdatedAtMs: row.draftUpdatedAt ?? null,
+      },
+    ]),
+  );
+  return {
+    activeSessionId: state?.activeSessionId ?? null,
+    bySessionId,
+  };
+}
+
+async function upsertUserTaskState(c: any, authSessionId: string | null | undefined, sessionId: string, patch: Record): Promise {
+  if (!authSessionId) {
+    return;
+  }
+
+  const authState = await getBetterAuthService().getAuthState(authSessionId);
+  const userId = authState?.user?.id;
+  if (typeof userId !== "string" || userId.length === 0) {
+    return;
+  }
+
+  const user = await getOrCreateUser(c, userId);
+  await user.upsertTaskState({
+    taskId: c.state.taskId,
+    sessionId,
+    patch,
+  });
+}
+
+async function deleteUserTaskState(c: any, authSessionId: string | null | undefined, sessionId: string): Promise {
+  if (!authSessionId) {
+    return;
+  }
+
+  const authState = await getBetterAuthService().getAuthState(authSessionId);
+  const userId = authState?.user?.id;
+  if (typeof userId !== "string" || userId.length === 0) {
+    return;
+  }
+
+  const user = await getOrCreateUser(c, userId);
+  await user.deleteTaskState({
+    taskId: c.state.taskId,
+    sessionId,
+  });
+}
+
+async function resolveDefaultModel(c: any, authSessionId?: string | null): Promise {
+  if (!authSessionId) {
+    return FALLBACK_MODEL;
+  }
+
+  const authState = await getBetterAuthService().getAuthState(authSessionId);
+  const userId = authState?.user?.id;
+  if (typeof userId !== "string" || userId.length === 0) {
+    return FALLBACK_MODEL;
+  }
+
+  const user = await getOrCreateUser(c, userId);
+  const userState = await user.getAppAuthState({ sessionId: authSessionId });
+  return userState?.profile?.defaultModel ?? FALLBACK_MODEL;
+}
+
 async function ensureSessionMeta(
   c: any,
   params: {
     sessionId: string;
     sandboxSessionId?: string | null;
     model?: string;
+    authSessionId?: string | null;
     sessionName?: string;
-    unread?: boolean;
     created?: boolean;
     status?: "pending_provision" | "pending_session_create" | "ready" | "error";
     errorMessage?: string | null;
   },
 ): Promise {
-  await ensureWorkbenchSessionTable(c);
   const existing = await readSessionMeta(c, params.sessionId);
   if (existing) {
     return existing;
@@ -244,11 +485,10 @@ async function ensureSessionMeta(
   const now = Date.now();
   const sessionName = params.sessionName ?? (await nextSessionName(c));
-  const model = params.model ?? defaultModelForAgent(c.state.agentType);
-  const unread = params.unread ?? false;
+  const model = params.model ?? (await resolveDefaultModel(c, params.authSessionId));

   await c.db
-    .insert(taskWorkbenchSessions)
+    .insert(taskWorkspaceSessions)
     .values({
       sessionId: params.sessionId,
       sandboxSessionId: params.sandboxSessionId ?? null,
@@ -258,10 +498,6 @@ async function ensureSessionMeta(
       errorMessage: params.errorMessage ?? null,
       transcriptJson: "[]",
       transcriptUpdatedAt: null,
-      unread: unread ? 1 : 0,
-      draftText: "",
-      draftAttachmentsJson: "[]",
-      draftUpdatedAt: null,
       created: params.created === false ? 0 : 1,
       closed: 0,
       thinkingSinceMs: null,
@@ -276,19 +512,18 @@ async function ensureSessionMeta(
 async function updateSessionMeta(c: any, sessionId: string, values: Record): Promise {
   await ensureSessionMeta(c, { sessionId });
   await c.db
-    .update(taskWorkbenchSessions)
+    .update(taskWorkspaceSessions)
     .set({
       ...values,
       updatedAt: Date.now(),
     })
-    .where(eq(taskWorkbenchSessions.sessionId, sessionId))
+    .where(eq(taskWorkspaceSessions.sessionId, sessionId))
     .run();

   return await readSessionMeta(c, sessionId);
 }

 async function readSessionMetaBySandboxSessionId(c: any, sandboxSessionId: string): Promise {
-  await ensureWorkbenchSessionTable(c);
-  const row = await c.db.select().from(taskWorkbenchSessions).where(eq(taskWorkbenchSessions.sandboxSessionId, sandboxSessionId)).get();
+  const row = await c.db.select().from(taskWorkspaceSessions).where(eq(taskWorkspaceSessions.sandboxSessionId, sandboxSessionId)).get();
   if (!row) {
     return null;
   }
@@ -298,17 +533,17 @@ async function readSessionMetaBySandboxSessionId(c: any, sandboxSessionId: strin
 async function requireReadySessionMeta(c: any, sessionId: string): Promise {
   const meta = await readSessionMeta(c, sessionId);
   if (!meta) {
-    throw new Error(`Unknown workbench session: ${sessionId}`);
+    throw new Error(`Unknown workspace session: ${sessionId}`);
   }
   if (meta.status !== "ready" || !meta.sandboxSessionId) {
-    throw new Error(meta.errorMessage ?? "This workbench session is still preparing");
+    throw new Error(meta.errorMessage ?? "This workspace session is still preparing");
   }
   return meta;
 }

 export function requireSendableSessionMeta(meta: any, sessionId: string): any {
   if (!meta) {
-    throw new Error(`Unknown workbench session: ${sessionId}`);
+    throw new Error(`Unknown workspace session: ${sessionId}`);
   }
   if (meta.status !== "ready" || !meta.sandboxSessionId) {
     throw new Error(`Session is not ready (status: ${meta.status}). Wait for session provisioning to complete.`);
@@ -336,10 +571,14 @@ async function getTaskSandboxRuntime(
 }> {
   const { config } = getActorRuntimeContext();
   const sandboxId = stableSandboxId(c);
-  const sandboxProviderId = resolveSandboxProviderId(config, record.sandboxProviderId ?? c.state.sandboxProviderId ?? null);
+  const sandboxProviderId = resolveSandboxProviderId(config, record.sandboxProviderId ?? null);
   const sandbox = await getOrCreateTaskSandbox(c, c.state.organizationId, sandboxId, {});
   const actorId = typeof sandbox.resolve === "function" ? await sandbox.resolve().catch(() => null) : null;
   const switchTarget = sandboxProviderId === "local" ? `sandbox://local/${sandboxId}` : `sandbox://e2b/${sandboxId}`;
+
+  // Resolve the actual repo CWD from the sandbox's $HOME (differs by provider).
+  const repoCwdResult = await sandbox.repoCwd();
+  const cwd = repoCwdResult?.cwd ?? "$HOME/repo";

   const now = Date.now();
   await c.db
@@ -349,8 +588,7 @@ async function getTaskSandboxRuntime(
       sandboxProviderId,
       sandboxActorId: typeof actorId === "string" ? actorId : null,
       switchTarget,
-      cwd: SANDBOX_REPO_CWD,
-      statusMessage: "sandbox ready",
+      cwd,
       createdAt: now,
       updatedAt: now,
     })
@@ -360,7 +598,7 @@
         sandboxProviderId,
         sandboxActorId: typeof actorId === "string" ? actorId : null,
         switchTarget,
-        cwd: SANDBOX_REPO_CWD,
+        cwd,
         updatedAt: now,
       },
     })
@@ -371,7 +609,7 @@
     .set({
       activeSandboxId: sandboxId,
       activeSwitchTarget: switchTarget,
-      activeCwd: SANDBOX_REPO_CWD,
+      activeCwd: cwd,
       updatedAt: now,
     })
     .where(eq(taskRuntime.id, 1))
@@ -382,18 +620,18 @@
     sandboxId,
     sandboxProviderId,
     switchTarget,
-    cwd: SANDBOX_REPO_CWD,
+    cwd,
   };
 }

 /**
  * Track whether the sandbox repo has been fully prepared (cloned + fetched + checked out)
  * for the current actor lifecycle. Subsequent calls can skip the expensive `git fetch`
- * when `skipFetch` is true (used by sendWorkbenchMessage to avoid blocking on every prompt).
+ * when `skipFetch` is true (used by sendWorkspaceMessage to avoid blocking on every prompt).
  */
 let sandboxRepoPrepared = false;

-async function ensureSandboxRepo(c: any, sandbox: any, record: any, opts?: { skipFetchIfPrepared?: boolean }): Promise {
+async function ensureSandboxRepo(c: any, sandbox: any, record: any, opts?: { skipFetchIfPrepared?: boolean; authSessionId?: string | null }): Promise {
   if (!record.branchName) {
     throw new Error("cannot prepare a sandbox repo before the task branch exists");
   }
@@ -401,28 +639,35 @@ async function ensureSandboxRepo(c: any, sandbox: any, record: any, opts?: { ski
   // If the repo was already prepared and the caller allows skipping fetch, just return.
   // The clone, fetch, and checkout already happened on a prior call.
   if (opts?.skipFetchIfPrepared && sandboxRepoPrepared) {
+    logActorInfo("task.sandbox", "ensureSandboxRepo skipped (already prepared)");
     return;
   }

+  const repoStart = performance.now();
+
+  const t0 = performance.now();
   const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId);
-  const repository = await getOrCreateRepository(c, c.state.organizationId, c.state.repoId, c.state.repoRemote);
-  const metadata = await repository.getRepositoryMetadata({});
+  const metadata = await getRepositoryMetadata(c);
+  logActorInfo("task.sandbox", "resolveAuth+metadata", { durationMs: Math.round(performance.now() - t0) });
+
   const baseRef = metadata.defaultBranch ?? "main";
-  const sandboxRepoRoot = dirname(SANDBOX_REPO_CWD);

+  // Use $HOME inside the shell script so the path resolves correctly regardless
+  // of which user the sandbox runs as (E2B: "user", local Docker: "sandbox").
   const script = [
     "set -euo pipefail",
-    `mkdir -p ${JSON.stringify(sandboxRepoRoot)}`,
+    'REPO_DIR="$HOME/repo"',
+    'mkdir -p "$HOME"',
     "git config --global credential.helper '!f() { echo username=x-access-token; echo password=${GH_TOKEN:-$GITHUB_TOKEN}; }; f'",
-    `if [ ! -d ${JSON.stringify(`${SANDBOX_REPO_CWD}/.git`)} ]; then rm -rf ${JSON.stringify(SANDBOX_REPO_CWD)} && git clone ${JSON.stringify(
-      c.state.repoRemote,
-    )} ${JSON.stringify(SANDBOX_REPO_CWD)}; fi`,
-    `cd ${JSON.stringify(SANDBOX_REPO_CWD)}`,
+    `if [ ! -d "$REPO_DIR/.git" ]; then rm -rf "$REPO_DIR" && git clone ${JSON.stringify(metadata.remoteUrl)} "$REPO_DIR"; fi`,
+    'cd "$REPO_DIR"',
     "git fetch origin --prune",
     `if git show-ref --verify --quiet refs/remotes/origin/${JSON.stringify(record.branchName).slice(1, -1)}; then target_ref=${JSON.stringify(
       `origin/${record.branchName}`,
     )}; else target_ref=${JSON.stringify(baseRef)}; fi`,
     `git checkout -B ${JSON.stringify(record.branchName)} \"$target_ref\"`,
   ];
+
+  const t1 = performance.now();
   const result = await sandbox.runProcess({
     command: "bash",
     args: ["-lc", script.join("; ")],
@@ -435,12 +680,26 @@ async function ensureSandboxRepo(c: any, sandbox: any, record: any, opts?: { ski
       : undefined,
     timeoutMs: 5 * 60_000,
   });
+  logActorInfo("task.sandbox", "git clone/fetch/checkout", {
+    branch: record.branchName,
+    repo: metadata.remoteUrl,
+    durationMs: Math.round(performance.now() - t1),
+  });

   if ((result.exitCode ?? 0) !== 0) {
     throw new Error(`sandbox repo preparation failed (${result.exitCode ?? 1}): ${[result.stdout, result.stderr].filter(Boolean).join("")}`);
   }

+  // On first repo preparation, inject the task owner's git credentials into the sandbox
+  // so that push/commit operations are authenticated and attributed to the correct user.
+  if (!sandboxRepoPrepared && opts?.authSessionId) {
+    const t2 = performance.now();
+    await maybeSwapTaskOwner(c, opts.authSessionId, sandbox);
+    logActorInfo("task.sandbox", "maybeSwapTaskOwner", { durationMs: Math.round(performance.now() - t2) });
+  }
+
   sandboxRepoPrepared = true;
+  logActorInfo("task.sandbox", "ensureSandboxRepo complete", { totalDurationMs: Math.round(performance.now() - repoStart) });
 }

 async function executeInSandbox(
@@ -452,7 +711,7 @@
     label: string;
   },
 ): Promise<{ exitCode: number; result: string }> {
-  const record = await ensureWorkbenchSeeded(c);
+  const record = await ensureWorkspaceSeeded(c);
   const runtime = await getTaskSandboxRuntime(c, record);
   await ensureSandboxRepo(c, runtime.sandbox, record);
   const response = await runtime.sandbox.runProcess({
@@ -555,7 +814,7 @@ function buildFileTree(paths: string[]): Array {
   return sortNodes(root.children.values());
 }

-async function collectWorkbenchGitState(c: any, record: any) {
+async function collectWorkspaceGitState(c: any, record: any) {
   const activeSandboxId = record.activeSandboxId;
   const activeSandbox = activeSandboxId != null ? ((record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === activeSandboxId) ?? null) : null;
   const cwd = activeSandbox?.cwd ?? record.sandboxes?.[0]?.cwd ?? null;
@@ -628,7 +887,6 @@ async function collectWorkbenchGitState(c: any, record: any) {
 }

 async function readCachedGitState(c: any): Promise<{ fileChanges: Array; diffs: Record; fileTree: Array; updatedAt: number | null }> {
-  await ensureTaskRuntimeCacheColumns(c);
   const row = await c.db
     .select({
       gitStateJson: taskRuntime.gitStateJson,
@@ -645,7 +903,6 @@ async function readCachedGitState(c: any): Promise<{ fileChanges: Array; di
 }

 async function writeCachedGitState(c: any, gitState: { fileChanges: Array; diffs: Record; fileTree: Array }): Promise {
-  await ensureTaskRuntimeCacheColumns(c);
   const now = Date.now();
   await c.db
     .update(taskRuntime)
@@ -687,102 +944,73 @@ async function writeSessionTranscript(c: any, sessionId: string, transcript: Arr
   });
 }

-async function enqueueWorkbenchRefresh(
-  c: any,
-  command: "task.command.workbench.refresh_derived" | "task.command.workbench.refresh_session_transcript",
-  body: Record,
-): Promise {
+function fireRefreshDerived(c: any): void {
   const self = selfTask(c);
-  await self.send(command, body, { wait: false });
+  void self.refreshDerived({}).catch(() => {});
 }

-async function enqueueWorkbenchEnsureSession(c: any, sessionId: string): Promise {
+function fireRefreshSessionTranscript(c: any, sessionId: string): void {
   const self = selfTask(c);
-  await self.send(
-    "task.command.workbench.ensure_session",
-    {
-      sessionId,
-    },
-    {
-      wait: false,
-    },
-  );
+  void self.refreshSessionTranscript({ sessionId }).catch(() => {});
 }

-function pendingWorkbenchSessionStatus(record: any): "pending_provision" | "pending_session_create" {
+async function enqueueWorkspaceEnsureSession(c: any, sessionId: string): Promise {
+  const self = selfTask(c);
+  await self.send(taskWorkflowQueueName("task.command.workspace.ensure_session" as any), { sessionId }, { wait: false });
+}
+
+function pendingWorkspaceSessionStatus(record: any): "pending_provision" | "pending_session_create" {
   return record.activeSandboxId ? "pending_session_create" : "pending_provision";
 }

-async function maybeScheduleWorkbenchRefreshes(c: any, record: any, sessions: Array): Promise {
+async function maybeScheduleWorkspaceRefreshes(c: any, record: any, sessions: Array): Promise {
   const gitState = await readCachedGitState(c);
   if (record.activeSandboxId && !gitState.updatedAt) {
-    await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {});
+    fireRefreshDerived(c);
   }

   for (const session of sessions) {
     if (session.closed || session.status !== "ready" || !session.sandboxSessionId || session.transcriptUpdatedAt) {
       continue;
     }
-    await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", {
-      sessionId: session.sandboxSessionId,
-    });
+    fireRefreshSessionTranscript(c, session.sandboxSessionId);
   }
 }

-function activeSessionStatus(record: any, sessionId: string) {
-  if (record.activeSessionId !== sessionId) {
-    return "idle";
+function computeWorkspaceTaskStatus(record: any, sessions: Array) {
+  if (record.status && String(record.status).startsWith("init_")) {
+    return record.status;
   }
-
-  if (record.status === "running") {
+  if (record.status === "archived" || record.status === "killed") {
+    return record.status;
+  }
+  if (sessions.some((session) => session.closed !== true && session.thinkingSinceMs)) {
     return "running";
   }
-  if (record.status === "error") {
+  if (sessions.some((session) => session.closed !== true && session.status === "error")) {
     return "error";
   }
   return "idle";
 }

-async function readPullRequestSummary(c: any, branchName: string | null) {
-  if (!branchName) {
-    return null;
-  }
-
-  try {
-    const repository = await getOrCreateRepository(c, c.state.organizationId, c.state.repoId, c.state.repoRemote);
-    return await repository.getPullRequestForBranch({ branchName });
-  } catch {
-    return null;
-  }
+export async function ensureWorkspaceSeeded(c: any): Promise {
+  return await getCurrentRecord(c);
 }

-export async function ensureWorkbenchSeeded(c: any): Promise {
-  await ensureTaskRuntimeCacheColumns(c);
-  const record = await getCurrentRecord({ db: c.db, state: c.state });
-  if (record.activeSessionId) {
-    await ensureSessionMeta(c, {
-      sessionId: record.activeSessionId,
-      sandboxSessionId: record.activeSessionId,
-      model: defaultModelForAgent(record.agentType),
-      sessionName: "Session 1",
-      status: "ready",
-    });
-  }
-  return record;
-}
-
-function buildSessionSummary(record: any, meta: any): any {
+function buildSessionSummary(meta: any, userState?: any): any {
   const derivedSandboxSessionId = meta.status === "ready" ? (meta.sandboxSessionId ?? null) : null;
   const sessionStatus =
     meta.status === "pending_provision" || meta.status === "pending_session_create"
       ? meta.status
-      : meta.status === "ready" && derivedSandboxSessionId
-        ? activeSessionStatus(record, derivedSandboxSessionId)
+      : meta.thinkingSinceMs
+        ? "running"
         : meta.status === "error"
           ? "error"
-          : "ready";
+          : meta.status === "ready" && derivedSandboxSessionId
+            ? "idle"
+            : "ready";

   let thinkingSinceMs = meta.thinkingSinceMs ?? null;
-  let unread = Boolean(meta.unread);
+  let unread = Boolean(userState?.unread);
   if (thinkingSinceMs && sessionStatus !== "running") {
     thinkingSinceMs = null;
     unread = true;
@@ -803,8 +1031,8 @@ function buildSessionSummary(record: any, meta: any): any {
   };
 }

-function buildSessionDetailFromMeta(record: any, meta: any): any {
-  const summary = buildSessionSummary(record, meta);
+function buildSessionDetailFromMeta(meta: any, userState?: any): any {
+  const summary = buildSessionSummary(meta, userState);
   return {
     sessionId: meta.sessionId,
     sandboxSessionId: summary.sandboxSessionId ?? null,
@@ -817,115 +1045,148 @@ function buildSessionDetailFromMeta(record: any, meta: any): any {
     created: summary.created,
     errorMessage: summary.errorMessage,
     draft: {
-      text: meta.draftText ?? "",
-      attachments: Array.isArray(meta.draftAttachments) ? meta.draftAttachments : [],
-      updatedAtMs: meta.draftUpdatedAtMs ??
null, + text: userState?.draftText ?? "", + attachments: Array.isArray(userState?.draftAttachments) ? userState.draftAttachments : [], + updatedAtMs: userState?.draftUpdatedAtMs ?? null, }, transcript: meta.transcript ?? [], }; } /** - * Builds a WorkbenchTaskSummary from local task actor state. Task actors push + * Builds a WorkspaceTaskSummary from local task actor state. Task actors push * this to the parent organization actor so organization sidebar reads stay local. */ -export async function buildTaskSummary(c: any): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function buildTaskSummary(c: any, authSessionId?: string | null): Promise { + const record = await ensureWorkspaceSeeded(c); + const repositoryMetadata = await getRepositoryMetadata(c); const sessions = await listSessionMetaRows(c); - await maybeScheduleWorkbenchRefreshes(c, record, sessions); + await maybeScheduleWorkspaceRefreshes(c, record, sessions); + const userTaskState = await getUserTaskState(c, authSessionId); + const taskStatus = computeWorkspaceTaskStatus(record, sessions); + const activeSessionId = + userTaskState.activeSessionId && sessions.some((meta) => meta.sessionId === userTaskState.activeSessionId) ? userTaskState.activeSessionId : null; + + const owner = await readTaskOwner(c); return { id: c.state.taskId, repoId: c.state.repoId, title: record.title ?? "New Task", - status: record.status ?? "new", - repoName: repoLabelFromRemote(c.state.repoRemote), + status: taskStatus, + repoName: repoLabelFromRemote(repositoryMetadata.remoteUrl), updatedAtMs: record.updatedAt, branch: record.branchName, - pullRequest: await readPullRequestSummary(c, record.branchName), - sessionsSummary: sessions.map((meta) => buildSessionSummary(record, meta)), + pullRequest: record.pullRequest ?? null, + activeSessionId, + sessionsSummary: sessions.map((meta) => buildSessionSummary(meta, userTaskState.bySessionId.get(meta.sessionId))), + primaryUserLogin: owner?.primaryGithubLogin ?? 
null, + primaryUserAvatarUrl: owner?.primaryGithubAvatarUrl ?? null, }; } /** - * Builds a WorkbenchTaskDetail from local task actor state for direct task + * Builds a WorkspaceTaskDetail from local task actor state for direct task * subscribers. This is a full replacement payload, not a patch. */ -export async function buildTaskDetail(c: any): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function buildTaskDetail(c: any, authSessionId?: string | null): Promise { + const record = await ensureWorkspaceSeeded(c); const gitState = await readCachedGitState(c); const sessions = await listSessionMetaRows(c); - await maybeScheduleWorkbenchRefreshes(c, record, sessions); - const summary = await buildTaskSummary(c); + await maybeScheduleWorkspaceRefreshes(c, record, sessions); + const summary = await buildTaskSummary(c, authSessionId); return { ...summary, task: record.task, - agentType: record.agentType === "claude" || record.agentType === "codex" ? record.agentType : null, - runtimeStatus: record.status, - statusMessage: record.statusMessage ?? null, - activeSessionId: record.activeSessionId ?? null, - diffStat: record.diffStat ?? null, - prUrl: record.prUrl ?? null, - reviewStatus: record.reviewStatus ?? null, fileChanges: gitState.fileChanges, diffs: gitState.diffs, fileTree: gitState.fileTree, minutesUsed: 0, - sandboxes: (record.sandboxes ?? []).map((sandbox: any) => ({ - sandboxProviderId: sandbox.sandboxProviderId, - sandboxId: sandbox.sandboxId, - cwd: sandbox.cwd ?? null, - })), + sandboxes: await Promise.all( + (record.sandboxes ?? 
[]).map(async (sandbox: any) => { + let url: string | null = null; + if (sandbox.sandboxId) { + try { + const handle = getTaskSandbox(c, c.state.organizationId, sandbox.sandboxId); + const conn = await handle.sandboxAgentConnection(); + if (conn?.endpoint && !conn.endpoint.startsWith("mock://")) { + url = conn.endpoint; + } + } catch { + // Sandbox may not be running + } + } + return { + sandboxProviderId: sandbox.sandboxProviderId, + sandboxId: sandbox.sandboxId, + cwd: sandbox.cwd ?? null, + url, + }; + }), + ), activeSandboxId: record.activeSandboxId ?? null, }; } /** - * Builds a WorkbenchSessionDetail for a specific session. + * Builds a WorkspaceSessionDetail for a specific session. */ -export async function buildSessionDetail(c: any, sessionId: string): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function buildSessionDetail(c: any, sessionId: string, authSessionId?: string | null): Promise { + const record = await ensureWorkspaceSeeded(c); const meta = await readSessionMeta(c, sessionId); if (!meta || meta.closed) { - throw new Error(`Unknown workbench session: ${sessionId}`); + throw new Error(`Unknown workspace session: ${sessionId}`); } + const userTaskState = await getUserTaskState(c, authSessionId); + const userSessionState = userTaskState.bySessionId.get(sessionId); - if (!meta.sandboxSessionId) { - return buildSessionDetailFromMeta(record, meta); + // Skip live transcript fetch if the sandbox session doesn't exist yet or + // the session is still provisioning — the sandbox API will block/timeout. + const isPending = meta.status === "pending_provision" || meta.status === "pending_session_create"; + if (!meta.sandboxSessionId || isPending) { + return buildSessionDetailFromMeta(meta, userSessionState); } try { const transcript = await readSessionTranscript(c, record, meta.sandboxSessionId); if (JSON.stringify(meta.transcript ?? 
[]) !== JSON.stringify(transcript)) { await writeSessionTranscript(c, meta.sessionId, transcript); - return buildSessionDetailFromMeta(record, { - ...meta, - transcript, - transcriptUpdatedAt: Date.now(), - }); + return buildSessionDetailFromMeta( + { + ...meta, + transcript, + transcriptUpdatedAt: Date.now(), + }, + userSessionState, + ); } - } catch { - // Session detail reads should degrade to cached transcript data if the live sandbox is unavailable. + } catch (error) { + // Session detail reads degrade to cached transcript when sandbox is unavailable. + logActorWarning("task", "readSessionTranscript failed, using cached transcript", { + taskId: c.state.taskId, + sessionId, + error: resolveErrorMessage(error), + }); } - return buildSessionDetailFromMeta(record, meta); + return buildSessionDetailFromMeta(meta, userSessionState); } export async function getTaskSummary(c: any): Promise { return await buildTaskSummary(c); } -export async function getTaskDetail(c: any): Promise { - return await buildTaskDetail(c); +export async function getTaskDetail(c: any, authSessionId?: string): Promise { + return await buildTaskDetail(c, authSessionId); } -export async function getSessionDetail(c: any, sessionId: string): Promise { - return await buildSessionDetail(c, sessionId); +export async function getSessionDetail(c: any, sessionId: string, authSessionId?: string): Promise { + return await buildSessionDetail(c, sessionId, authSessionId); } /** - * Replaces the old notifyWorkbenchUpdated pattern. + * Replaces the old notifyWorkspaceUpdated pattern. 
* * The task actor emits two kinds of updates: * - Push summary state up to the parent organization actor so the sidebar @@ -934,9 +1195,13 @@ export async function getSessionDetail(c: any, sessionId: string): Promise */ export async function broadcastTaskUpdate(c: any, options?: { sessionId?: string }): Promise { const organization = await getOrCreateOrganization(c, c.state.organizationId); - await organization.applyTaskSummaryUpdate({ taskSummary: await buildTaskSummary(c) }); + await organization.send( + organizationWorkflowQueueName("organization.command.applyTaskSummaryUpdate"), + { taskSummary: await buildTaskSummary(c) }, + { wait: false }, + ); c.broadcast("taskUpdated", { - type: "taskDetailUpdated", + type: "taskUpdated", detail: await buildTaskDetail(c), }); @@ -948,15 +1213,15 @@ export async function broadcastTaskUpdate(c: any, options?: { sessionId?: string } } -export async function refreshWorkbenchDerivedState(c: any): Promise { - const record = await ensureWorkbenchSeeded(c); - const gitState = await collectWorkbenchGitState(c, record); +export async function refreshWorkspaceDerivedState(c: any): Promise { + const record = await ensureWorkspaceSeeded(c); + const gitState = await collectWorkspaceGitState(c, record); await writeCachedGitState(c, gitState); await broadcastTaskUpdate(c); } -export async function refreshWorkbenchSessionTranscript(c: any, sessionId: string): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function refreshWorkspaceSessionTranscript(c: any, sessionId: string): Promise { + const record = await ensureWorkspaceSeeded(c); const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? 
(await readSessionMeta(c, sessionId)); if (!meta?.sandboxSessionId) { return; @@ -967,7 +1232,7 @@ export async function refreshWorkbenchSessionTranscript(c: any, sessionId: strin await broadcastTaskUpdate(c, { sessionId: meta.sessionId }); } -export async function renameWorkbenchTask(c: any, value: string): Promise { +export async function renameWorkspaceTask(c: any, value: string): Promise { const nextTitle = value.trim(); if (!nextTitle) { throw new Error("task title is required"); @@ -981,87 +1246,52 @@ export async function renameWorkbenchTask(c: any, value: string): Promise }) .where(eq(taskTable.id, 1)) .run(); - c.state.title = nextTitle; await broadcastTaskUpdate(c); } -export async function renameWorkbenchBranch(c: any, value: string): Promise { - const nextBranch = value.trim(); - if (!nextBranch) { - throw new Error("branch name is required"); - } - - const record = await ensureWorkbenchSeeded(c); - if (!record.branchName) { - throw new Error("cannot rename branch before task branch exists"); - } - if (!record.activeSandboxId) { - throw new Error("cannot rename branch without an active sandbox"); - } - const activeSandbox = (record.sandboxes ?? []).find((candidate: any) => candidate.sandboxId === record.activeSandboxId) ?? 
null; - if (!activeSandbox?.cwd) { - throw new Error("cannot rename branch without a sandbox cwd"); - } - - const renameResult = await executeInSandbox(c, { - sandboxId: record.activeSandboxId, - cwd: activeSandbox.cwd, - command: [ - `git branch -m ${JSON.stringify(record.branchName)} ${JSON.stringify(nextBranch)}`, - `if git ls-remote --exit-code --heads origin ${JSON.stringify(record.branchName)} >/dev/null 2>&1; then git push origin :${JSON.stringify(record.branchName)}; fi`, - `git push origin ${JSON.stringify(nextBranch)}`, - `git branch --set-upstream-to=${JSON.stringify(`origin/${nextBranch}`)} ${JSON.stringify(nextBranch)} || git push --set-upstream origin ${JSON.stringify(nextBranch)}`, - ].join(" && "), - label: `git branch -m ${record.branchName} ${nextBranch}`, - }); - if (renameResult.exitCode !== 0) { - throw new Error(`branch rename failed (${renameResult.exitCode}): ${renameResult.result}`); - } - +export async function syncTaskPullRequest(c: any, pullRequest: any): Promise { + const now = pullRequest?.updatedAtMs ?? Date.now(); await c.db .update(taskTable) .set({ - branchName: nextBranch, - updatedAt: Date.now(), + pullRequestJson: pullRequest ? JSON.stringify(pullRequest) : null, + updatedAt: now, }) .where(eq(taskTable.id, 1)) .run(); - c.state.branchName = nextBranch; - - const repository = await getOrCreateRepository(c, c.state.organizationId, c.state.repoId, c.state.repoRemote); - await repository.registerTaskBranch({ - taskId: c.state.taskId, - branchName: nextBranch, - }); await broadcastTaskUpdate(c); } -export async function createWorkbenchSession(c: any, model?: string): Promise<{ sessionId: string }> { +export async function createWorkspaceSession(c: any, model?: string, authSessionId?: string): Promise<{ sessionId: string }> { const sessionId = `session-${randomUUID()}`; - const record = await ensureWorkbenchSeeded(c); + const record = await ensureWorkspaceSeeded(c); await ensureSessionMeta(c, { sessionId, - model: model ?? 
defaultModelForAgent(record.agentType), + model: model ?? (await resolveDefaultModel(c, authSessionId)), + authSessionId, sandboxSessionId: null, - status: pendingWorkbenchSessionStatus(record), + status: pendingWorkspaceSessionStatus(record), created: false, }); + await upsertUserTaskState(c, authSessionId, sessionId, { + activeSessionId: sessionId, + unread: false, + }); await broadcastTaskUpdate(c, { sessionId: sessionId }); - await enqueueWorkbenchEnsureSession(c, sessionId); + await enqueueWorkspaceEnsureSession(c, sessionId); return { sessionId }; } -export async function ensureWorkbenchSession(c: any, sessionId: string, model?: string): Promise { +export async function ensureWorkspaceSession(c: any, sessionId: string, model?: string, authSessionId?: string): Promise { + const ensureStart = performance.now(); const meta = await readSessionMeta(c, sessionId); if (!meta || meta.closed) { return; } - const record = await ensureWorkbenchSeeded(c); + const record = await ensureWorkspaceSeeded(c); if (meta.sandboxSessionId && meta.status === "ready") { - await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { - sessionId: meta.sandboxSessionId, - }); + fireRefreshSessionTranscript(c, meta.sandboxSessionId); await broadcastTaskUpdate(c, { sessionId: sessionId }); return; } @@ -1073,25 +1303,35 @@ export async function ensureWorkbenchSession(c: any, sessionId: string, model?: }); try { + const t0 = performance.now(); const runtime = await getTaskSandboxRuntime(c, record); + logActorInfo("task.session", "getTaskSandboxRuntime", { sessionId, durationMs: Math.round(performance.now() - t0) }); + + const t1 = performance.now(); await ensureSandboxRepo(c, runtime.sandbox, record); + logActorInfo("task.session", "ensureSandboxRepo", { sessionId, durationMs: Math.round(performance.now() - t1) }); + + const resolvedModel = model ?? meta.model ?? 
(await resolveDefaultModel(c, authSessionId)); + const resolvedAgent = await resolveSandboxAgentForModel(c, resolvedModel); + + const t2 = performance.now(); await runtime.sandbox.createSession({ id: meta.sandboxSessionId ?? sessionId, - agent: agentTypeForModel(model ?? meta.model ?? defaultModelForAgent(record.agentType)), - model: model ?? meta.model ?? defaultModelForAgent(record.agentType), + agent: resolvedAgent, + model: resolvedModel, sessionInit: { cwd: runtime.cwd, }, }); + logActorInfo("task.session", "createSession", { sessionId, agent: resolvedAgent, model: resolvedModel, durationMs: Math.round(performance.now() - t2) }); await updateSessionMeta(c, sessionId, { sandboxSessionId: meta.sandboxSessionId ?? sessionId, status: "ready", errorMessage: null, }); - await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { - sessionId: meta.sandboxSessionId ?? sessionId, - }); + logActorInfo("task.session", "ensureWorkspaceSession complete", { sessionId, totalDurationMs: Math.round(performance.now() - ensureStart) }); + fireRefreshSessionTranscript(c, meta.sandboxSessionId ?? 
sessionId); } catch (error) { await updateSessionMeta(c, sessionId, { status: "error", @@ -1102,27 +1342,18 @@ export async function ensureWorkbenchSession(c: any, sessionId: string, model?: await broadcastTaskUpdate(c, { sessionId: sessionId }); } -export async function enqueuePendingWorkbenchSessions(c: any): Promise { - const self = selfTask(c); +export async function enqueuePendingWorkspaceSessions(c: any): Promise { const pending = (await listSessionMetaRows(c, { includeClosed: true })).filter( (row) => row.closed !== true && row.status !== "ready" && row.status !== "error", ); + const self = selfTask(c); for (const row of pending) { - await self.send( - "task.command.workbench.ensure_session", - { - sessionId: row.sessionId, - model: row.model, - }, - { - wait: false, - }, - ); + await self.send(taskWorkflowQueueName("task.command.workspace.ensure_session" as any), { sessionId: row.sessionId, model: row.model }, { wait: false }); } } -export async function renameWorkbenchSession(c: any, sessionId: string, title: string): Promise { +export async function renameWorkspaceSession(c: any, sessionId: string, title: string): Promise { const trimmed = title.trim(); if (!trimmed) { throw new Error("session title is required"); @@ -1133,15 +1364,26 @@ export async function renameWorkbenchSession(c: any, sessionId: string, title: s await broadcastTaskUpdate(c, { sessionId }); } -export async function setWorkbenchSessionUnread(c: any, sessionId: string, unread: boolean): Promise { - await updateSessionMeta(c, sessionId, { - unread: unread ? 
1 : 0, +export async function selectWorkspaceSession(c: any, sessionId: string, authSessionId?: string): Promise { + const meta = await readSessionMeta(c, sessionId); + if (!meta || meta.closed) { + return; + } + await upsertUserTaskState(c, authSessionId, sessionId, { + activeSessionId: sessionId, }); await broadcastTaskUpdate(c, { sessionId }); } -export async function updateWorkbenchDraft(c: any, sessionId: string, text: string, attachments: Array): Promise { - await updateSessionMeta(c, sessionId, { +export async function setWorkspaceSessionUnread(c: any, sessionId: string, unread: boolean, authSessionId?: string): Promise { + await upsertUserTaskState(c, authSessionId, sessionId, { + unread, + }); + await broadcastTaskUpdate(c, { sessionId }); +} + +export async function updateWorkspaceDraft(c: any, sessionId: string, text: string, attachments: Array, authSessionId?: string): Promise { + await upsertUserTaskState(c, authSessionId, sessionId, { draftText: text, draftAttachmentsJson: JSON.stringify(attachments), draftUpdatedAt: Date.now(), @@ -1149,7 +1391,7 @@ export async function updateWorkbenchDraft(c: any, sessionId: string, text: stri await broadcastTaskUpdate(c, { sessionId }); } -export async function changeWorkbenchModel(c: any, sessionId: string, model: string): Promise { +export async function changeWorkspaceModel(c: any, sessionId: string, model: string, _authSessionId?: string): Promise { const meta = await readSessionMeta(c, sessionId); if (!meta || meta.closed) { return; @@ -1159,7 +1401,7 @@ export async function changeWorkbenchModel(c: any, sessionId: string, model: str return; } - const record = await ensureWorkbenchSeeded(c); + const record = await ensureWorkspaceSeeded(c); let nextMeta = await updateSessionMeta(c, sessionId, { model, }); @@ -1170,7 +1412,7 @@ export async function changeWorkbenchModel(c: any, sessionId: string, model: str await sandbox.destroySession(nextMeta.sandboxSessionId); nextMeta = await updateSessionMeta(c, sessionId, 
{ sandboxSessionId: null, - status: pendingWorkbenchSessionStatus(record), + status: pendingWorkspaceSessionStatus(record), errorMessage: null, transcriptJson: "[]", transcriptUpdatedAt: null, @@ -1191,24 +1433,38 @@ export async function changeWorkbenchModel(c: any, sessionId: string, model: str } } else if (nextMeta.status !== "ready") { nextMeta = await updateSessionMeta(c, sessionId, { - status: pendingWorkbenchSessionStatus(record), + status: pendingWorkspaceSessionStatus(record), errorMessage: null, }); } if (shouldEnsure) { - await enqueueWorkbenchEnsureSession(c, sessionId); + await enqueueWorkspaceEnsureSession(c, sessionId); } await broadcastTaskUpdate(c, { sessionId }); } -export async function sendWorkbenchMessage(c: any, sessionId: string, text: string, attachments: Array): Promise { +export async function sendWorkspaceMessage(c: any, sessionId: string, text: string, attachments: Array, authSessionId?: string): Promise { + const sendStart = performance.now(); const meta = requireSendableSessionMeta(await readSessionMeta(c, sessionId), sessionId); - const record = await ensureWorkbenchSeeded(c); + const record = await ensureWorkspaceSeeded(c); + + const t0 = performance.now(); const runtime = await getTaskSandboxRuntime(c, record); + logActorInfo("task.message", "getTaskSandboxRuntime", { sessionId, durationMs: Math.round(performance.now() - t0) }); + + const t1 = performance.now(); // Skip git fetch on subsequent messages — the repo was already prepared during session // creation. This avoids a 5-30s network round-trip to GitHub on every prompt. - await ensureSandboxRepo(c, runtime.sandbox, record, { skipFetchIfPrepared: true }); + await ensureSandboxRepo(c, runtime.sandbox, record, { skipFetchIfPrepared: true, authSessionId }); + logActorInfo("task.message", "ensureSandboxRepo", { sessionId, durationMs: Math.round(performance.now() - t1) }); + + // Check if the task owner needs to swap. 
If a different user is sending this message, + // update the owner record and inject their git credentials into the sandbox. + const ownerSwapped = await maybeSwapTaskOwner(c, authSessionId, runtime.sandbox); + if (ownerSwapped) { + await broadcastTaskUpdate(c); + } const prompt = [text.trim(), ...attachments.map((attachment: any) => `@ ${attachment.filePath}:${attachment.lineNumber}\n${attachment.lineContent}`)].filter( Boolean, ); @@ -1217,42 +1473,39 @@ export async function sendWorkbenchMessage(c: any, sessionId: string, text: stri } await updateSessionMeta(c, sessionId, { - unread: 0, created: 1, + thinkingSinceMs: Date.now(), + }); + await upsertUserTaskState(c, authSessionId, sessionId, { + unread: false, draftText: "", draftAttachmentsJson: "[]", draftUpdatedAt: Date.now(), - thinkingSinceMs: Date.now(), + activeSessionId: sessionId, }); - await c.db - .update(taskRuntime) - .set({ - activeSessionId: meta.sandboxSessionId, - updatedAt: Date.now(), - }) - .where(eq(taskRuntime.id, 1)) - .run(); - - await syncWorkbenchSessionStatus(c, meta.sandboxSessionId, "running", Date.now()); + await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "running", Date.now()); try { + const t2 = performance.now(); await runtime.sandbox.sendPrompt({ sessionId: meta.sandboxSessionId, prompt: prompt.join("\n\n"), }); - await syncWorkbenchSessionStatus(c, meta.sandboxSessionId, "idle", Date.now()); + logActorInfo("task.message", "sendPrompt", { sessionId, durationMs: Math.round(performance.now() - t2) }); + await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "idle", Date.now()); } catch (error) { await updateSessionMeta(c, sessionId, { status: "error", errorMessage: error instanceof Error ? 
error.message : String(error), }); - await syncWorkbenchSessionStatus(c, meta.sandboxSessionId, "error", Date.now()); + await syncWorkspaceSessionStatus(c, meta.sandboxSessionId, "error", Date.now()); throw error; } + logActorInfo("task.message", "sendWorkspaceMessage complete", { sessionId, totalDurationMs: Math.round(performance.now() - sendStart) }); } -export async function stopWorkbenchSession(c: any, sessionId: string): Promise { +export async function stopWorkspaceSession(c: any, sessionId: string): Promise { const meta = await requireReadySessionMeta(c, sessionId); const sandbox = getTaskSandbox(c, c.state.organizationId, stableSandboxId(c)); await sandbox.destroySession(meta.sandboxSessionId); @@ -1262,39 +1515,10 @@ export async function stopWorkbenchSession(c: any, sessionId: string): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function syncWorkspaceSessionStatus(c: any, sessionId: string, status: "running" | "idle" | "error", at: number): Promise { const meta = (await readSessionMetaBySandboxSessionId(c, sessionId)) ?? (await ensureSessionMeta(c, { sessionId: sessionId, sandboxSessionId: sessionId })); let changed = false; - if (record.activeSessionId === sessionId || record.activeSessionId === meta.sandboxSessionId) { - const mappedStatus = status === "running" ? "running" : status === "error" ? 
"error" : "idle"; - if (record.status !== mappedStatus) { - await c.db - .update(taskTable) - .set({ - status: mappedStatus, - updatedAt: at, - }) - .where(eq(taskTable.id, 1)) - .run(); - changed = true; - } - - const statusMessage = `session:${status}`; - if (record.statusMessage !== statusMessage) { - await c.db - .update(taskRuntime) - .set({ - statusMessage, - updatedAt: at, - }) - .where(eq(taskRuntime.id, 1)) - .run(); - changed = true; - } - } - if (status === "running") { if (!meta.thinkingSinceMs) { await updateSessionMeta(c, sessionId, { @@ -1309,27 +1533,28 @@ export async function syncWorkbenchSessionStatus(c: any, sessionId: string, stat }); changed = true; } - if (!meta.unread && shouldMarkSessionUnreadForStatus(meta, status)) { - await updateSessionMeta(c, sessionId, { - unread: 1, - }); - changed = true; - } } if (changed) { - await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_session_transcript", { - sessionId, - }); + const sessions = await listSessionMetaRows(c, { includeClosed: true }); + const nextStatus = computeWorkspaceTaskStatus(await ensureWorkspaceSeeded(c), sessions); + await c.db + .update(taskTable) + .set({ + status: nextStatus, + updatedAt: at, + }) + .where(eq(taskTable.id, 1)) + .run(); + fireRefreshSessionTranscript(c, sessionId); if (status !== "running") { - await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {}); + fireRefreshDerived(c); } await broadcastTaskUpdate(c, { sessionId: meta.sessionId }); } } -export async function closeWorkbenchSession(c: any, sessionId: string): Promise { - const record = await ensureWorkbenchSeeded(c); +export async function closeWorkspaceSession(c: any, sessionId: string, authSessionId?: string): Promise { const sessions = await listSessionMetaRows(c); if (sessions.filter((candidate) => candidate.closed !== true).length <= 1) { return; @@ -1347,61 +1572,63 @@ export async function closeWorkbenchSession(c: any, sessionId: string): Promise< closed: 1, 
 		thinkingSinceMs: null,
 	});
-	if (record.activeSessionId === sessionId || record.activeSessionId === meta.sandboxSessionId) {
-		await c.db
-			.update(taskRuntime)
-			.set({
-				activeSessionId: null,
-				updatedAt: Date.now(),
-			})
-			.where(eq(taskRuntime.id, 1))
-			.run();
+	const remainingSessions = sessions.filter((candidate) => candidate.sessionId !== sessionId && candidate.closed !== true);
+	const userTaskState = await getUserTaskState(c, authSessionId);
+	if (userTaskState.activeSessionId === sessionId && remainingSessions[0]) {
+		await upsertUserTaskState(c, authSessionId, remainingSessions[0].sessionId, {
+			activeSessionId: remainingSessions[0].sessionId,
+		});
 	}
+	await deleteUserTaskState(c, authSessionId, sessionId);
 	await broadcastTaskUpdate(c);
 }

-export async function markWorkbenchUnread(c: any): Promise<void> {
+export async function markWorkspaceUnread(c: any, authSessionId?: string): Promise<void> {
 	const sessions = await listSessionMetaRows(c);
 	const latest = sessions[sessions.length - 1];
 	if (!latest) {
 		return;
 	}
-	await updateSessionMeta(c, latest.sessionId, {
-		unread: 1,
+	await upsertUserTaskState(c, authSessionId, latest.sessionId, {
+		unread: true,
 	});
 	await broadcastTaskUpdate(c, { sessionId: latest.sessionId });
 }

-export async function publishWorkbenchPr(c: any): Promise<void> {
-	const record = await ensureWorkbenchSeeded(c);
+export async function publishWorkspacePr(c: any): Promise<void> {
+	const record = await ensureWorkspaceSeeded(c);
 	if (!record.branchName) {
 		throw new Error("cannot publish PR without a branch");
 	}
-	const repository = await getOrCreateRepository(c, c.state.organizationId, c.state.repoId, c.state.repoRemote);
-	const metadata = await repository.getRepositoryMetadata({});
-	const repoFullName = metadata.fullName ?? githubRepoFullNameFromRemote(c.state.repoRemote);
+	const metadata = await getRepositoryMetadata(c);
+	const repoFullName = metadata.fullName ?? githubRepoFullNameFromRemote(metadata.remoteUrl);
 	if (!repoFullName) {
-		throw new Error(`Unable to resolve GitHub repository for ${c.state.repoRemote}`);
+		throw new Error(`Unable to resolve GitHub repository for ${metadata.remoteUrl}`);
 	}
 	const { driver } = getActorRuntimeContext();
 	const auth = await resolveOrganizationGithubAuth(c, c.state.organizationId);
-	await driver.github.createPr(repoFullName, record.branchName, record.title ?? c.state.task, undefined, {
+	const created = await driver.github.createPr(repoFullName, record.branchName, record.title ?? record.task, undefined, {
 		githubToken: auth?.githubToken ?? null,
 		baseBranch: metadata.defaultBranch ?? undefined,
 	});
-	await c.db
-		.update(taskTable)
-		.set({
-			prSubmitted: 1,
-			updatedAt: Date.now(),
-		})
-		.where(eq(taskTable.id, 1))
-		.run();
-	await broadcastTaskUpdate(c);
+	await syncTaskPullRequest(c, {
+		number: created.number,
+		status: "ready",
+		title: record.title ?? record.task,
+		body: null,
+		state: "open",
+		url: created.url,
+		headRefName: record.branchName,
+		baseRefName: metadata.defaultBranch ?? "main",
+		authorLogin: null,
+		isDraft: false,
+		merged: false,
+		updatedAtMs: Date.now(),
+	});
 }

-export async function revertWorkbenchFile(c: any, path: string): Promise<void> {
-	const record = await ensureWorkbenchSeeded(c);
+export async function revertWorkspaceFile(c: any, path: string): Promise<void> {
+	const record = await ensureWorkspaceSeeded(c);
 	if (!record.activeSandboxId) {
 		throw new Error("cannot revert file without an active sandbox");
 	}
@@ -1419,6 +1646,6 @@ export async function revertWorkbenchFile(c: any, path: string): Promise<void> {
 	if (result.exitCode !== 0) {
 		throw new Error(`file revert failed (${result.exitCode}): ${result.result}`);
 	}
-	await enqueueWorkbenchRefresh(c, "task.command.workbench.refresh_derived", {});
+	fireRefreshDerived(c);
 	await broadcastTaskUpdate(c);
 }
diff --git a/foundry/packages/backend/src/actors/user/actions/better-auth.ts b/foundry/packages/backend/src/actors/user/actions/better-auth.ts
new file mode 100644
index 0000000..3ef8656
--- /dev/null
+++ b/foundry/packages/backend/src/actors/user/actions/better-auth.ts
@@ -0,0 +1,105 @@
+import { asc, count as sqlCount, desc } from "drizzle-orm";
+import { applyJoinToRow, applyJoinToRows, buildWhere, columnFor, materializeRow, persistInput, persistPatch, tableFor } from "../query-helpers.js";
+
+// Exception to the CLAUDE.md queue-for-mutations rule: Better Auth adapter operations
+// use direct actions even for mutations. Better Auth runs during OAuth callbacks on the
+// HTTP request path, not through the normal organization lifecycle. Routing through the
+// queue adds multiple sequential round-trips (each with actor wake-up + step overhead)
+// that cause 30-second OAuth callbacks and proxy retry storms. These mutations are simple
+// SQLite upserts/deletes with no cross-actor coordination or broadcast side effects.
+export const betterAuthActions = {
+	// --- Mutation actions ---
+	async betterAuthCreateRecord(c, input: { model: string; data: Record<string, unknown> }) {
+		const table = tableFor(input.model);
+		const persisted = persistInput(input.model, input.data);
+		await c.db
+			.insert(table)
+			.values(persisted as any)
+			.run();
+		const row = await c.db
+			.select()
+			.from(table)
+			.where(buildWhere(table, [{ field: "id", value: input.data.id }])!)
+			.get();
+		return materializeRow(input.model, row);
+	},
+
+	async betterAuthUpdateRecord(c, input: { model: string; where: any[]; update: Record<string, unknown> }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		if (!predicate) throw new Error("betterAuthUpdateRecord requires a where clause");
+		await c.db
+			.update(table)
+			.set(persistPatch(input.model, input.update) as any)
+			.where(predicate)
+			.run();
+		return materializeRow(input.model, await c.db.select().from(table).where(predicate).get());
+	},
+
+	async betterAuthUpdateManyRecords(c, input: { model: string; where: any[]; update: Record<string, unknown> }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		if (!predicate) throw new Error("betterAuthUpdateManyRecords requires a where clause");
+		await c.db
+			.update(table)
+			.set(persistPatch(input.model, input.update) as any)
+			.where(predicate)
+			.run();
+		const row = await c.db.select({ value: sqlCount() }).from(table).where(predicate).get();
+		return row?.value ?? 0;
+	},
+
+	async betterAuthDeleteRecord(c, input: { model: string; where: any[] }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		if (!predicate) throw new Error("betterAuthDeleteRecord requires a where clause");
+		await c.db.delete(table).where(predicate).run();
+	},
+
+	async betterAuthDeleteManyRecords(c, input: { model: string; where: any[] }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		if (!predicate) throw new Error("betterAuthDeleteManyRecords requires a where clause");
+		const rows = await c.db.select().from(table).where(predicate).all();
+		await c.db.delete(table).where(predicate).run();
+		return rows.length;
+	},
+
+	// --- Read actions ---
+	async betterAuthFindOneRecord(c, input: { model: string; where: any[]; join?: any }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		const row = predicate ? await c.db.select().from(table).where(predicate).get() : await c.db.select().from(table).get();
+		return await applyJoinToRow(c, input.model, row ?? null, input.join);
+	},
+
+	async betterAuthFindManyRecords(c, input: { model: string; where?: any[]; limit?: number; offset?: number; sortBy?: any; join?: any }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		let query: any = c.db.select().from(table);
+		if (predicate) {
+			query = query.where(predicate);
+		}
+		if (input.sortBy?.field) {
+			const column = columnFor(input.model, table, input.sortBy.field);
+			query = query.orderBy(input.sortBy.direction === "asc" ? asc(column) : desc(column));
+		}
+		if (typeof input.limit === "number") {
+			query = query.limit(input.limit);
+		}
+		if (typeof input.offset === "number") {
+			query = query.offset(input.offset);
+		}
+		const rows = await query.all();
+		return await applyJoinToRows(c, input.model, rows, input.join);
+	},
+
+	async betterAuthCountRecords(c, input: { model: string; where?: any[] }) {
+		const table = tableFor(input.model);
+		const predicate = buildWhere(table, input.where);
+		const row = predicate
+			? await c.db.select({ value: sqlCount() }).from(table).where(predicate).get()
+			: await c.db.select({ value: sqlCount() }).from(table).get();
+		return row?.value ?? 0;
+	},
+};
diff --git a/foundry/packages/backend/src/actors/user/actions/user.ts b/foundry/packages/backend/src/actors/user/actions/user.ts
new file mode 100644
index 0000000..f251c95
--- /dev/null
+++ b/foundry/packages/backend/src/actors/user/actions/user.ts
@@ -0,0 +1,188 @@
+import { eq, and } from "drizzle-orm";
+import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared";
+import { authAccounts, authSessions, authUsers, sessionState, userProfiles, userTaskState } from "../db/schema.js";
+import { materializeRow } from "../query-helpers.js";
+
+export const userActions = {
+	// Custom Foundry action — not part of Better Auth.
+ async getAppAuthState(c, input: { sessionId: string }) { + const session = await c.db.select().from(authSessions).where(eq(authSessions.id, input.sessionId)).get(); + if (!session) { + return null; + } + const [user, profile, currentSessionState, accounts] = await Promise.all([ + c.db.select().from(authUsers).where(eq(authUsers.authUserId, session.userId)).get(), + c.db.select().from(userProfiles).where(eq(userProfiles.userId, session.userId)).get(), + c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(), + c.db.select().from(authAccounts).where(eq(authAccounts.userId, session.userId)).all(), + ]); + return { + session, + user: materializeRow("user", user), + profile: profile ?? null, + sessionState: currentSessionState ?? null, + accounts, + }; + }, + + // Custom Foundry action — not part of Better Auth. + async getTaskState(c, input: { taskId: string }) { + const rows = await c.db.select().from(userTaskState).where(eq(userTaskState.taskId, input.taskId)).all(); + const activeSessionId = rows.find((row) => typeof row.activeSessionId === "string" && row.activeSessionId.length > 0)?.activeSessionId ?? null; + return { + taskId: input.taskId, + activeSessionId, + sessions: rows.map((row) => ({ + sessionId: row.sessionId, + unread: row.unread === 1, + draftText: row.draftText, + draftAttachmentsJson: row.draftAttachmentsJson, + draftUpdatedAt: row.draftUpdatedAt ?? 
null, + updatedAt: row.updatedAt, + })), + }; + }, + + // --- Mutation actions (migrated from queue) --- + + async upsertProfile( + c, + input: { + userId: string; + patch: { + githubAccountId?: string | null; + githubLogin?: string | null; + roleLabel?: string; + defaultModel?: string; + eligibleOrganizationIdsJson?: string; + starterRepoStatus?: string; + starterRepoStarredAt?: number | null; + starterRepoSkippedAt?: number | null; + }; + }, + ) { + const now = Date.now(); + await c.db + .insert(userProfiles) + .values({ + id: 1, + userId: input.userId, + githubAccountId: input.patch.githubAccountId ?? null, + githubLogin: input.patch.githubLogin ?? null, + roleLabel: input.patch.roleLabel ?? "GitHub user", + defaultModel: input.patch.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID, + eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson ?? "[]", + starterRepoStatus: input.patch.starterRepoStatus ?? "pending", + starterRepoStarredAt: input.patch.starterRepoStarredAt ?? null, + starterRepoSkippedAt: input.patch.starterRepoSkippedAt ?? null, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: userProfiles.userId, + set: { + ...(input.patch.githubAccountId !== undefined ? { githubAccountId: input.patch.githubAccountId } : {}), + ...(input.patch.githubLogin !== undefined ? { githubLogin: input.patch.githubLogin } : {}), + ...(input.patch.roleLabel !== undefined ? { roleLabel: input.patch.roleLabel } : {}), + ...(input.patch.defaultModel !== undefined ? { defaultModel: input.patch.defaultModel } : {}), + ...(input.patch.eligibleOrganizationIdsJson !== undefined ? { eligibleOrganizationIdsJson: input.patch.eligibleOrganizationIdsJson } : {}), + ...(input.patch.starterRepoStatus !== undefined ? { starterRepoStatus: input.patch.starterRepoStatus } : {}), + ...(input.patch.starterRepoStarredAt !== undefined ? { starterRepoStarredAt: input.patch.starterRepoStarredAt } : {}), + ...(input.patch.starterRepoSkippedAt !== undefined ? 
{ starterRepoSkippedAt: input.patch.starterRepoSkippedAt } : {}), + updatedAt: now, + }, + }) + .run(); + return await c.db.select().from(userProfiles).where(eq(userProfiles.userId, input.userId)).get(); + }, + + async upsertSessionState(c, input: { sessionId: string; activeOrganizationId: string | null }) { + const now = Date.now(); + await c.db + .insert(sessionState) + .values({ + sessionId: input.sessionId, + activeOrganizationId: input.activeOrganizationId, + createdAt: now, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: sessionState.sessionId, + set: { activeOrganizationId: input.activeOrganizationId, updatedAt: now }, + }) + .run(); + return await c.db.select().from(sessionState).where(eq(sessionState.sessionId, input.sessionId)).get(); + }, + + async upsertTaskState( + c, + input: { + taskId: string; + sessionId: string; + patch: { + activeSessionId?: string | null; + unread?: boolean; + draftText?: string; + draftAttachmentsJson?: string; + draftUpdatedAt?: number | null; + }; + }, + ) { + const now = Date.now(); + const existing = await c.db + .select() + .from(userTaskState) + .where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId))) + .get(); + + if (input.patch.activeSessionId !== undefined) { + await c.db + .update(userTaskState) + .set({ activeSessionId: input.patch.activeSessionId, updatedAt: now }) + .where(eq(userTaskState.taskId, input.taskId)) + .run(); + } + + await c.db + .insert(userTaskState) + .values({ + taskId: input.taskId, + sessionId: input.sessionId, + activeSessionId: input.patch.activeSessionId ?? existing?.activeSessionId ?? null, + unread: input.patch.unread !== undefined ? (input.patch.unread ? 1 : 0) : (existing?.unread ?? 0), + draftText: input.patch.draftText ?? existing?.draftText ?? "", + draftAttachmentsJson: input.patch.draftAttachmentsJson ?? existing?.draftAttachmentsJson ?? "[]", + draftUpdatedAt: input.patch.draftUpdatedAt === undefined ? (existing?.draftUpdatedAt ?? 
null) : input.patch.draftUpdatedAt, + updatedAt: now, + }) + .onConflictDoUpdate({ + target: [userTaskState.taskId, userTaskState.sessionId], + set: { + ...(input.patch.activeSessionId !== undefined ? { activeSessionId: input.patch.activeSessionId } : {}), + ...(input.patch.unread !== undefined ? { unread: input.patch.unread ? 1 : 0 } : {}), + ...(input.patch.draftText !== undefined ? { draftText: input.patch.draftText } : {}), + ...(input.patch.draftAttachmentsJson !== undefined ? { draftAttachmentsJson: input.patch.draftAttachmentsJson } : {}), + ...(input.patch.draftUpdatedAt !== undefined ? { draftUpdatedAt: input.patch.draftUpdatedAt } : {}), + updatedAt: now, + }, + }) + .run(); + + return await c.db + .select() + .from(userTaskState) + .where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId))) + .get(); + }, + + async deleteTaskState(c, input: { taskId: string; sessionId?: string }) { + if (input.sessionId) { + await c.db + .delete(userTaskState) + .where(and(eq(userTaskState.taskId, input.taskId), eq(userTaskState.sessionId, input.sessionId))) + .run(); + return; + } + await c.db.delete(userTaskState).where(eq(userTaskState.taskId, input.taskId)).run(); + }, +}; diff --git a/foundry/packages/backend/src/actors/history/db/db.ts b/foundry/packages/backend/src/actors/user/db/db.ts similarity index 70% rename from foundry/packages/backend/src/actors/history/db/db.ts rename to foundry/packages/backend/src/actors/user/db/db.ts index ef76e36..a864893 100644 --- a/foundry/packages/backend/src/actors/history/db/db.ts +++ b/foundry/packages/backend/src/actors/user/db/db.ts @@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle"; import * as schema from "./schema.js"; import migrations from "./migrations.js"; -export const historyDb = db({ schema, migrations }); +export const userDb = db({ schema, migrations }); diff --git a/foundry/packages/backend/src/actors/auth-user/db/migrations.ts 
b/foundry/packages/backend/src/actors/user/db/migrations.ts similarity index 65% rename from foundry/packages/backend/src/actors/auth-user/db/migrations.ts rename to foundry/packages/backend/src/actors/user/db/migrations.ts index be7cb17..da92bdc 100644 --- a/foundry/packages/backend/src/actors/auth-user/db/migrations.ts +++ b/foundry/packages/backend/src/actors/user/db/migrations.ts @@ -10,6 +10,12 @@ const journal = { tag: "0000_auth_user", breakpoints: true, }, + { + idx: 1, + when: 1773532800000, + tag: "0001_user_task_state", + breakpoints: true, + }, ], } as const; @@ -17,15 +23,19 @@ export default { journal, migrations: { m0000: `CREATE TABLE \`user\` ( - \`id\` text PRIMARY KEY NOT NULL, + \`id\` integer PRIMARY KEY NOT NULL, + \`auth_user_id\` text NOT NULL, \`name\` text NOT NULL, \`email\` text NOT NULL, \`email_verified\` integer NOT NULL, \`image\` text, \`created_at\` integer NOT NULL, - \`updated_at\` integer NOT NULL + \`updated_at\` integer NOT NULL, + CONSTRAINT \`user_singleton_id_check\` CHECK(\`id\` = 1) ); --> statement-breakpoint +CREATE UNIQUE INDEX \`user_auth_user_id_idx\` ON \`user\` (\`auth_user_id\`); +--> statement-breakpoint CREATE TABLE \`session\` ( \`id\` text PRIMARY KEY NOT NULL, \`token\` text NOT NULL, @@ -58,23 +68,39 @@ CREATE TABLE \`account\` ( CREATE UNIQUE INDEX \`account_provider_account_idx\` ON \`account\` (\`provider_id\`, \`account_id\`); --> statement-breakpoint CREATE TABLE \`user_profiles\` ( - \`user_id\` text PRIMARY KEY NOT NULL, + \`id\` integer PRIMARY KEY NOT NULL, + \`user_id\` text NOT NULL, \`github_account_id\` text, \`github_login\` text, \`role_label\` text NOT NULL, + \`default_model\` text DEFAULT 'gpt-5.3-codex' NOT NULL, \`eligible_organization_ids_json\` text NOT NULL, \`starter_repo_status\` text NOT NULL, \`starter_repo_starred_at\` integer, \`starter_repo_skipped_at\` integer, \`created_at\` integer NOT NULL, - \`updated_at\` integer NOT NULL + \`updated_at\` integer NOT NULL, + CONSTRAINT 
\`user_profiles_singleton_id_check\` CHECK(\`id\` = 1) ); --> statement-breakpoint +CREATE UNIQUE INDEX \`user_profiles_user_id_idx\` ON \`user_profiles\` (\`user_id\`); +--> statement-breakpoint CREATE TABLE \`session_state\` ( \`session_id\` text PRIMARY KEY NOT NULL, \`active_organization_id\` text, \`created_at\` integer NOT NULL, \`updated_at\` integer NOT NULL +);`, + m0001: `CREATE TABLE \`user_task_state\` ( + \`task_id\` text NOT NULL, + \`session_id\` text NOT NULL, + \`active_session_id\` text, + \`unread\` integer DEFAULT 0 NOT NULL, + \`draft_text\` text DEFAULT '' NOT NULL, + \`draft_attachments_json\` text DEFAULT '[]' NOT NULL, + \`draft_updated_at\` integer, + \`updated_at\` integer NOT NULL, + PRIMARY KEY(\`task_id\`, \`session_id\`) );`, } as const, }; diff --git a/foundry/packages/backend/src/actors/user/db/schema.ts b/foundry/packages/backend/src/actors/user/db/schema.ts new file mode 100644 index 0000000..6a87a11 --- /dev/null +++ b/foundry/packages/backend/src/actors/user/db/schema.ts @@ -0,0 +1,112 @@ +import { check, integer, primaryKey, sqliteTable, text, uniqueIndex } from "drizzle-orm/sqlite-core"; +import { sql } from "drizzle-orm"; +import { DEFAULT_WORKSPACE_MODEL_ID } from "@sandbox-agent/foundry-shared"; + +/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */ +export const authUsers = sqliteTable( + "user", + { + id: integer("id").primaryKey(), + authUserId: text("auth_user_id").notNull(), + name: text("name").notNull(), + email: text("email").notNull(), + emailVerified: integer("email_verified").notNull(), + image: text("image"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + }, + (table) => ({ + authUserIdIdx: uniqueIndex("user_auth_user_id_idx").on(table.authUserId), + singletonCheck: check("user_singleton_id_check", sql`${table.id} = 1`), + }), +); + +/** Better Auth core model — schema defined at 
https://better-auth.com/docs/concepts/database */ +export const authSessions = sqliteTable( + "session", + { + id: text("id").notNull().primaryKey(), + token: text("token").notNull(), + userId: text("user_id").notNull(), + expiresAt: integer("expires_at").notNull(), + ipAddress: text("ip_address"), + userAgent: text("user_agent"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + }, + (table) => ({ + tokenIdx: uniqueIndex("session_token_idx").on(table.token), + }), +); + +/** Better Auth core model — schema defined at https://better-auth.com/docs/concepts/database */ +export const authAccounts = sqliteTable( + "account", + { + id: text("id").notNull().primaryKey(), + accountId: text("account_id").notNull(), + providerId: text("provider_id").notNull(), + userId: text("user_id").notNull(), + accessToken: text("access_token"), + refreshToken: text("refresh_token"), + idToken: text("id_token"), + accessTokenExpiresAt: integer("access_token_expires_at"), + refreshTokenExpiresAt: integer("refresh_token_expires_at"), + scope: text("scope"), + password: text("password"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + }, + (table) => ({ + providerAccountIdx: uniqueIndex("account_provider_account_idx").on(table.providerId, table.accountId), + }), +); + +/** Custom Foundry table — not part of Better Auth. 
*/ +export const userProfiles = sqliteTable( + "user_profiles", + { + id: integer("id").primaryKey(), + userId: text("user_id").notNull(), + githubAccountId: text("github_account_id"), + githubLogin: text("github_login"), + roleLabel: text("role_label").notNull(), + defaultModel: text("default_model").notNull().default(DEFAULT_WORKSPACE_MODEL_ID), + eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(), + starterRepoStatus: text("starter_repo_status").notNull(), + starterRepoStarredAt: integer("starter_repo_starred_at"), + starterRepoSkippedAt: integer("starter_repo_skipped_at"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), + }, + (table) => ({ + userIdIdx: uniqueIndex("user_profiles_user_id_idx").on(table.userId), + singletonCheck: check("user_profiles_singleton_id_check", sql`${table.id} = 1`), + }), +); + +/** Custom Foundry table — not part of Better Auth. */ +export const sessionState = sqliteTable("session_state", { + sessionId: text("session_id").notNull().primaryKey(), + activeOrganizationId: text("active_organization_id"), + createdAt: integer("created_at").notNull(), + updatedAt: integer("updated_at").notNull(), +}); + +/** Custom Foundry table — not part of Better Auth. Stores per-user task/session UI state. 
*/ +export const userTaskState = sqliteTable( + "user_task_state", + { + taskId: text("task_id").notNull(), + sessionId: text("session_id").notNull(), + activeSessionId: text("active_session_id"), + unread: integer("unread").notNull().default(0), + draftText: text("draft_text").notNull().default(""), + draftAttachmentsJson: text("draft_attachments_json").notNull().default("[]"), + draftUpdatedAt: integer("draft_updated_at"), + updatedAt: integer("updated_at").notNull(), + }, + (table) => ({ + pk: primaryKey({ columns: [table.taskId, table.sessionId] }), + }), +); diff --git a/foundry/packages/backend/src/actors/user/index.ts b/foundry/packages/backend/src/actors/user/index.ts new file mode 100644 index 0000000..0deb1cb --- /dev/null +++ b/foundry/packages/backend/src/actors/user/index.ts @@ -0,0 +1,20 @@ +import { actor } from "rivetkit"; +import { userDb } from "./db/db.js"; +import { betterAuthActions } from "./actions/better-auth.js"; +import { userActions } from "./actions/user.js"; + +export const user = actor({ + db: userDb, + options: { + name: "User", + icon: "shield", + actionTimeout: 60_000, + }, + createState: (_c, input: { userId: string }) => ({ + userId: input.userId, + }), + actions: { + ...betterAuthActions, + ...userActions, + }, +}); diff --git a/foundry/packages/backend/src/actors/user/query-helpers.ts b/foundry/packages/backend/src/actors/user/query-helpers.ts new file mode 100644 index 0000000..5bdee10 --- /dev/null +++ b/foundry/packages/backend/src/actors/user/query-helpers.ts @@ -0,0 +1,197 @@ +import { and, eq, inArray, isNotNull, isNull, like, lt, lte, gt, gte, ne, notInArray, or } from "drizzle-orm"; +import { authAccounts, authSessions, authUsers, sessionState, userProfiles, userTaskState } from "./db/schema.js"; + +export const userTables = { + user: authUsers, + session: authSessions, + account: authAccounts, + userProfiles, + sessionState, + userTaskState, +} as const; + +export function tableFor(model: string) { + const table = 
userTables[model as keyof typeof userTables]; + if (!table) { + throw new Error(`Unsupported user model: ${model}`); + } + return table as any; +} + +function dbFieldFor(model: string, field: string): string { + if (model === "user" && field === "id") { + return "authUserId"; + } + return field; +} + +export function materializeRow(model: string, row: any) { + if (!row || model !== "user") { + return row; + } + + const { id: _singletonId, authUserId, ...rest } = row; + return { + id: authUserId, + ...rest, + }; +} + +export function persistInput(model: string, data: Record) { + if (model !== "user") { + return data; + } + + const { id, ...rest } = data; + return { + id: 1, + authUserId: id, + ...rest, + }; +} + +export function persistPatch(model: string, data: Record) { + if (model !== "user") { + return data; + } + + const { id, ...rest } = data; + return { + ...(id !== undefined ? { authUserId: id } : {}), + ...rest, + }; +} + +export function columnFor(model: string, table: any, field: string) { + const column = table[dbFieldFor(model, field)]; + if (!column) { + throw new Error(`Unsupported user field: ${model}.${field}`); + } + return column; +} + +export function normalizeValue(value: unknown): unknown { + if (value instanceof Date) { + return value.getTime(); + } + if (Array.isArray(value)) { + return value.map((entry) => normalizeValue(entry)); + } + return value; +} + +export function clauseToExpr(table: any, clause: any) { + const model = table === authUsers ? "user" : table === authSessions ? "session" : table === authAccounts ? "account" : ""; + const column = columnFor(model, table, clause.field); + const value = normalizeValue(clause.value); + + switch (clause.operator) { + case "ne": + return value === null ? 
isNotNull(column) : ne(column, value as any); + case "lt": + return lt(column, value as any); + case "lte": + return lte(column, value as any); + case "gt": + return gt(column, value as any); + case "gte": + return gte(column, value as any); + case "in": + return inArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); + case "not_in": + return notInArray(column, Array.isArray(value) ? (value as any[]) : [value as any]); + case "contains": + return like(column, `%${String(value ?? "")}%`); + case "starts_with": + return like(column, `${String(value ?? "")}%`); + case "ends_with": + return like(column, `%${String(value ?? "")}`); + case "eq": + default: + return value === null ? isNull(column) : eq(column, value as any); + } +} + +export function buildWhere(table: any, where: any[] | undefined) { + if (!where || where.length === 0) { + return undefined; + } + + let expr = clauseToExpr(table, where[0]); + for (const clause of where.slice(1)) { + const next = clauseToExpr(table, clause); + expr = clause.connector === "OR" ? or(expr, next) : and(expr, next); + } + return expr; +} + +export function applyJoinToRow(c: any, model: string, row: any, join: any) { + const materialized = materializeRow(model, row); + if (!materialized || !join) { + return materialized; + } + + if (model === "session" && join.user) { + return c.db + .select() + .from(authUsers) + .where(eq(authUsers.authUserId, materialized.userId)) + .get() + .then((user: any) => ({ ...materialized, user: materializeRow("user", user) ?? null })); + } + + if (model === "account" && join.user) { + return c.db + .select() + .from(authUsers) + .where(eq(authUsers.authUserId, materialized.userId)) + .get() + .then((user: any) => ({ ...materialized, user: materializeRow("user", user) ?? 
null })); + } + + if (model === "user" && join.account) { + return c.db + .select() + .from(authAccounts) + .where(eq(authAccounts.userId, materialized.id)) + .all() + .then((accounts: any[]) => ({ ...materialized, account: accounts })); + } + + return Promise.resolve(materialized); +} + +export async function applyJoinToRows(c: any, model: string, rows: any[], join: any) { + if (!join || rows.length === 0) { + return rows.map((row) => materializeRow(model, row)); + } + + if (model === "session" && join.user) { + const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; + const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.authUserId, userIds)).all() : []; + const userMap = new Map(users.map((user: any) => [user.authUserId, materializeRow("user", user)])); + return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); + } + + if (model === "account" && join.user) { + const userIds = [...new Set(rows.map((row) => row.userId).filter(Boolean))]; + const users = userIds.length > 0 ? await c.db.select().from(authUsers).where(inArray(authUsers.authUserId, userIds)).all() : []; + const userMap = new Map(users.map((user: any) => [user.authUserId, materializeRow("user", user)])); + return rows.map((row) => ({ ...row, user: userMap.get(row.userId) ?? null })); + } + + if (model === "user" && join.account) { + const materializedRows = rows.map((row) => materializeRow("user", row)); + const userIds = materializedRows.map((row) => row.id); + const accounts = userIds.length > 0 ? await c.db.select().from(authAccounts).where(inArray(authAccounts.userId, userIds)).all() : []; + const accountsByUserId = new Map(); + for (const account of accounts) { + const entries = accountsByUserId.get(account.userId) ?? []; + entries.push(account); + accountsByUserId.set(account.userId, entries); + } + return materializedRows.map((row) => ({ ...row, account: accountsByUserId.get(row.id) ?? 
[] })); + } + + return rows.map((row) => materializeRow(model, row)); +} diff --git a/foundry/packages/backend/src/config/runner-version.ts b/foundry/packages/backend/src/config/runner-version.ts new file mode 100644 index 0000000..5c33672 --- /dev/null +++ b/foundry/packages/backend/src/config/runner-version.ts @@ -0,0 +1,33 @@ +import { readFileSync } from "node:fs"; + +function parseRunnerVersion(rawValue: string | undefined): number | undefined { + const value = rawValue?.trim(); + if (!value) { + return undefined; + } + + const parsed = Number.parseInt(value, 10); + if (Number.isNaN(parsed)) { + return undefined; + } + + return parsed; +} + +export function resolveRunnerVersion(): number | undefined { + const envVersion = parseRunnerVersion(process.env.RIVET_RUNNER_VERSION); + if (envVersion !== undefined) { + return envVersion; + } + + const versionFilePath = process.env.RIVET_RUNNER_VERSION_FILE; + if (!versionFilePath) { + return undefined; + } + + try { + return parseRunnerVersion(readFileSync(versionFilePath, "utf8")); + } catch { + return undefined; + } +} diff --git a/foundry/packages/backend/src/index.ts b/foundry/packages/backend/src/index.ts index 3af36c3..617bacc 100644 --- a/foundry/packages/backend/src/index.ts +++ b/foundry/packages/backend/src/index.ts @@ -10,7 +10,7 @@ import { createDefaultDriver } from "./driver.js"; import { createClient } from "rivetkit/client"; import { initBetterAuthService } from "./services/better-auth.js"; import { createDefaultAppShellServices } from "./services/app-shell-runtime.js"; -import { APP_SHELL_ORGANIZATION_ID } from "./actors/organization/app-shell.js"; +import { APP_SHELL_ORGANIZATION_ID } from "./actors/organization/constants.js"; import { logger } from "./logging.js"; export interface BackendStartOptions { @@ -48,6 +48,19 @@ function isRivetRequest(request: Request): boolean { } export async function startBackend(options: BackendStartOptions = {}): Promise { + // Prevent the sandbox-agent SDK's unhandled 
SQLite constraint errors from + // crashing the entire process. The SDK has a bug where duplicate event + // inserts (sandbox_agent_events UNIQUE constraint) throw from an internal + // async path with no catch. Log and continue. + process.on("uncaughtException", (error) => { + logger.error({ error: error?.message ?? String(error), stack: error?.stack }, "uncaughtException (kept alive)"); + }); + process.on("unhandledRejection", (reason) => { + const msg = reason instanceof Error ? reason.message : String(reason); + const stack = reason instanceof Error ? reason.stack : undefined; + logger.error({ error: msg, stack }, "unhandledRejection (kept alive)"); + }); + // sandbox-agent agent plugins vary on which env var they read for OpenAI/Codex auth. // Normalize to keep local dev + docker-compose simple. if (!process.env.CODEX_API_KEY && process.env.OPENAI_API_KEY) { @@ -128,6 +141,59 @@ export async function startBackend(options: BackendStartOptions = {}): Promise.json, inspect with chrome://tracing) + app.get("/debug/memory", async (c) => { + if (process.env.NODE_ENV !== "development") { + return c.json({ error: "debug endpoints disabled in production" }, 403); + } + const wantGc = c.req.query("gc") === "1"; + if (wantGc && typeof Bun !== "undefined") { + // Bun.gc(true) triggers a synchronous full GC sweep in JavaScriptCore. + Bun.gc(true); + } + const mem = process.memoryUsage(); + const rssMb = Math.round(mem.rss / 1024 / 1024); + const heapUsedMb = Math.round(mem.heapUsed / 1024 / 1024); + const heapTotalMb = Math.round(mem.heapTotal / 1024 / 1024); + const externalMb = Math.round(mem.external / 1024 / 1024); + const nonHeapMb = rssMb - heapUsedMb - externalMb; + // Bun.heapStats() gives JSC-specific breakdown: object counts, typed array + // bytes, extra memory (native allocations tracked by JSC). Useful for + // distinguishing JS object bloat from native/WASM memory. 
+ // eslint-disable-next-line @typescript-eslint/no-explicit-any + const BunAny = Bun as any; + const heapStats = typeof BunAny.heapStats === "function" ? BunAny.heapStats() : null; + const snapshot = { + rssMb, + heapUsedMb, + heapTotalMb, + externalMb, + nonHeapMb, + gcTriggered: wantGc, + rssBytes: mem.rss, + heapUsedBytes: mem.heapUsed, + heapTotalBytes: mem.heapTotal, + externalBytes: mem.external, + ...(heapStats ? { bunHeapStats: heapStats } : {}), + }; + // Optionally write a full JSC heap snapshot for offline analysis. + let heapSnapshotPath: string | null = null; + const wantHeap = c.req.query("heap") === "1"; + if (wantHeap && typeof Bun !== "undefined") { + heapSnapshotPath = `/tmp/foundry-heap-${Date.now()}.json`; + // Bun.generateHeapSnapshot("v8") returns a V8-compatible JSON string. + const heapJson = Bun.generateHeapSnapshot("v8"); + await Bun.write(heapSnapshotPath, heapJson); + } + logger.info(snapshot, "memory_usage_debug"); + return c.json({ ...snapshot, ...(heapSnapshotPath ? { heapSnapshotPath } : {}) }); + }); + app.use("*", async (c, next) => { const requestId = c.req.header("x-request-id")?.trim() || randomUUID(); const start = performance.now(); @@ -215,7 +281,55 @@ export async function startBackend(options: BackendStartOptions = {}): Promise Fastly -> Railway) retries callback requests when they take + // >10s. The first request deletes the verification record on success, so the + // retry fails with "verification not found" -> ?error=please_restart_the_process. + // This map tracks in-flight callbacks by state param so retries wait for and + // reuse the first request's response. 
+ const inflightCallbacks = new Map<string, Promise<Response>>(); + app.all("/v1/auth/*", async (c) => { + const authPath = c.req.path; + const authMethod = c.req.method; + const isCallback = authPath.includes("/callback/"); + + // Deduplicate callback requests by OAuth state parameter + if (isCallback) { + const url = new URL(c.req.url); + const state = url.searchParams.get("state"); + if (state) { + const existing = inflightCallbacks.get(state); + if (existing) { + logger.info({ path: authPath, state: state.slice(0, 8) + "..." }, "auth_callback_dedup"); + const original = await existing; + return original.clone(); + } + + const promise = (async () => { + logger.info({ path: authPath, method: authMethod, state: state.slice(0, 8) + "..." }, "auth_callback_start"); + const start = performance.now(); + const response = await betterAuth.auth.handler(c.req.raw); + const durationMs = Math.round((performance.now() - start) * 100) / 100; + const location = response.headers.get("location"); + logger.info({ path: authPath, status: response.status, durationMs, location: location ??
undefined }, "auth_callback_complete"); + if (location && location.includes("error=")) { + logger.error({ path: authPath, status: response.status, durationMs, location }, "auth_callback_error_redirect"); + } + return response; + })(); + + inflightCallbacks.set(state, promise); + try { + const response = await promise; + return response.clone(); + } finally { + // Keep entry briefly so late retries still hit the cache + setTimeout(() => inflightCallbacks.delete(state), 30_000); + } + } + } + return await betterAuth.auth.handler(c.req.raw); }); @@ -293,6 +407,11 @@ export async function startBackend(options: BackendStartOptions = {}): Promise { + const mem = process.memoryUsage(); + const rssMb = Math.round(mem.rss / 1024 / 1024); + const heapUsedMb = Math.round(mem.heapUsed / 1024 / 1024); + const heapTotalMb = Math.round(mem.heapTotal / 1024 / 1024); + const externalMb = Math.round(mem.external / 1024 / 1024); + // Non-heap RSS: memory not accounted for by JS heap or external buffers. + // Large values here point to native allocations (WASM, mmap, child process + // bookkeeping, Bun's internal arena, etc.). 
+ const nonHeapMb = rssMb - heapUsedMb - externalMb; + const deltaRss = rssMb - prevRss; + prevRss = rssMb; + logger.info( + { + rssMb, + heapUsedMb, + heapTotalMb, + externalMb, + nonHeapMb, + deltaRssMb: deltaRss, + rssBytes: mem.rss, + heapUsedBytes: mem.heapUsed, + heapTotalBytes: mem.heapTotal, + externalBytes: mem.external, + }, + "memory_usage", + ); + }, 60_000); + } + process.on("SIGINT", async () => { server.stop(); process.exit(0); diff --git a/foundry/packages/backend/src/services/app-github.ts b/foundry/packages/backend/src/services/app-github.ts index 6cb6db3..52e5308 100644 --- a/foundry/packages/backend/src/services/app-github.ts +++ b/foundry/packages/backend/src/services/app-github.ts @@ -41,11 +41,6 @@ export interface GitHubRepositoryRecord { defaultBranch: string; } -export interface GitHubBranchRecord { - name: string; - commitSha: string; -} - export interface GitHubMemberRecord { id: string; login: string; @@ -402,15 +397,6 @@ export class GitHubAppClient { return await this.getUserRepository(accessToken, fullName); } - async listUserRepositoryBranches(accessToken: string, fullName: string): Promise { - return await this.listRepositoryBranches(accessToken, fullName); - } - - async listInstallationRepositoryBranches(installationId: number, fullName: string): Promise { - const accessToken = await this.createInstallationAccessToken(installationId); - return await this.listRepositoryBranches(accessToken, fullName); - } - async listOrganizationMembers(accessToken: string, organizationLogin: string): Promise { const members = await this.paginate<{ id: number; @@ -708,20 +694,6 @@ export class GitHubAppClient { nextUrl: parseNextLink(response.headers.get("link")), }; } - - private async listRepositoryBranches(accessToken: string, fullName: string): Promise { - const branches = await this.paginate<{ - name: string; - commit?: { sha?: string | null } | null; - }>(`/repos/${fullName}/branches?per_page=100`, accessToken); - - return branches - 
.map((branch) => ({ - name: branch.name?.trim() ?? "", - commitSha: branch.commit?.sha?.trim() ?? "", - })) - .filter((branch) => branch.name.length > 0 && branch.commitSha.length > 0); - } } function parseNextLink(linkHeader: string | null): string | null { diff --git a/foundry/packages/backend/src/services/better-auth.ts b/foundry/packages/backend/src/services/better-auth.ts index 4509402..23d227f 100644 --- a/foundry/packages/backend/src/services/better-auth.ts +++ b/foundry/packages/backend/src/services/better-auth.ts @@ -1,7 +1,7 @@ import { betterAuth } from "better-auth"; import { createAdapterFactory } from "better-auth/adapters"; -import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/app-shell.js"; -import { authUserKey, organizationKey } from "../actors/keys.js"; +import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/constants.js"; +import { organizationKey, userKey } from "../actors/keys.js"; import { logger } from "../logging.js"; const AUTH_BASE_PATH = "/v1/auth"; @@ -75,7 +75,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } // getOrCreate is intentional here: the adapter runs during Better Auth callbacks - // which can fire before any explicit create path. The app organization and auth user + // which can fire before any explicit create path. The app organization and user // actors must exist by the time the adapter needs them. const appOrganization = () => actorClient.organization.getOrCreate(organizationKey(APP_SHELL_ORGANIZATION_ID), { @@ -83,9 +83,9 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin }); // getOrCreate is intentional: Better Auth creates user records during OAuth - // callbacks, so the auth-user actor must be lazily provisioned on first access. - const getAuthUser = async (userId: string) => - await actorClient.authUser.getOrCreate(authUserKey(userId), { + // callbacks, so the user actor must be lazily provisioned on first access. 
+ const getUser = async (userId: string) => + await actorClient.user.getOrCreate(userKey(userId), { createWithInput: { userId }, }); @@ -110,7 +110,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const email = direct("email"); if (typeof email === "string" && email.length > 0) { const organization = await appOrganization(); - const resolved = await organization.authFindEmailIndex({ email: email.toLowerCase() }); + const resolved = await organization.betterAuthFindEmailIndex({ email: email.toLowerCase() }); return resolveRouteUserId(organization, resolved); } return null; @@ -125,7 +125,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const sessionToken = direct("token") ?? data?.token; if (typeof sessionId === "string" || typeof sessionToken === "string") { const organization = await appOrganization(); - const resolved = await organization.authFindSessionIndex({ + const resolved = await organization.betterAuthFindSessionIndex({ ...(typeof sessionId === "string" ? { sessionId } : {}), ...(typeof sessionToken === "string" ? { sessionToken } : {}), }); @@ -144,11 +144,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const accountId = direct("accountId") ?? 
data?.accountId; const organization = await appOrganization(); if (typeof accountRecordId === "string" && accountRecordId.length > 0) { - const resolved = await organization.authFindAccountIndex({ id: accountRecordId }); + const resolved = await organization.betterAuthFindAccountIndex({ id: accountRecordId }); return resolveRouteUserId(organization, resolved); } if (typeof providerId === "string" && providerId.length > 0 && typeof accountId === "string" && accountId.length > 0) { - const resolved = await organization.authFindAccountIndex({ providerId, accountId }); + const resolved = await organization.betterAuthFindAccountIndex({ providerId, accountId }); return resolveRouteUserId(organization, resolved); } return null; @@ -157,11 +157,6 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return null; }; - const ensureOrganizationVerification = async (method: string, payload: Record) => { - const organization = await appOrganization(); - return await organization[method](payload); - }; - return { options: { useDatabaseGeneratedIds: false, @@ -170,7 +165,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin create: async ({ model, data }) => { const transformed = await transformInput(data, model, "create", true); if (model === "verification") { - return await ensureOrganizationVerification("authCreateVerification", { data: transformed }); + const organization = await appOrganization(); + return await organization.betterAuthCreateVerification({ data: transformed }); } const userId = await resolveUserIdForQuery(model, undefined, transformed); @@ -178,19 +174,19 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin throw new Error(`Unable to resolve auth actor for create(${model})`); } - const userActor = await getAuthUser(userId); - const created = await userActor.createAuthRecord({ model, data: transformed }); + const userActor = await getUser(userId); + const created = 
await userActor.betterAuthCreateRecord({ model, data: transformed }); const organization = await appOrganization(); if (model === "user" && typeof transformed.email === "string" && transformed.email.length > 0) { - await organization.authUpsertEmailIndex({ + await organization.betterAuthUpsertEmailIndex({ email: transformed.email.toLowerCase(), userId, }); } if (model === "session") { - await organization.authUpsertSessionIndex({ + await organization.betterAuthUpsertSessionIndex({ sessionId: String(created.id), sessionToken: String(created.token), userId, @@ -198,7 +194,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "account") { - await organization.authUpsertAccountIndex({ + await organization.betterAuthUpsertAccountIndex({ id: String(created.id), providerId: String(created.providerId), accountId: String(created.accountId), @@ -212,7 +208,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin findOne: async ({ model, where, join }) => { const transformedWhere = transformWhereClause({ model, where, action: "findOne" }); if (model === "verification") { - return await ensureOrganizationVerification("authFindOneVerification", { where: transformedWhere, join }); + const organization = await appOrganization(); + return await organization.betterAuthFindOneVerification({ where: transformedWhere, join }); } const userId = await resolveUserIdForQuery(model, transformedWhere); @@ -220,15 +217,16 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return null; } - const userActor = await getAuthUser(userId); - const found = await userActor.findOneAuthRecord({ model, where: transformedWhere, join }); + const userActor = await getUser(userId); + const found = await userActor.betterAuthFindOneRecord({ model, where: transformedWhere, join }); return found ? 
((await transformOutput(found, model, undefined, join)) as any) : null; }, findMany: async ({ model, where, limit, sortBy, offset, join }) => { const transformedWhere = transformWhereClause({ model, where, action: "findMany" }); if (model === "verification") { - return await ensureOrganizationVerification("authFindManyVerification", { + const organization = await appOrganization(); + return await organization.betterAuthFindManyVerification({ where: transformedWhere, limit, sortBy, @@ -244,7 +242,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const resolved = await Promise.all( (tokenClause.value as string[]).map(async (sessionToken: string) => ({ sessionToken, - route: await organization.authFindSessionIndex({ sessionToken }), + route: await organization.betterAuthFindSessionIndex({ sessionToken }), })), ); const byUser = new Map(); @@ -259,11 +257,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const rows = []; for (const [userId, tokens] of byUser) { - const userActor = await getAuthUser(userId); + const userActor = await getUser(userId); const scopedWhere = transformedWhere.map((entry: any) => entry.field === "token" && entry.operator === "in" ? 
{ ...entry, value: tokens } : entry, ); - const found = await userActor.findManyAuthRecords({ model, where: scopedWhere, limit, sortBy, offset, join }); + const found = await userActor.betterAuthFindManyRecords({ model, where: scopedWhere, limit, sortBy, offset, join }); rows.push(...found); } return await Promise.all(rows.map(async (row: any) => await transformOutput(row, model, undefined, join))); @@ -275,8 +273,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return []; } - const userActor = await getAuthUser(userId); - const found = await userActor.findManyAuthRecords({ model, where: transformedWhere, limit, sortBy, offset, join }); + const userActor = await getUser(userId); + const found = await userActor.betterAuthFindManyRecords({ model, where: transformedWhere, limit, sortBy, offset, join }); return await Promise.all(found.map(async (row: any) => await transformOutput(row, model, undefined, join))); }, @@ -284,7 +282,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const transformedWhere = transformWhereClause({ model, where, action: "update" }); const transformedUpdate = (await transformInput(update as Record, model, "update", true)) as Record; if (model === "verification") { - return await ensureOrganizationVerification("authUpdateVerification", { where: transformedWhere, update: transformedUpdate }); + const organization = await appOrganization(); + return await organization.betterAuthUpdateVerification({ + where: transformedWhere, + update: transformedUpdate, + }); } const userId = await resolveUserIdForQuery(model, transformedWhere, transformedUpdate); @@ -292,29 +294,38 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return null; } - const userActor = await getAuthUser(userId); + const userActor = await getUser(userId); const before = model === "user" - ? await userActor.findOneAuthRecord({ model, where: transformedWhere }) + ? 
await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) : model === "account" - ? await userActor.findOneAuthRecord({ model, where: transformedWhere }) + ? await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) : model === "session" - ? await userActor.findOneAuthRecord({ model, where: transformedWhere }) + ? await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }) : null; - const updated = await userActor.updateAuthRecord({ model, where: transformedWhere, update: transformedUpdate }); + const updated = await userActor.betterAuthUpdateRecord({ + model, + where: transformedWhere, + update: transformedUpdate, + }); const organization = await appOrganization(); if (model === "user" && updated) { if (before?.email && before.email !== updated.email) { - await organization.authDeleteEmailIndex({ email: before.email.toLowerCase() }); + await organization.betterAuthDeleteEmailIndex({ + email: before.email.toLowerCase(), + }); } if (updated.email) { - await organization.authUpsertEmailIndex({ email: updated.email.toLowerCase(), userId }); + await organization.betterAuthUpsertEmailIndex({ + email: updated.email.toLowerCase(), + userId, + }); } } if (model === "session" && updated) { - await organization.authUpsertSessionIndex({ + await organization.betterAuthUpsertSessionIndex({ sessionId: String(updated.id), sessionToken: String(updated.token), userId, @@ -322,7 +333,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "account" && updated) { - await organization.authUpsertAccountIndex({ + await organization.betterAuthUpsertAccountIndex({ id: String(updated.id), providerId: String(updated.providerId), accountId: String(updated.accountId), @@ -337,7 +348,11 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin const transformedWhere = transformWhereClause({ model, where, action: "updateMany" }); const transformedUpdate = (await 
transformInput(update as Record, model, "update", true)) as Record; if (model === "verification") { - return await ensureOrganizationVerification("authUpdateManyVerification", { where: transformedWhere, update: transformedUpdate }); + const organization = await appOrganization(); + return await organization.betterAuthUpdateManyVerification({ + where: transformedWhere, + update: transformedUpdate, + }); } const userId = await resolveUserIdForQuery(model, transformedWhere, transformedUpdate); @@ -345,14 +360,19 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getAuthUser(userId); - return await userActor.updateManyAuthRecords({ model, where: transformedWhere, update: transformedUpdate }); + const userActor = await getUser(userId); + return await userActor.betterAuthUpdateManyRecords({ + model, + where: transformedWhere, + update: transformedUpdate, + }); }, delete: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "delete" }); if (model === "verification") { - await ensureOrganizationVerification("authDeleteVerification", { where: transformedWhere }); + const organization = await appOrganization(); + await organization.betterAuthDeleteVerification({ where: transformedWhere }); return; } @@ -361,20 +381,20 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return; } - const userActor = await getAuthUser(userId); + const userActor = await getUser(userId); const organization = await appOrganization(); - const before = await userActor.findOneAuthRecord({ model, where: transformedWhere }); - await userActor.deleteAuthRecord({ model, where: transformedWhere }); + const before = await userActor.betterAuthFindOneRecord({ model, where: transformedWhere }); + await userActor.betterAuthDeleteRecord({ model, where: transformedWhere }); if (model === "session" && before) { - await organization.authDeleteSessionIndex({ + await 
organization.betterAuthDeleteSessionIndex({ sessionId: before.id, sessionToken: before.token, }); } if (model === "account" && before) { - await organization.authDeleteAccountIndex({ + await organization.betterAuthDeleteAccountIndex({ id: before.id, providerId: before.providerId, accountId: before.accountId, @@ -382,14 +402,17 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin } if (model === "user" && before?.email) { - await organization.authDeleteEmailIndex({ email: before.email.toLowerCase() }); + await organization.betterAuthDeleteEmailIndex({ + email: before.email.toLowerCase(), + }); } }, deleteMany: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "deleteMany" }); if (model === "verification") { - return await ensureOrganizationVerification("authDeleteManyVerification", { where: transformedWhere }); + const organization = await appOrganization(); + return await organization.betterAuthDeleteManyVerification({ where: transformedWhere }); } if (model === "session") { @@ -397,12 +420,12 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin if (!userId) { return 0; } - const userActor = await getAuthUser(userId); + const userActor = await getUser(userId); const organization = await appOrganization(); - const sessions = await userActor.findManyAuthRecords({ model, where: transformedWhere, limit: 5000 }); - const deleted = await userActor.deleteManyAuthRecords({ model, where: transformedWhere }); + const sessions = await userActor.betterAuthFindManyRecords({ model, where: transformedWhere, limit: 5000 }); + const deleted = await userActor.betterAuthDeleteManyRecords({ model, where: transformedWhere }); for (const session of sessions) { - await organization.authDeleteSessionIndex({ + await organization.betterAuthDeleteSessionIndex({ sessionId: session.id, sessionToken: session.token, }); @@ -415,15 +438,15 @@ export function 
initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getAuthUser(userId); - const deleted = await userActor.deleteManyAuthRecords({ model, where: transformedWhere }); - return deleted; + const userActor = await getUser(userId); + return await userActor.betterAuthDeleteManyRecords({ model, where: transformedWhere }); }, count: async ({ model, where }) => { const transformedWhere = transformWhereClause({ model, where, action: "count" }); if (model === "verification") { - return await ensureOrganizationVerification("authCountVerification", { where: transformedWhere }); + const organization = await appOrganization(); + return await organization.betterAuthCountVerification({ where: transformedWhere }); } const userId = await resolveUserIdForQuery(model, transformedWhere); @@ -431,8 +454,8 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin return 0; } - const userActor = await getAuthUser(userId); - return await userActor.countAuthRecords({ model, where: transformedWhere }); + const userActor = await getUser(userId); + return await userActor.betterAuthCountRecords({ model, where: transformedWhere }); }, }; }, @@ -451,6 +474,9 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin strategy: "compact", }, }, + onAPIError: { + errorURL: stripTrailingSlash(options.appUrl) + "/signin", + }, socialProviders: { github: { clientId: requireEnv("GITHUB_CLIENT_ID"), @@ -477,17 +503,17 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin async getAuthState(sessionId: string) { const organization = await appOrganization(); - const route = await organization.authFindSessionIndex({ sessionId }); + const route = await organization.betterAuthFindSessionIndex({ sessionId }); if (!route?.userId) { return null; } - const userActor = await getAuthUser(route.userId); + const userActor = await getUser(route.userId); return await 
userActor.getAppAuthState({ sessionId }); }, async upsertUserProfile(userId: string, patch: Record) { - const userActor = await getAuthUser(userId); - return await userActor.upsertUserProfile({ userId, patch }); + const userActor = await getUser(userId); + return await userActor.upsertProfile({ userId, patch }); }, async setActiveOrganization(sessionId: string, activeOrganizationId: string | null) { @@ -495,7 +521,7 @@ export function initBetterAuthService(actorClient: any, options: { apiUrl: strin if (!authState?.user?.id) { throw new Error(`Unknown auth session ${sessionId}`); } - const userActor = await getAuthUser(authState.user.id); + const userActor = await getUser(authState.user.id); return await userActor.upsertSessionState({ sessionId, activeOrganizationId }); }, diff --git a/foundry/packages/backend/src/services/branch-name-prefixes.ts b/foundry/packages/backend/src/services/branch-name-prefixes.ts new file mode 100644 index 0000000..aaccaee --- /dev/null +++ b/foundry/packages/backend/src/services/branch-name-prefixes.ts @@ -0,0 +1,584 @@ +// Auto-generated list of branch name prefixes. +// Source: McMaster-Carr product catalog. 
+export const BRANCH_NAME_PREFIXES: readonly string[] = [ + "abrasive-blasters", + "ac-motors", + "access-doors", + "adjustable-handles", + "aerosol-paint", + "air-cleaners", + "air-cylinders", + "air-filters", + "air-hose", + "air-knives", + "air-nozzles", + "air-regulators", + "air-ride-wheels", + "air-slides", + "alligator-clips", + "alloy-steel", + "aluminum-honeycomb", + "angle-indicators", + "antiseize-lubricants", + "antislip-fluid", + "backlight-panel-kits", + "ball-bearings", + "ball-end-mills", + "ball-joint-linkages", + "ball-transfers", + "band-clamps", + "band-saw-blades", + "bar-clamps", + "bar-grating", + "barbed-hose-fittings", + "barbed-tube-fittings", + "basket-strainers", + "batch-cans", + "battery-chargers", + "battery-holders", + "bead-chain", + "beam-clamps", + "belt-conveyors", + "bench-scales", + "bench-vises", + "bin-boxes", + "bin-storage", + "binding-posts", + "blank-tags", + "blasting-cabinets", + "blind-rivets", + "bluetooth-padlocks", + "boring-lathe-tools", + "box-reducers", + "box-wrenches", + "braided-hose", + "brass-pipe-fittings", + "breather-vents", + "butt-splices", + "c-clamps", + "cable-cutters", + "cable-holders", + "cable-tie-mounts", + "cable-ties", + "cam-handles", + "cam-latches", + "cam-locks", + "cap-nuts", + "captive-panel-screws", + "carbide-burs", + "carbide-inserts", + "carbon-fiber", + "carbon-steel", + "cardstock-tags", + "carriage-bolts", + "cast-acrylic", + "cast-iron", + "cast-nylon", + "casting-compounds", + "ceiling-lights", + "ceramic-adhesives", + "chain-slings", + "check-valves", + "chemical-hose", + "chemistry-meters", + "chemistry-testing", + "chip-clearing-tools", + "chucking-reamers", + "cinching-straps", + "circuit-breakers", + "circular-saw-blades", + "circular-saws", + "clamping-hangers", + "clevis-pins", + "clevis-rod-ends", + "clip-on-nuts", + "coaxial-connectors", + "coaxial-cords", + "coiled-spring-pins", + "compact-connectors", + "computer-adapters", + "concrete-adhesives", + "concrete-repair", 
+ "contour-transfers", + "conveyor-belt-lacing", + "conveyor-belting", + "conveyor-brushes", + "conveyor-rollers", + "coolant-hose", + "copper-tube-fittings", + "copper-tubing", + "cord-grips", + "cord-reels", + "cotter-pins", + "coupling-nuts", + "cpvc-pipe-fittings", + "cup-brushes", + "cutoff-wheels", + "cylinder-hones", + "cylinder-racks", + "cylinder-trucks", + "data-cable", + "data-connectors", + "dc-motors", + "dead-blow-hammers", + "delrin-acetal-resin", + "desiccant-air-dryers", + "desktop-cranes", + "dial-calipers", + "dial-indicators", + "die-springs", + "direct-heaters", + "disconnect-switches", + "dispensing-needles", + "dispensing-pumps", + "disposable-clothing", + "disposable-gloves", + "document-protectors", + "door-closers", + "door-handles", + "door-holders", + "dowel-pins", + "drafting-equipment", + "drain-cleaners", + "drainage-mats", + "draw-latches", + "drawer-cabinets", + "drawer-slides", + "drill-bit-sets", + "drill-bits", + "drill-bushings", + "drill-chucks", + "drill-presses", + "drilling-screws", + "drinking-fountains", + "drive-anchors", + "drive-rollers", + "drive-shafts", + "drum-faucets", + "drum-pumps", + "drum-top-vacuums", + "drum-trucks", + "dry-box-gloves", + "dry-erase-boards", + "dry-film-lubricants", + "duct-fans", + "duct-hose", + "duct-tape", + "dust-collectors", + "dustless-chalk", + "edge-trim", + "electric-actuators", + "electric-drills", + "electric-drum-pumps", + "electric-mixers", + "electrical-switches", + "electrical-tape", + "electronic-calipers", + "enclosure-heaters", + "enclosure-panels", + "ethernet-cords", + "exhaust-fans", + "exit-lights", + "expansion-joints", + "expansion-plugs", + "extension-cords", + "extension-springs", + "fabric-snaps", + "fan-blades", + "fep-tubing", + "fiberglass-grating", + "file-holders", + "filter-bag-housings", + "filter-bags", + "filter-cartridges", + "fire-fighting-hose", + "first-aid-supplies", + "fixture-clamps", + "flange-locknuts", + "flange-mount-seals", + 
"flap-sanding-discs", + "flap-sanding-wheels", + "flared-tube-fittings", + "flashing-lights", + "flat-washers", + "flexible-shafts", + "flexible-shank-burs", + "flexible-trays", + "float-valves", + "floor-locks", + "floor-marking-tape", + "floor-scales", + "floor-squeegees", + "flow-sights", + "flow-switches", + "flowmeter-totalizers", + "foot-switches", + "force-gauges", + "fume-exhausters", + "garbage-bags", + "garden-hose", + "gas-hose", + "gas-regulators", + "gas-springs", + "gauge-blocks", + "glass-sights", + "gold-wire", + "grab-latches", + "grease-fittings", + "grinding-bits", + "grinding-wheels", + "hand-brushes", + "hand-chain-hoists", + "hand-reamers", + "hand-trucks", + "hand-wheels", + "hand-winches", + "hanging-scales", + "hard-hats", + "hardened-shafts", + "hardness-testers", + "heat-exchangers", + "heat-guns", + "heat-lamps", + "heat-sealable-bags", + "heat-set-inserts", + "heat-shrink-tubing", + "heat-sinks", + "heated-scrapers", + "helical-inserts", + "hex-bit-sockets", + "hex-head-screws", + "hex-nuts", + "high-accuracy-rulers", + "high-amp-relays", + "high-vacuum-filters", + "high-vacuum-sights", + "hinge-adjusters", + "hoist-rings", + "hole-saws", + "hose-couplings", + "hose-reels", + "hot-melt-glue", + "hydraulic-cylinders", + "hydraulic-hose", + "hydraulic-jacks", + "iec-connectors", + "immersion-heaters", + "impression-foam", + "indicating-lights", + "inflatable-wedges", + "ink-markers", + "insertion-heaters", + "inspection-mirrors", + "instrument-carts", + "insulation-jacketing", + "jam-removers", + "jigsaw-blades", + "key-cabinets", + "key-locking-inserts", + "key-stock", + "keyed-drive-shafts", + "keyseat-end-mills", + "l-key-sets", + "l-keys", + "label-holders", + "latching-connectors", + "lathe-tools", + "lavatory-partitions", + "lead-screws", + "leveling-lasers", + "leveling-mounts", + "lid-supports", + "lift-off-hinges", + "lift-trucks", + "light-bulbs", + "limit-switches", + "linear-ball-bearings", + "liquid-level-gauges", + 
"lock-washers", + "lockout-devices", + "loop-clamps", + "loop-hangers", + "machine-brackets", + "machine-handles", + "machine-keys", + "magnetic-base-drills", + "magnetic-bumpers", + "masking-tape", + "masonry-drill-bits", + "medium-amp-relays", + "metal-cable-ties", + "metal-panels", + "metal-plates", + "metal-tags", + "metering-pumps", + "metric-o-rings", + "mil-spec-connectors", + "mobile-lift-tables", + "motor-controls", + "motor-starters", + "mountable-cable-ties", + "mounting-tape", + "neoprene-foam", + "nickel-titanium", + "nonmarring-hammers", + "nonslip-bumpers", + "nylon-rivets", + "nylon-tubing", + "o-rings", + "oil-level-indicators", + "oil-reservoirs", + "oil-skimmers", + "on-off-valves", + "open-end-wrenches", + "outlet-boxes", + "outlet-strips", + "packaging-tape", + "paint-brushes", + "paint-markers", + "paint-sprayers", + "pallet-racks", + "pallet-trucks", + "panel-air-filters", + "parts-baskets", + "pendant-switches", + "perforated-sheets", + "pest-control", + "petroleum-hose", + "piano-hinges", + "pipe-couplings", + "pipe-gaskets", + "pipe-markers", + "pipe-wrenches", + "plank-grating", + "plastic-clamps", + "plastic-mesh", + "plate-lifting-clamps", + "platinum-wire", + "plier-clamps", + "plug-gauges", + "portable-lights", + "power-cords", + "power-supplied", + "power-supplies", + "precision-knives", + "press-fit-nuts", + "press-in-nuts", + "protecting-tape", + "protective-coatings", + "protective-curtains", + "protective-panels", + "protective-wrap", + "proximity-switches", + "pull-handles", + "push-brooms", + "push-nuts", + "push-on-seals", + "pvc-pipe-fittings", + "pvc-tubing", + "quick-release-pins", + "ratchet-pullers", + "recycled-plastics", + "repair-adhesives", + "repair-clamps", + "reusable-cable-ties", + "ring-terminals", + "rivet-nuts", + "robot-base-mounts", + "robot-bases", + "rocker-switches", + "rod-wipers", + "roller-bearings", + "roller-chain", + "roller-conveyors", + "roof-exhaust-fans", + "roof-repair", + "rotary-broaches", + 
"rotary-hammers", + "rotary-shaft-seals", + "rotating-cranes", + "rotating-joints", + "router-bits", + "rtd-probes", + "rubber-edge-seals", + "rubber-tread-wheels", + "rubber-tubing", + "safety-cabinets", + "safety-glasses", + "safety-mirrors", + "sanding-belts", + "sanding-discs", + "sanding-guides", + "sanding-rolls", + "sanding-sheets", + "screw-extractors", + "screw-jacks", + "scrub-brushes", + "sealing-washers", + "security-lights", + "sensor-connectors", + "set-screws", + "setup-clamps", + "shaft-collars", + "shaft-couplings", + "shaft-repair-sleeves", + "shaft-supports", + "sharpening-stones", + "sheet-metal-cutters", + "shelf-cabinets", + "shim-stock", + "shim-tape", + "shipping-pails", + "shock-absorbers", + "shoulder-screws", + "shower-stations", + "silicone-foam", + "sleeve-bearings", + "slide-bolts", + "slitting-saws", + "slotted-spring-pins", + "sludge-samplers", + "small-parts-storage", + "snap-acting-switches", + "soap-dispensers", + "socket-head-screws", + "socket-organizers", + "socket-wrenches", + "soldering-irons", + "solid-rivets", + "solid-rod-ends", + "sound-insulation", + "space-heaters", + "spacing-beads", + "spanner-wrenches", + "specialty-pliers", + "specialty-vises", + "specialty-washers", + "speed-reducers", + "splicing-connectors", + "spray-bottles", + "spray-nozzles", + "spring-clamps", + "spring-plungers", + "spring-steel", + "square-drive-sockets", + "square-end-mills", + "square-nuts", + "squeeze-bottles", + "stack-lights", + "stainless-steel", + "stair-treads", + "static-control-mats", + "steel-carts", + "steel-pipe-fittings", + "steel-pipe-flanges", + "steel-stamps", + "steel-tubing", + "step-ladders", + "stepper-motors", + "storage-bags", + "storage-boxes", + "storage-chests", + "straight-ladders", + "strap-hinges", + "stretch-wrap", + "strip-doors", + "strip-springs", + "strobe-lights", + "structural-adhesives", + "strut-channel", + "strut-channel-nuts", + "strut-mount-clamps", + "suction-cup-lifters", + "suction-strainers", + 
"super-absorbent-foam", + "super-flexible-glass", + "surface-fillers", + "surface-mount-hinges", + "t-handle-keys", + "t-slotted-framing", + "tamper-seals", + "tank-level-measurers", + "tape-dispensers", + "tape-measures", + "taper-pins", + "tapping-screws", + "teflon-ptfe", + "terminal-blocks", + "test-indicators", + "test-leads", + "test-weights", + "tethered-knobs", + "thermal-insulation", + "thread-adapters", + "thread-sealant-tape", + "thread-sealants", + "threaded-inserts", + "threaded-standoffs", + "threaded-studs", + "thrust-ball-bearings", + "thrust-bearings", + "thumb-nuts", + "thumb-screws", + "tie-down-rings", + "time-clocks", + "timer-relays", + "timer-switches", + "toggle-clamps", + "toggle-switches", + "tool-holders", + "tool-sets", + "tool-steel", + "torque-wrenches", + "torsion-springs", + "tote-boxes", + "touch-bars", + "track-casters", + "track-rollers", + "track-wheels", + "traction-mats", + "trolley-systems", + "tube-brushes", + "tube-fittings", + "tubular-light-bulbs", + "turn-lock-connectors", + "twist-ties", + "u-bolts", + "u-joints", + "ul-class-fuses", + "unthreaded-spacers", + "usb-adapters", + "usb-cords", + "utility-knives", + "v-belts", + "vacuum-cups", + "vacuum-pumps", + "wall-louvers", + "wash-fountains", + "wash-guns", + "waste-containers", + "water-deionizers", + "water-filters", + "water-hose", + "water-removal-pumps", + "weather-stations", + "web-slings", + "weld-nuts", + "welding-clothing", + "welding-helmets", + "wet-dry-vacuums", + "wet-mops", + "wheel-brushes", + "wing-nuts", + "wire-cloth", + "wire-connectors", + "wire-cutting-pliers", + "wire-partitions", + "wire-rope", + "wire-rope-clamps", + "wire-wrap", + "wool-felt", + "work-platforms", + "workbench-legs", + "woven-wire-cloth", +] as const; diff --git a/foundry/packages/backend/src/services/create-flow.ts b/foundry/packages/backend/src/services/create-flow.ts index 8341399..eb9e53f 100644 --- a/foundry/packages/backend/src/services/create-flow.ts +++ 
b/foundry/packages/backend/src/services/create-flow.ts @@ -1,3 +1,5 @@ +import { BRANCH_NAME_PREFIXES } from "./branch-name-prefixes.js"; + export interface ResolveCreateFlowDecisionInput { task: string; explicitTitle?: string; @@ -89,30 +91,42 @@ export function sanitizeBranchName(input: string): string { return trimmed.slice(0, 50).replace(/-+$/g, ""); } +function generateRandomSuffix(length: number): string { + const chars = "abcdefghijklmnopqrstuvwxyz0123456789"; + let result = ""; + for (let i = 0; i < length; i++) { + result += chars[Math.floor(Math.random() * chars.length)]; + } + return result; +} + +function generateBranchName(): string { + const prefix = BRANCH_NAME_PREFIXES[Math.floor(Math.random() * BRANCH_NAME_PREFIXES.length)]!; + const suffix = generateRandomSuffix(4); + return `${prefix}-${suffix}`; +} + export function resolveCreateFlowDecision(input: ResolveCreateFlowDecisionInput): ResolveCreateFlowDecisionResult { const explicitBranch = input.explicitBranchName?.trim(); const title = deriveFallbackTitle(input.task, input.explicitTitle); - const generatedBase = sanitizeBranchName(title) || "task"; - - const branchBase = explicitBranch && explicitBranch.length > 0 ? explicitBranch : generatedBase; const existingBranches = new Set(input.localBranches.map((value) => value.trim()).filter((value) => value.length > 0)); const existingTaskBranches = new Set(input.taskBranches.map((value) => value.trim()).filter((value) => value.length > 0)); const conflicts = (name: string): boolean => existingBranches.has(name) || existingTaskBranches.has(name); - if (explicitBranch && conflicts(branchBase)) { - throw new Error(`Branch '${branchBase}' already exists. Choose a different --name/--branch value.`); + if (explicitBranch && explicitBranch.length > 0) { + if (conflicts(explicitBranch)) { + throw new Error(`Branch '${explicitBranch}' already exists. 
Choose a different --name/--branch value.`); + } + return { title, branchName: explicitBranch }; } - if (explicitBranch) { - return { title, branchName: branchBase }; - } - - let candidate = branchBase; - let index = 2; - while (conflicts(candidate)) { - candidate = `${branchBase}-${index}`; - index += 1; + // Generate a random McMaster-Carr-style branch name, retrying on conflicts + let candidate = generateBranchName(); + let attempts = 0; + while (conflicts(candidate) && attempts < 100) { + candidate = generateBranchName(); + attempts += 1; } return { diff --git a/foundry/packages/backend/src/services/github-auth.ts b/foundry/packages/backend/src/services/github-auth.ts index ebbbce9..aa475b0 100644 --- a/foundry/packages/backend/src/services/github-auth.ts +++ b/foundry/packages/backend/src/services/github-auth.ts @@ -1,5 +1,5 @@ import { getOrCreateOrganization } from "../actors/handles.js"; -import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/app-shell.js"; +import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/constants.js"; export interface ResolvedGithubAuth { githubToken: string; diff --git a/foundry/packages/backend/test/create-flow.test.ts b/foundry/packages/backend/test/create-flow.test.ts index 498c4dc..8c66cb4 100644 --- a/foundry/packages/backend/test/create-flow.test.ts +++ b/foundry/packages/backend/test/create-flow.test.ts @@ -1,5 +1,6 @@ import { describe, expect, it } from "vitest"; import { deriveFallbackTitle, resolveCreateFlowDecision, sanitizeBranchName } from "../src/services/create-flow.js"; +import { BRANCH_NAME_PREFIXES } from "../src/services/branch-name-prefixes.js"; describe("create flow decision", () => { it("derives a conventional-style fallback title from task text", () => { @@ -17,15 +18,49 @@ describe("create flow decision", () => { expect(sanitizeBranchName(" spaces everywhere ")).toBe("spaces-everywhere"); }); - it("auto-increments generated branch names for conflicts", () => { + it("generates a 
McMaster-Carr-style branch name with random suffix", () => { const resolved = resolveCreateFlowDecision({ task: "Add auth", - localBranches: ["feat-add-auth"], - taskBranches: ["feat-add-auth-2"], + localBranches: [], + taskBranches: [], }); expect(resolved.title).toBe("feat: Add auth"); - expect(resolved.branchName).toBe("feat-add-auth-3"); + // Branch name should be "<prefix>-<4-char-suffix>" where prefix is from BRANCH_NAME_PREFIXES + const lastDash = resolved.branchName.lastIndexOf("-"); + const prefix = resolved.branchName.slice(0, lastDash); + const suffix = resolved.branchName.slice(lastDash + 1); + expect(BRANCH_NAME_PREFIXES).toContain(prefix); + expect(suffix).toMatch(/^[a-z0-9]{4}$/); + }); + + it("avoids conflicts by generating a different random name", () => { + // Even with a conflicting branch, it should produce something different + const resolved = resolveCreateFlowDecision({ + task: "Add auth", + localBranches: [], + taskBranches: [], + }); + + // Running again with the first result as a conflict should produce a different name + const resolved2 = resolveCreateFlowDecision({ + task: "Add auth", + localBranches: [resolved.branchName], + taskBranches: [], + }); + + expect(resolved2.branchName).not.toBe(resolved.branchName); + }); + + it("uses explicit branch name when provided", () => { + const resolved = resolveCreateFlowDecision({ + task: "new task", + explicitBranchName: "my-branch", + localBranches: [], + taskBranches: [], + }); + + expect(resolved.branchName).toBe("my-branch"); }); it("fails when explicit branch already exists", () => { diff --git a/foundry/packages/backend/test/keys.test.ts b/foundry/packages/backend/test/keys.test.ts index ac5f3c8..c3b2a10 100644 --- a/foundry/packages/backend/test/keys.test.ts +++ b/foundry/packages/backend/test/keys.test.ts @@ -1,14 +1,13 @@ import { describe, expect, it } from "vitest"; -import { githubDataKey, historyKey, organizationKey, repositoryKey, taskKey, taskSandboxKey } from "../src/actors/keys.js"; 
+import { auditLogKey, githubDataKey, organizationKey, taskKey, taskSandboxKey } from "../src/actors/keys.js"; describe("actor keys", () => { it("prefixes every key with organization namespace", () => { const keys = [ organizationKey("default"), - repositoryKey("default", "repo"), taskKey("default", "repo", "task"), taskSandboxKey("default", "sbx"), - historyKey("default", "repo"), + auditLogKey("default"), githubDataKey("default"), ]; diff --git a/foundry/packages/backend/test/organization-isolation.test.ts b/foundry/packages/backend/test/organization-isolation.test.ts index fcd1950..f5d58f2 100644 --- a/foundry/packages/backend/test/organization-isolation.test.ts +++ b/foundry/packages/backend/test/organization-isolation.test.ts @@ -8,6 +8,7 @@ import { describe, expect, it } from "vitest"; import { setupTest } from "rivetkit/test"; import { organizationKey } from "../src/actors/keys.js"; import { registry } from "../src/actors/index.js"; +import { organizationWorkflowQueueName } from "../src/actors/organization/queues.js"; import { repoIdFromRemote } from "../src/services/repo.js"; import { createTestDriver } from "./helpers/test-driver.js"; import { createTestRuntimeContext } from "./helpers/test-context.js"; @@ -51,8 +52,8 @@ describe("organization isolation", () => { const { repoPath } = createRepo(); const repoId = repoIdFromRemote(repoPath); - await wsA.applyGithubRepositoryProjection({ repoId, remoteUrl: repoPath }); - await wsB.applyGithubRepositoryProjection({ repoId, remoteUrl: repoPath }); + await wsA.send(organizationWorkflowQueueName("organization.command.github.repository_projection.apply"), { repoId, remoteUrl: repoPath }, { wait: true }); + await wsB.send(organizationWorkflowQueueName("organization.command.github.repository_projection.apply"), { repoId, remoteUrl: repoPath }, { wait: true }); await wsA.createTask({ organizationId: "alpha", diff --git a/foundry/packages/backend/test/workbench-unread.test.ts 
b/foundry/packages/backend/test/workspace-unread.test.ts similarity index 92% rename from foundry/packages/backend/test/workbench-unread.test.ts rename to foundry/packages/backend/test/workspace-unread.test.ts index fc94e97..5f7221a 100644 --- a/foundry/packages/backend/test/workbench-unread.test.ts +++ b/foundry/packages/backend/test/workspace-unread.test.ts @@ -1,7 +1,7 @@ import { describe, expect, it } from "vitest"; -import { requireSendableSessionMeta, shouldMarkSessionUnreadForStatus, shouldRecreateSessionForModelChange } from "../src/actors/task/workbench.js"; +import { requireSendableSessionMeta, shouldMarkSessionUnreadForStatus, shouldRecreateSessionForModelChange } from "../src/actors/task/workspace.js"; -describe("workbench unread status transitions", () => { +describe("workspace unread status transitions", () => { it("marks unread when a running session first becomes idle", () => { expect(shouldMarkSessionUnreadForStatus({ thinkingSinceMs: Date.now() - 1_000 }, "idle")).toBe(true); }); @@ -15,7 +15,7 @@ describe("workbench unread status transitions", () => { }); }); -describe("workbench model changes", () => { +describe("workspace model changes", () => { it("recreates an unused ready session so the selected model takes effect", () => { expect( shouldRecreateSessionForModelChange({ @@ -58,9 +58,9 @@ describe("workbench model changes", () => { }); }); -describe("workbench send readiness", () => { +describe("workspace send readiness", () => { it("rejects unknown sessions", () => { - expect(() => requireSendableSessionMeta(null, "session-1")).toThrow("Unknown workbench session: session-1"); + expect(() => requireSendableSessionMeta(null, "session-1")).toThrow("Unknown workspace session: session-1"); }); it("rejects pending sessions", () => { diff --git a/foundry/packages/cli/src/tui.ts b/foundry/packages/cli/src/tui.ts index c3aba9e..062bb95 100644 --- a/foundry/packages/cli/src/tui.ts +++ b/foundry/packages/cli/src/tui.ts @@ -1,4 +1,4 @@ -import type { 
AppConfig, TaskRecord } from "@sandbox-agent/foundry-shared"; +import type { AppConfig, TaskRecord, WorkspaceTaskDetail } from "@sandbox-agent/foundry-shared"; import { spawnSync } from "node:child_process"; import { createBackendClientFromConfig, filterTasks, formatRelativeAge, groupTaskStatus } from "@sandbox-agent/foundry-client"; import { CLI_BUILD_ID } from "./build-id.js"; @@ -51,14 +51,28 @@ interface DisplayRow { age: string; } +type TuiTaskRow = TaskRecord & Pick<WorkspaceTaskDetail, "pullRequest"> & { activeSessionId?: string | null }; + interface RenderOptions { width?: number; height?: number; } -async function listDetailedTasks(client: ReturnType<typeof createBackendClientFromConfig>, organizationId: string): Promise<TaskRecord[]> { +async function listDetailedTasks(client: ReturnType<typeof createBackendClientFromConfig>, organizationId: string): Promise<TuiTaskRow[]> { const rows = await client.listTasks(organizationId); - return await Promise.all(rows.map(async (row) => await client.getTask(organizationId, row.taskId))); + return await Promise.all( + rows.map(async (row) => { + const [task, detail] = await Promise.all([ + client.getTask(organizationId, row.repoId, row.taskId), + client.getTaskDetail(organizationId, row.repoId, row.taskId).catch(() => null), + ]); + return { + ...task, + pullRequest: detail?.pullRequest ?? null, + activeSessionId: detail?.activeSessionId ?? null, + }; + }), + ); } function pad(input: string, width: number): string { @@ -143,29 +157,17 @@ function agentSymbol(status: TaskRecord["status"]): string { return "-"; } -function toDisplayRow(row: TaskRecord): DisplayRow { - const conflictPrefix = row.conflictsWithMain === "true" ? "\u26A0 " : ""; - - const prLabel = row.prUrl ? `#${row.prUrl.match(/\/pull\/(\d+)/)?.[1] ?? "?"}` : row.prSubmitted ? "sub" : "-"; - - const ciLabel = row.ciStatus ?? "-"; - const reviewLabel = row.reviewStatus - ? row.reviewStatus === "approved" - ? "ok" - : row.reviewStatus === "changes_requested" - ? "chg" - : row.reviewStatus === "pending" - ? "..." 
- : row.reviewStatus - : "-"; +function toDisplayRow(row: TuiTaskRow): DisplayRow { + const prLabel = row.pullRequest ? `#${row.pullRequest.number}` : "-"; + const reviewLabel = row.pullRequest ? (row.pullRequest.isDraft ? "draft" : row.pullRequest.state.toLowerCase()) : "-"; return { - name: `${conflictPrefix}${row.title || row.branchName}`, - diff: row.diffStat ?? "-", + name: row.title || row.branchName || row.taskId, + diff: "-", agent: agentSymbol(row.status), pr: prLabel, - author: row.prAuthor ?? "-", - ci: ciLabel, + author: row.pullRequest?.authorLogin ?? "-", + ci: "-", review: reviewLabel, age: formatRelativeAge(row.updatedAt), }; @@ -186,7 +188,7 @@ function helpLines(width: number): string[] { } export function formatRows( - rows: TaskRecord[], + rows: TuiTaskRow[], selected: number, organizationId: string, status: string, @@ -336,8 +338,8 @@ export async function runTui(config: AppConfig, organizationId: string): Promise renderer.root.add(text); renderer.start(); - let allRows: TaskRecord[] = []; - let filteredRows: TaskRecord[] = []; + let allRows: TuiTaskRow[] = []; + let filteredRows: TuiTaskRow[] = []; let selected = 0; let searchQuery = ""; let showHelp = false; @@ -393,7 +395,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); }; - const selectedRow = (): TaskRecord | null => { + const selectedRow = (): TuiTaskRow | null => { if (filteredRows.length === 0) { return null; } @@ -522,7 +524,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); void (async () => { try { - const result = await client.switchTask(organizationId, row.taskId); + const result = await client.switchTask(organizationId, row.repoId, row.taskId); close(`cd ${result.switchTarget}`); } catch (err) { busy = false; @@ -543,7 +545,7 @@ export async function runTui(config: AppConfig, organizationId: string): Promise render(); void (async () => { try { - const result = await 
client.attachTask(organizationId, row.taskId); + const result = await client.attachTask(organizationId, row.repoId, row.taskId); close(`target=${result.target} session=${result.sessionId ?? "none"}`); } catch (err) { busy = false; @@ -559,7 +561,11 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (!row) { return; } - void runActionWithRefresh(`archiving ${row.taskId}`, async () => client.runAction(organizationId, row.taskId, "archive"), `archived ${row.taskId}`); + void runActionWithRefresh( + `archiving ${row.taskId}`, + async () => client.runAction(organizationId, row.repoId, row.taskId, "archive"), + `archived ${row.taskId}`, + ); return; } @@ -568,7 +574,11 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (!row) { return; } - void runActionWithRefresh(`syncing ${row.taskId}`, async () => client.runAction(organizationId, row.taskId, "sync"), `synced ${row.taskId}`); + void runActionWithRefresh( + `syncing ${row.taskId}`, + async () => client.runAction(organizationId, row.repoId, row.taskId, "sync"), + `synced ${row.taskId}`, + ); return; } @@ -580,8 +590,8 @@ export async function runTui(config: AppConfig, organizationId: string): Promise void runActionWithRefresh( `merging ${row.taskId}`, async () => { - await client.runAction(organizationId, row.taskId, "merge"); - await client.runAction(organizationId, row.taskId, "archive"); + await client.runAction(organizationId, row.repoId, row.taskId, "merge"); + await client.runAction(organizationId, row.repoId, row.taskId, "archive"); }, `merged+archived ${row.taskId}`, ); @@ -590,14 +600,15 @@ export async function runTui(config: AppConfig, organizationId: string): Promise if (ctrl && name === "o") { const row = selectedRow(); - if (!row?.prUrl) { + const prUrl = row?.pullRequest?.url ?? null; + if (!prUrl) { status = "no PR URL available for this task"; render(); return; } const openCmd = process.platform === "darwin" ? 
"open" : "xdg-open"; - spawnSync(openCmd, [row.prUrl], { stdio: "ignore" }); - status = `opened ${row.prUrl}`; + spawnSync(openCmd, [prUrl], { stdio: "ignore" }); + status = `opened ${prUrl}`; render(); return; } diff --git a/foundry/packages/cli/test/tui-format.test.ts b/foundry/packages/cli/test/tui-format.test.ts index 9ba0feb..15d3fe8 100644 --- a/foundry/packages/cli/test/tui-format.test.ts +++ b/foundry/packages/cli/test/tui-format.test.ts @@ -3,7 +3,7 @@ import type { TaskRecord } from "@sandbox-agent/foundry-shared"; import { filterTasks, fuzzyMatch } from "@sandbox-agent/foundry-client"; import { formatRows } from "../src/tui.js"; -const sample: TaskRecord = { +const sample = { organizationId: "default", repoId: "repo-a", repoRemote: "https://example.com/repo-a.git", @@ -13,33 +13,22 @@ const sample: TaskRecord = { task: "Do test", sandboxProviderId: "local", status: "running", - statusMessage: null, activeSandboxId: "sandbox-1", - activeSessionId: "session-1", + pullRequest: null, sandboxes: [ { sandboxId: "sandbox-1", sandboxProviderId: "local", + sandboxActorId: null, switchTarget: "sandbox://local/sandbox-1", cwd: null, createdAt: 1, updatedAt: 1, }, ], - agentType: null, - prSubmitted: false, - diffStat: null, - prUrl: null, - prAuthor: null, - ciStatus: null, - reviewStatus: null, - reviewer: null, - conflictsWithMain: null, - hasUnpushed: null, - parentBranch: null, createdAt: 1, updatedAt: 1, -}; +} satisfies TaskRecord & { pullRequest: null; activeSessionId?: null }; describe("formatRows", () => { it("renders rust-style table header and empty state", () => { diff --git a/foundry/packages/client/package.json b/foundry/packages/client/package.json index 98079d5..fa73dab 100644 --- a/foundry/packages/client/package.json +++ b/foundry/packages/client/package.json @@ -6,12 +6,12 @@ "main": "dist/index.js", "types": "dist/index.d.ts", "scripts": { - "build": "tsup src/index.ts --format esm --dts", + "build": "tsup src/index.ts --format esm --dts 
--tsconfig tsconfig.build.json", "typecheck": "tsc --noEmit", "test": "vitest run", "test:e2e:full": "HF_ENABLE_DAEMON_FULL_E2E=1 vitest run test/e2e/full-integration-e2e.test.ts", - "test:e2e:workbench": "HF_ENABLE_DAEMON_WORKBENCH_E2E=1 vitest run test/e2e/workbench-e2e.test.ts", - "test:e2e:workbench-load": "HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E=1 vitest run test/e2e/workbench-load-e2e.test.ts" + "test:e2e:workspace": "HF_ENABLE_DAEMON_WORKBENCH_E2E=1 vitest run test/e2e/workspace-e2e.test.ts", + "test:e2e:workspace-load": "HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E=1 vitest run test/e2e/workspace-load-e2e.test.ts" }, "dependencies": { "@sandbox-agent/foundry-shared": "workspace:*", diff --git a/foundry/packages/client/src/app-client.ts b/foundry/packages/client/src/app-client.ts index 16968cf..0bf5526 100644 --- a/foundry/packages/client/src/app-client.ts +++ b/foundry/packages/client/src/app-client.ts @@ -4,6 +4,7 @@ import type { FoundryOrganization, FoundryUser, UpdateFoundryOrganizationProfileInput, + WorkspaceModelId, } from "@sandbox-agent/foundry-shared"; import type { BackendClient } from "./backend-client.js"; import { getMockFoundryAppClient } from "./mock-app.js"; @@ -17,6 +18,7 @@ export interface FoundryAppClient { skipStarterRepo(): Promise; starStarterRepo(organizationId: string): Promise; selectOrganization(organizationId: string): Promise; + setDefaultModel(model: WorkspaceModelId): Promise; updateOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise; triggerGithubSync(organizationId: string): Promise; completeHostedCheckout(organizationId: string, planId: FoundryBillingPlanId): Promise; diff --git a/foundry/packages/client/src/backend-client.ts b/foundry/packages/client/src/backend-client.ts index 14e5661..c2222cc 100644 --- a/foundry/packages/client/src/backend-client.ts +++ b/foundry/packages/client/src/backend-client.ts @@ -7,28 +7,30 @@ import type { CreateTaskInput, AppEvent, SessionEvent, + SandboxProcessSnapshot, 
SandboxProcessesEvent, TaskRecord, TaskSummary, - TaskWorkbenchChangeModelInput, - TaskWorkbenchCreateTaskInput, - TaskWorkbenchCreateTaskResponse, - TaskWorkbenchDiffInput, - TaskWorkbenchRenameInput, - TaskWorkbenchRenameSessionInput, - TaskWorkbenchSelectInput, - TaskWorkbenchSetSessionUnreadInput, - TaskWorkbenchSendMessageInput, - TaskWorkbenchSnapshot, - TaskWorkbenchSessionInput, - TaskWorkbenchUpdateDraftInput, + TaskWorkspaceChangeModelInput, + TaskWorkspaceChangeOwnerInput, + TaskWorkspaceCreateTaskInput, + TaskWorkspaceCreateTaskResponse, + TaskWorkspaceDiffInput, + TaskWorkspaceRenameInput, + TaskWorkspaceRenameSessionInput, + TaskWorkspaceSelectInput, + TaskWorkspaceSetSessionUnreadInput, + TaskWorkspaceSendMessageInput, + TaskWorkspaceSnapshot, + TaskWorkspaceSessionInput, + TaskWorkspaceUpdateDraftInput, TaskEvent, - WorkbenchTaskDetail, - WorkbenchTaskSummary, - WorkbenchSessionDetail, + WorkspaceTaskDetail, + WorkspaceTaskSummary, + WorkspaceSessionDetail, OrganizationEvent, OrganizationSummarySnapshot, - HistoryEvent, + AuditLogEvent as HistoryEvent, HistoryQueryInput, SandboxProviderId, RepoOverview, @@ -37,8 +39,10 @@ import type { StarSandboxAgentRepoResult, SwitchResult, UpdateFoundryOrganizationProfileInput, + WorkspaceModelGroup, + WorkspaceModelId, } from "@sandbox-agent/foundry-shared"; -import type { ProcessCreateRequest, ProcessInfo, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent"; +import type { ProcessCreateRequest, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent"; import { createMockBackendClient } from "./mock/backend-client.js"; import { taskKey, taskSandboxKey, organizationKey } from "./keys.js"; @@ -64,7 +68,7 @@ export interface SandboxSessionEventRecord { payload: unknown; } -export type SandboxProcessRecord = ProcessInfo; +export type SandboxProcessRecord = SandboxProcessSnapshot; export interface ActorConn { on(event: string, listener: (payload: any) => 
void): () => void; @@ -72,45 +76,45 @@ export interface ActorConn { dispose(): Promise; } +interface AuthSessionScopedInput { + authSessionId?: string; +} + interface OrganizationHandle { connect(): ActorConn; listRepos(input: { organizationId: string }): Promise; createTask(input: CreateTaskInput): Promise; listTasks(input: { organizationId: string; repoId?: string }): Promise; getRepoOverview(input: { organizationId: string; repoId: string }): Promise; - history(input: HistoryQueryInput): Promise; - switchTask(taskId: string): Promise; - getTask(input: { organizationId: string; taskId: string }): Promise; - attachTask(input: { organizationId: string; taskId: string; reason?: string }): Promise<{ target: string; sessionId: string | null }>; - pushTask(input: { organizationId: string; taskId: string; reason?: string }): Promise; - syncTask(input: { organizationId: string; taskId: string; reason?: string }): Promise; - mergeTask(input: { organizationId: string; taskId: string; reason?: string }): Promise; - archiveTask(input: { organizationId: string; taskId: string; reason?: string }): Promise; - killTask(input: { organizationId: string; taskId: string; reason?: string }): Promise; + auditLog(input: HistoryQueryInput): Promise; + switchTask(input: { repoId: string; taskId: string }): Promise; + getTask(input: { organizationId: string; repoId: string; taskId: string }): Promise; + attachTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise<{ target: string; sessionId: string | null }>; + pushTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; + syncTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; + mergeTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; + archiveTask(input: { organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; + killTask(input: { 
organizationId: string; repoId: string; taskId: string; reason?: string }): Promise; useOrganization(input: { organizationId: string }): Promise<{ organizationId: string }>; starSandboxAgentRepo(input: StarSandboxAgentRepoInput): Promise; getOrganizationSummary(input: { organizationId: string }): Promise; - applyTaskSummaryUpdate(input: { taskSummary: WorkbenchTaskSummary }): Promise; - removeTaskSummary(input: { taskId: string }): Promise; - reconcileWorkbenchState(input: { organizationId: string }): Promise; - createWorkbenchTask(input: TaskWorkbenchCreateTaskInput): Promise; - markWorkbenchUnread(input: TaskWorkbenchSelectInput): Promise; - renameWorkbenchTask(input: TaskWorkbenchRenameInput): Promise; - renameWorkbenchBranch(input: TaskWorkbenchRenameInput): Promise; - createWorkbenchSession(input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ sessionId: string }>; - renameWorkbenchSession(input: TaskWorkbenchRenameSessionInput): Promise; - setWorkbenchSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise; - updateWorkbenchDraft(input: TaskWorkbenchUpdateDraftInput): Promise; - changeWorkbenchModel(input: TaskWorkbenchChangeModelInput): Promise; - sendWorkbenchMessage(input: TaskWorkbenchSendMessageInput): Promise; - stopWorkbenchSession(input: TaskWorkbenchSessionInput): Promise; - closeWorkbenchSession(input: TaskWorkbenchSessionInput): Promise; - publishWorkbenchPr(input: TaskWorkbenchSelectInput): Promise; - revertWorkbenchFile(input: TaskWorkbenchDiffInput): Promise; - reloadGithubOrganization(): Promise; - reloadGithubPullRequests(): Promise; - reloadGithubRepository(input: { repoId: string }): Promise; - reloadGithubPullRequest(input: { repoId: string; prNumber: number }): Promise; + createWorkspaceTask(input: TaskWorkspaceCreateTaskInput & AuthSessionScopedInput): Promise; + markWorkspaceUnread(input: TaskWorkspaceSelectInput & AuthSessionScopedInput): Promise; + renameWorkspaceTask(input: TaskWorkspaceRenameInput & 
AuthSessionScopedInput): Promise; + createWorkspaceSession(input: TaskWorkspaceSelectInput & { model?: string } & AuthSessionScopedInput): Promise<{ sessionId: string }>; + renameWorkspaceSession(input: TaskWorkspaceRenameSessionInput & AuthSessionScopedInput): Promise; + selectWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; + setWorkspaceSessionUnread(input: TaskWorkspaceSetSessionUnreadInput & AuthSessionScopedInput): Promise; + updateWorkspaceDraft(input: TaskWorkspaceUpdateDraftInput & AuthSessionScopedInput): Promise; + changeWorkspaceModel(input: TaskWorkspaceChangeModelInput & AuthSessionScopedInput): Promise; + sendWorkspaceMessage(input: TaskWorkspaceSendMessageInput & AuthSessionScopedInput): Promise; + stopWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; + closeWorkspaceSession(input: TaskWorkspaceSessionInput & AuthSessionScopedInput): Promise; + publishWorkspacePr(input: TaskWorkspaceSelectInput & AuthSessionScopedInput): Promise; + changeWorkspaceTaskOwner(input: TaskWorkspaceChangeOwnerInput & AuthSessionScopedInput): Promise; + revertWorkspaceFile(input: TaskWorkspaceDiffInput & AuthSessionScopedInput): Promise; + adminReloadGithubOrganization(): Promise; + adminReloadGithubRepository(input: { repoId: string }): Promise; } interface AppOrganizationHandle { @@ -119,6 +123,7 @@ interface AppOrganizationHandle { skipAppStarterRepo(input: { sessionId: string }): Promise; starAppStarterRepo(input: { sessionId: string; organizationId: string }): Promise; selectAppOrganization(input: { sessionId: string; organizationId: string }): Promise; + setAppDefaultModel(input: { sessionId: string; defaultModel: WorkspaceModelId }): Promise; updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput & { sessionId: string }): Promise; triggerAppRepoImport(input: { sessionId: string; organizationId: string }): Promise; beginAppGithubInstall(input: { sessionId: string; 
organizationId: string }): Promise<{ url: string }>;
@@ -130,9 +135,9 @@ interface AppOrganizationHandle {
 }
 
 interface TaskHandle {
-  getTaskSummary(): Promise;
-  getTaskDetail(): Promise;
-  getSessionDetail(input: { sessionId: string }): Promise;
+  getTaskSummary(): Promise;
+  getTaskDetail(input?: AuthSessionScopedInput): Promise;
+  getSessionDetail(input: { sessionId: string } & AuthSessionScopedInput): Promise;
   connect(): ActorConn;
 }
 
@@ -157,6 +162,7 @@ interface TaskSandboxHandle {
   rawSendSessionMethod(sessionId: string, method: string, params: Record): Promise;
   destroySession(sessionId: string): Promise;
   sandboxAgentConnection(): Promise<{ endpoint: string; token?: string }>;
+  listWorkspaceModelGroups(): Promise;
   providerState(): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }>;
 }
 
@@ -179,6 +185,7 @@ export interface BackendClientOptions {
   endpoint: string;
   defaultOrganizationId?: string;
   mode?: "remote" | "mock";
+  encoding?: "json" | "cbor" | "bare";
 }
 
 export interface BackendClient {
@@ -192,6 +199,7 @@ export interface BackendClient {
   skipAppStarterRepo(): Promise;
   starAppStarterRepo(organizationId: string): Promise;
   selectAppOrganization(organizationId: string): Promise;
+  setAppDefaultModel(defaultModel: WorkspaceModelId): Promise;
   updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise;
   triggerAppRepoImport(organizationId: string): Promise;
   reconnectAppGithub(organizationId: string): Promise;
@@ -204,11 +212,11 @@ export interface BackendClient {
   createTask(input: CreateTaskInput): Promise;
   listTasks(organizationId: string, repoId?: string): Promise;
   getRepoOverview(organizationId: string, repoId: string): Promise;
-  getTask(organizationId: string, taskId: string): Promise;
+  getTask(organizationId: string, repoId: string, taskId: string): Promise;
   listHistory(input: HistoryQueryInput): Promise;
-  switchTask(organizationId: string, taskId: string): Promise;
-  attachTask(organizationId: string, taskId: string): Promise<{ target: string; sessionId: string | null }>;
-  runAction(organizationId: string, taskId: string, action: TaskAction): Promise;
+  switchTask(organizationId: string, repoId: string, taskId: string): Promise;
+  attachTask(organizationId: string, repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }>;
+  runAction(organizationId: string, repoId: string, taskId: string, action: TaskAction): Promise;
   createSandboxSession(input: {
     organizationId: string;
     sandboxProviderId: SandboxProviderId;
@@ -279,29 +287,29 @@ export interface BackendClient {
     sandboxId: string,
   ): Promise<{ sandboxProviderId: SandboxProviderId; sandboxId: string; state: string; at: number }>;
   getSandboxAgentConnection(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise<{ endpoint: string; token?: string }>;
+  getSandboxWorkspaceModelGroups(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise;
   getOrganizationSummary(organizationId: string): Promise;
-  getTaskDetail(organizationId: string, repoId: string, taskId: string): Promise;
-  getSessionDetail(organizationId: string, repoId: string, taskId: string, sessionId: string): Promise;
-  getWorkbench(organizationId: string): Promise;
-  subscribeWorkbench(organizationId: string, listener: () => void): () => void;
-  createWorkbenchTask(organizationId: string, input: TaskWorkbenchCreateTaskInput): Promise;
-  markWorkbenchUnread(organizationId: string, input: TaskWorkbenchSelectInput): Promise;
-  renameWorkbenchTask(organizationId: string, input: TaskWorkbenchRenameInput): Promise;
-  renameWorkbenchBranch(organizationId: string, input: TaskWorkbenchRenameInput): Promise;
-  createWorkbenchSession(organizationId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ sessionId: string }>;
-  renameWorkbenchSession(organizationId: string, input: TaskWorkbenchRenameSessionInput): Promise;
-  setWorkbenchSessionUnread(organizationId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise;
-  updateWorkbenchDraft(organizationId: string, input: TaskWorkbenchUpdateDraftInput): Promise;
-  changeWorkbenchModel(organizationId: string, input: TaskWorkbenchChangeModelInput): Promise;
-  sendWorkbenchMessage(organizationId: string, input: TaskWorkbenchSendMessageInput): Promise;
-  stopWorkbenchSession(organizationId: string, input: TaskWorkbenchSessionInput): Promise;
-  closeWorkbenchSession(organizationId: string, input: TaskWorkbenchSessionInput): Promise;
-  publishWorkbenchPr(organizationId: string, input: TaskWorkbenchSelectInput): Promise;
-  revertWorkbenchFile(organizationId: string, input: TaskWorkbenchDiffInput): Promise;
-  reloadGithubOrganization(organizationId: string): Promise;
-  reloadGithubPullRequests(organizationId: string): Promise;
-  reloadGithubRepository(organizationId: string, repoId: string): Promise;
-  reloadGithubPullRequest(organizationId: string, repoId: string, prNumber: number): Promise;
+  getTaskDetail(organizationId: string, repoId: string, taskId: string): Promise;
+  getSessionDetail(organizationId: string, repoId: string, taskId: string, sessionId: string): Promise;
+  getWorkspace(organizationId: string): Promise;
+  subscribeWorkspace(organizationId: string, listener: () => void): () => void;
+  createWorkspaceTask(organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise;
+  markWorkspaceUnread(organizationId: string, input: TaskWorkspaceSelectInput): Promise;
+  renameWorkspaceTask(organizationId: string, input: TaskWorkspaceRenameInput): Promise;
+  createWorkspaceSession(organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }>;
+  renameWorkspaceSession(organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise;
+  selectWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise;
+  setWorkspaceSessionUnread(organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise;
+  updateWorkspaceDraft(organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise;
+  changeWorkspaceModel(organizationId: string, input: TaskWorkspaceChangeModelInput): Promise;
+  sendWorkspaceMessage(organizationId: string, input: TaskWorkspaceSendMessageInput): Promise;
+  stopWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise;
+  closeWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise;
+  publishWorkspacePr(organizationId: string, input: TaskWorkspaceSelectInput): Promise;
+  changeWorkspaceTaskOwner(organizationId: string, input: TaskWorkspaceChangeOwnerInput): Promise;
+  revertWorkspaceFile(organizationId: string, input: TaskWorkspaceDiffInput): Promise;
+  adminReloadGithubOrganization(organizationId: string): Promise;
+  adminReloadGithubRepository(organizationId: string, repoId: string): Promise;
   health(): Promise<{ ok: true }>;
   useOrganization(organizationId: string): Promise<{ organizationId: string }>;
   starSandboxAgentRepo(organizationId: string): Promise;
@@ -409,8 +417,8 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
   const endpoints = deriveBackendEndpoints(options.endpoint);
   const rivetApiEndpoint = endpoints.rivetEndpoint;
   const appApiEndpoint = endpoints.appEndpoint;
-  const client = createClient({ endpoint: rivetApiEndpoint }) as unknown as RivetClient;
-  const workbenchSubscriptions = new Map<
+  const client = createClient({ endpoint: rivetApiEndpoint, encoding: options.encoding }) as unknown as RivetClient;
+  const workspaceSubscriptions = new Map<
     string,
     {
       listeners: Set<() => void>;
@@ -461,6 +469,16 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
     return typeof sessionId === "string" && sessionId.length > 0 ? sessionId : null;
   };
+  const getAuthSessionInput = async (): Promise => {
+    const authSessionId = await getSessionId();
+    return authSessionId ? { authSessionId } : undefined;
+  };
+
+  const withAuthSessionInput = async (input: TInput): Promise => {
+    const authSessionInput = await getAuthSessionInput();
+    return authSessionInput ? { ...input, ...authSessionInput } : input;
+  };
+
   const organization = async (organizationId: string): Promise =>
     client.organization.getOrCreate(organizationKey(organizationId), {
       createWithInput: organizationId,
@@ -471,7 +489,15 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
       createWithInput: "app",
     }) as unknown as AppOrganizationHandle;
 
-  const task = async (organizationId: string, repoId: string, taskId: string): Promise => client.task.get(taskKey(organizationId, repoId, taskId));
+  // getOrCreate is intentional here — this is the ONLY lazy creation point for
+  // virtual tasks (PR-driven entries that exist in the org's local tables but
+  // have no task actor yet). The task actor self-initializes from org data in
+  // getCurrentRecord(). Backend code must NEVER use getOrCreateTask except in
+  // createTaskMutation. See backend/CLAUDE.md "Lazy Task Actor Creation".
+  const task = async (organizationId: string, repoId: string, taskId: string): Promise =>
+    client.task.getOrCreate(taskKey(organizationId, repoId, taskId), {
+      createWithInput: { organizationId, repoId, taskId },
+    });
 
   const sandboxByKey = async (organizationId: string, _providerId: SandboxProviderId, sandboxId: string): Promise => {
     return (client as any).taskSandbox.get(taskSandboxKey(organizationId, sandboxId));
@@ -493,17 +519,15 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
     for (const row of candidates) {
       try {
-        const detail = await ws.getTask({ organizationId, taskId: row.taskId });
+        const detail = await ws.getTask({ organizationId, repoId: row.repoId, taskId: row.taskId });
         if (detail.sandboxProviderId !== sandboxProviderId) {
           continue;
         }
-        const sandbox = detail.sandboxes.find(
+        const sandboxes = detail.sandboxes as Array<(typeof detail.sandboxes)[number] & { sandboxActorId?: string }>;
+        const sandbox = sandboxes.find(
           (sb) =>
-            sb.sandboxId === sandboxId &&
-            sb.sandboxProviderId === sandboxProviderId &&
-            typeof (sb as any).sandboxActorId === "string" &&
-            (sb as any).sandboxActorId.length > 0,
-        ) as { sandboxActorId?: string } | undefined;
+            sb.sandboxId === sandboxId && sb.sandboxProviderId === sandboxProviderId && typeof sb.sandboxActorId === "string" && sb.sandboxActorId.length > 0,
+        );
         if (sandbox?.sandboxActorId) {
           return (client as any).taskSandbox.getForId(sandbox.sandboxActorId);
         }
@@ -563,67 +587,81 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
     }
   };
 
-  const getWorkbenchCompat = async (organizationId: string): Promise => {
+  const getTaskDetailWithAuth = async (organizationId: string, repoId: string, taskIdValue: string): Promise => {
+    return (await task(organizationId, repoId, taskIdValue)).getTaskDetail(await getAuthSessionInput());
+  };
+
+  const getSessionDetailWithAuth = async (organizationId: string, repoId: string, taskIdValue: string, sessionId: string): Promise => {
+    return (await task(organizationId, repoId, taskIdValue)).getSessionDetail(await withAuthSessionInput({ sessionId }));
+  };
+
+  const getWorkspaceCompat = async (organizationId: string): Promise => {
+    const authSessionInput = await getAuthSessionInput();
     const summary = await (await organization(organizationId)).getOrganizationSummary({ organizationId });
-    const tasks = (
-      await Promise.all(
-        summary.taskSummaries.map(async (taskSummary) => {
-          let detail;
-          try {
-            detail = await (await task(organizationId, taskSummary.repoId, taskSummary.id)).getTaskDetail();
-          } catch (error) {
-            if (isActorNotFoundError(error)) {
-              return null;
-            }
-            throw error;
+    const resolvedTasks = await Promise.all(
+      summary.taskSummaries.map(async (taskSummary) => {
+        let detail;
+        try {
+          const taskHandle = await task(organizationId, taskSummary.repoId, taskSummary.id);
+          detail = await taskHandle.getTaskDetail(authSessionInput);
+        } catch (error) {
+          if (isActorNotFoundError(error)) {
+            return null;
          }
-          const sessionDetails = await Promise.all(
-            detail.sessionsSummary.map(async (session) => {
-              try {
-                const full = await (await task(organizationId, detail.repoId, detail.id)).getSessionDetail({ sessionId: session.id });
-                return [session.id, full] as const;
-              } catch (error) {
-                if (isActorNotFoundError(error)) {
-                  return null;
+          throw error;
+        }
+        const sessionDetails = await Promise.all(
+          detail.sessionsSummary.map(async (session) => {
+            try {
+              const full = await (await task(organizationId, detail.repoId, detail.id)).getSessionDetail({
+                sessionId: session.id,
+                ...(authSessionInput ?? {}),
+              });
+              return [session.id, full] as const;
+            } catch (error) {
+              if (isActorNotFoundError(error)) {
+                return null;
              }
-            }),
-          );
-          const sessionDetailsById = new Map(sessionDetails.filter((entry): entry is readonly [string, WorkbenchSessionDetail] => entry !== null));
-          return {
-            id: detail.id,
-            repoId: detail.repoId,
-            title: detail.title,
-            status: detail.status,
-            repoName: detail.repoName,
-            updatedAtMs: detail.updatedAtMs,
-            branch: detail.branch,
-            pullRequest: detail.pullRequest,
-            sessions: detail.sessionsSummary.map((session) => {
-              const full = sessionDetailsById.get(session.id);
-              return {
-                id: session.id,
-                sessionId: session.sessionId,
-                sessionName: session.sessionName,
-                agent: session.agent,
-                model: session.model,
-                status: session.status,
-                thinkingSinceMs: session.thinkingSinceMs,
-                unread: session.unread,
-                created: session.created,
-                draft: full?.draft ?? { text: "", attachments: [], updatedAtMs: null },
-                transcript: full?.transcript ?? [],
-              };
-            }),
-            fileChanges: detail.fileChanges,
-            diffs: detail.diffs,
-            fileTree: detail.fileTree,
-            minutesUsed: detail.minutesUsed,
-          };
-        }),
-      )
-    ).filter((task): task is TaskWorkbenchSnapshot["tasks"][number] => task !== null);
+              throw error;
+            }
+          }),
+        );
+        const sessionDetailsById = new Map(sessionDetails.filter((entry): entry is readonly [string, WorkspaceSessionDetail] => entry !== null));
+        return {
+          id: detail.id,
+          repoId: detail.repoId,
+          title: detail.title,
+          status: detail.status,
+          repoName: detail.repoName,
+          updatedAtMs: detail.updatedAtMs,
+          branch: detail.branch,
+          pullRequest: detail.pullRequest,
+          activeSessionId: detail.activeSessionId ?? null,
+          sessions: detail.sessionsSummary.map((session) => {
+            const full = sessionDetailsById.get(session.id);
+            return {
+              id: session.id,
+              sessionId: session.sessionId,
+              sessionName: session.sessionName,
+              agent: session.agent,
+              model: session.model,
+              status: session.status,
+              thinkingSinceMs: session.thinkingSinceMs,
+              unread: session.unread,
+              created: session.created,
+              draft: full?.draft ?? { text: "", attachments: [], updatedAtMs: null },
+              transcript: full?.transcript ?? [],
+            };
+          }),
+          fileChanges: detail.fileChanges,
+          diffs: detail.diffs,
+          fileTree: detail.fileTree,
+          minutesUsed: detail.minutesUsed,
+          activeSandboxId: detail.activeSandboxId ?? null,
+        };
+      }),
+    );
+    const tasks = resolvedTasks.filter((task): task is Exclude<(typeof resolvedTasks)[number], null> => task !== null);
     const repositories = summary.repos
       .map((repo) => ({
@@ -642,14 +680,14 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
     };
   };
 
-  const subscribeWorkbench = (organizationId: string, listener: () => void): (() => void) => {
-    let entry = workbenchSubscriptions.get(organizationId);
+  const subscribeWorkspace = (organizationId: string, listener: () => void): (() => void) => {
+    let entry = workspaceSubscriptions.get(organizationId);
     if (!entry) {
       entry = {
         listeners: new Set(),
         disposeConnPromise: null,
       };
-      workbenchSubscriptions.set(organizationId, entry);
+      workspaceSubscriptions.set(organizationId, entry);
     }
 
     entry.listeners.add(listener);
@@ -658,8 +696,8 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
       entry.disposeConnPromise = (async () => {
         const handle = await organization(organizationId);
         const conn = (handle as any).connect();
-        const unsubscribeEvent = conn.on("workbenchUpdated", () => {
-          const current = workbenchSubscriptions.get(organizationId);
+        const unsubscribeEvent = conn.on("organizationUpdated", () => {
+          const current = workspaceSubscriptions.get(organizationId);
          if (!current) {
            return;
          }
@@ -677,7 +715,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
     }
 
     return () => {
-      const current = workbenchSubscriptions.get(organizationId);
+      const current = workspaceSubscriptions.get(organizationId);
       if (!current) {
         return;
       }
@@ -686,7 +724,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
         return;
       }
 
-      workbenchSubscriptions.delete(organizationId);
+      workspaceSubscriptions.delete(organizationId);
       void current.disposeConnPromise?.then(async (disposeConn) => {
         await disposeConn?.();
       });
@@ -849,6 +887,14 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      return await (await appOrganization()).selectAppOrganization({ sessionId, organizationId });
    },
 
+    async setAppDefaultModel(defaultModel: WorkspaceModelId): Promise {
+      const sessionId = await getSessionId();
+      if (!sessionId) {
+        throw new Error("No active auth session");
+      }
+      return await (await appOrganization()).setAppDefaultModel({ sessionId, defaultModel });
+    },
+
    async updateAppOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise {
      const sessionId = await getSessionId();
      if (!sessionId) {
@@ -948,33 +994,36 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      return (await organization(organizationId)).getRepoOverview({ organizationId, repoId });
    },
 
-    async getTask(organizationId: string, taskId: string): Promise {
+    async getTask(organizationId: string, repoId: string, taskId: string): Promise {
      return (await organization(organizationId)).getTask({
        organizationId,
+        repoId,
        taskId,
      });
    },
 
    async listHistory(input: HistoryQueryInput): Promise {
-      return (await organization(input.organizationId)).history(input);
+      return (await organization(input.organizationId)).auditLog(input);
    },
 
-    async switchTask(organizationId: string, taskId: string): Promise {
-      return (await organization(organizationId)).switchTask(taskId);
+    async switchTask(organizationId: string, repoId: string, taskId: string): Promise {
+      return (await organization(organizationId)).switchTask({ repoId, taskId });
    },
 
-    async attachTask(organizationId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> {
+    async attachTask(organizationId: string, repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> {
      return (await organization(organizationId)).attachTask({
        organizationId,
+        repoId,
        taskId,
        reason: "cli.attach",
      });
    },
 
-    async runAction(organizationId: string, taskId: string, action: TaskAction): Promise {
+    async runAction(organizationId: string, repoId: string, taskId: string, action: TaskAction): Promise {
      if (action === "push") {
        await (await organization(organizationId)).pushTask({
          organizationId,
+          repoId,
          taskId,
          reason: "cli.push",
        });
@@ -983,6 +1032,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      if (action === "sync") {
        await (await organization(organizationId)).syncTask({
          organizationId,
+          repoId,
          taskId,
          reason: "cli.sync",
        });
@@ -991,6 +1041,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      if (action === "merge") {
        await (await organization(organizationId)).mergeTask({
          organizationId,
+          repoId,
          taskId,
          reason: "cli.merge",
        });
@@ -999,6 +1050,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      if (action === "archive") {
        await (await organization(organizationId)).archiveTask({
          organizationId,
+          repoId,
          taskId,
          reason: "cli.archive",
        });
@@ -1006,6 +1058,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      }
      await (await organization(organizationId)).killTask({
        organizationId,
+        repoId,
        taskId,
        reason: "cli.kill",
      });
@@ -1156,96 +1209,96 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
      return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.sandboxAgentConnection());
    },
 
+    async getSandboxWorkspaceModelGroups(organizationId: string, sandboxProviderId: SandboxProviderId, sandboxId: string): Promise {
+      return await withSandboxHandle(organizationId, sandboxProviderId, sandboxId, async (handle) => handle.listWorkspaceModelGroups());
+    },
+
    async getOrganizationSummary(organizationId: string): Promise {
      return (await organization(organizationId)).getOrganizationSummary({ organizationId });
    },
 
-    async getTaskDetail(organizationId: string, repoId: string, taskIdValue: string): Promise {
-      return (await task(organizationId, repoId, taskIdValue)).getTaskDetail();
+    async getTaskDetail(organizationId: string, repoId: string, taskIdValue: string): Promise {
+      return await getTaskDetailWithAuth(organizationId, repoId, taskIdValue);
    },
 
-    async getSessionDetail(organizationId: string, repoId: string, taskIdValue: string, sessionId: string): Promise {
-      return (await task(organizationId, repoId, taskIdValue)).getSessionDetail({ sessionId });
+    async getSessionDetail(organizationId: string, repoId: string, taskIdValue: string, sessionId: string): Promise {
+      return await getSessionDetailWithAuth(organizationId, repoId, taskIdValue, sessionId);
    },
 
-    async getWorkbench(organizationId: string): Promise {
-      return await getWorkbenchCompat(organizationId);
+    async getWorkspace(organizationId: string): Promise {
+      return await getWorkspaceCompat(organizationId);
    },
 
-    subscribeWorkbench(organizationId: string, listener: () => void): () => void {
-      return subscribeWorkbench(organizationId, listener);
+    subscribeWorkspace(organizationId: string, listener: () => void): () => void {
+      return subscribeWorkspace(organizationId, listener);
    },
 
-    async createWorkbenchTask(organizationId: string, input: TaskWorkbenchCreateTaskInput): Promise {
-      return (await organization(organizationId)).createWorkbenchTask(input);
+    async createWorkspaceTask(organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise {
+      return (await organization(organizationId)).createWorkspaceTask(await withAuthSessionInput(input));
    },
 
-    async markWorkbenchUnread(organizationId: string, input: TaskWorkbenchSelectInput): Promise {
-      await (await organization(organizationId)).markWorkbenchUnread(input);
+    async markWorkspaceUnread(organizationId: string, input: TaskWorkspaceSelectInput): Promise {
+      await (await organization(organizationId)).markWorkspaceUnread(await withAuthSessionInput(input));
    },
 
-    async renameWorkbenchTask(organizationId: string, input: TaskWorkbenchRenameInput): Promise {
-      await (await organization(organizationId)).renameWorkbenchTask(input);
+    async renameWorkspaceTask(organizationId: string, input: TaskWorkspaceRenameInput): Promise {
+      await (await organization(organizationId)).renameWorkspaceTask(await withAuthSessionInput(input));
    },
 
-    async renameWorkbenchBranch(organizationId: string, input: TaskWorkbenchRenameInput): Promise {
-      await (await organization(organizationId)).renameWorkbenchBranch(input);
+    async createWorkspaceSession(organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> {
+      return await (await organization(organizationId)).createWorkspaceSession(await withAuthSessionInput(input));
    },
 
-    async createWorkbenchSession(organizationId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ sessionId: string }> {
-      return await (await organization(organizationId)).createWorkbenchSession(input);
+    async renameWorkspaceSession(organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise {
+      await (await organization(organizationId)).renameWorkspaceSession(await withAuthSessionInput(input));
    },
 
-    async renameWorkbenchSession(organizationId: string, input: TaskWorkbenchRenameSessionInput): Promise {
-      await (await organization(organizationId)).renameWorkbenchSession(input);
+    async selectWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise {
+      await (await organization(organizationId)).selectWorkspaceSession(await withAuthSessionInput(input));
    },
 
-    async setWorkbenchSessionUnread(organizationId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise {
-      await (await organization(organizationId)).setWorkbenchSessionUnread(input);
+    async setWorkspaceSessionUnread(organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise {
+      await (await organization(organizationId)).setWorkspaceSessionUnread(await withAuthSessionInput(input));
    },
 
-    async updateWorkbenchDraft(organizationId: string, input: TaskWorkbenchUpdateDraftInput): Promise {
-      await (await organization(organizationId)).updateWorkbenchDraft(input);
+    async updateWorkspaceDraft(organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise {
+      await (await organization(organizationId)).updateWorkspaceDraft(await withAuthSessionInput(input));
    },
 
-    async changeWorkbenchModel(organizationId: string, input: TaskWorkbenchChangeModelInput): Promise {
-      await (await organization(organizationId)).changeWorkbenchModel(input);
+    async changeWorkspaceModel(organizationId: string, input: TaskWorkspaceChangeModelInput): Promise {
+      await (await organization(organizationId)).changeWorkspaceModel(await withAuthSessionInput(input));
    },
 
-    async sendWorkbenchMessage(organizationId: string, input: TaskWorkbenchSendMessageInput): Promise {
-      await (await organization(organizationId)).sendWorkbenchMessage(input);
+    async sendWorkspaceMessage(organizationId: string, input: TaskWorkspaceSendMessageInput): Promise {
+      await (await organization(organizationId)).sendWorkspaceMessage(await withAuthSessionInput(input));
    },
 
-    async stopWorkbenchSession(organizationId: string, input: TaskWorkbenchSessionInput): Promise {
-      await (await organization(organizationId)).stopWorkbenchSession(input);
+    async stopWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise {
+      await (await organization(organizationId)).stopWorkspaceSession(await withAuthSessionInput(input));
    },
 
-    async closeWorkbenchSession(organizationId: string, input: TaskWorkbenchSessionInput): Promise {
-      await (await organization(organizationId)).closeWorkbenchSession(input);
+    async closeWorkspaceSession(organizationId: string, input: TaskWorkspaceSessionInput): Promise {
+      await (await organization(organizationId)).closeWorkspaceSession(await withAuthSessionInput(input));
    },
 
-    async publishWorkbenchPr(organizationId: string, input: TaskWorkbenchSelectInput): Promise {
-      await (await organization(organizationId)).publishWorkbenchPr(input);
+    async publishWorkspacePr(organizationId: string, input: TaskWorkspaceSelectInput): Promise {
+      await (await organization(organizationId)).publishWorkspacePr(await withAuthSessionInput(input));
    },
 
-    async revertWorkbenchFile(organizationId: string, input: TaskWorkbenchDiffInput): Promise {
-      await (await organization(organizationId)).revertWorkbenchFile(input);
+    async changeWorkspaceTaskOwner(organizationId: string, input: TaskWorkspaceChangeOwnerInput): Promise {
+      await (await organization(organizationId)).changeWorkspaceTaskOwner(await withAuthSessionInput(input));
    },
 
-    async reloadGithubOrganization(organizationId: string): Promise {
-      await (await organization(organizationId)).reloadGithubOrganization();
+    async revertWorkspaceFile(organizationId: string, input: TaskWorkspaceDiffInput): Promise {
+      await (await organization(organizationId)).revertWorkspaceFile(await withAuthSessionInput(input));
    },
 
-    async reloadGithubPullRequests(organizationId: string): Promise {
-      await (await organization(organizationId)).reloadGithubPullRequests();
+    async adminReloadGithubOrganization(organizationId: string): Promise {
+      await (await organization(organizationId)).adminReloadGithubOrganization();
    },
 
-    async reloadGithubRepository(organizationId: string, repoId: string): Promise {
-      await (await organization(organizationId)).reloadGithubRepository({ repoId });
-    },
-
-    async reloadGithubPullRequest(organizationId: string, repoId: string, prNumber: number): Promise {
-      await (await organization(organizationId)).reloadGithubPullRequest({ repoId, prNumber });
+    async adminReloadGithubRepository(organizationId: string, repoId: string): Promise {
+      await (await organization(organizationId)).adminReloadGithubRepository({ repoId });
    },
 
    async health(): Promise<{ ok: true }> {
diff --git a/foundry/packages/client/src/index.ts b/foundry/packages/client/src/index.ts
index 87909a9..e28745f 100644
--- a/foundry/packages/client/src/index.ts
+++ b/foundry/packages/client/src/index.ts
@@ -8,4 +8,4 @@ export * from "./subscription/use-subscription.js";
 export * from "./keys.js";
 export * from "./mock-app.js";
 export * from "./view-model.js";
-export * from "./workbench-client.js";
+export * from "./workspace-client.js";
diff --git a/foundry/packages/client/src/keys.ts b/foundry/packages/client/src/keys.ts
index 314f16a..7242aae 100644
--- a/foundry/packages/client/src/keys.ts
+++ b/foundry/packages/client/src/keys.ts
@@ -4,18 +4,14 @@ export function organizationKey(organizationId: string): ActorKey {
   return ["org", organizationId];
 }
 
-export function repositoryKey(organizationId: string, repoId: string): ActorKey {
-  return ["org", organizationId, "repository", repoId];
-}
-
 export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey {
-  return ["org", organizationId, "repository", repoId, "task", taskId];
+  return ["org", organizationId, "task", repoId, taskId];
 }
 
 export function taskSandboxKey(organizationId: string, sandboxId: string): ActorKey {
   return ["org", organizationId, "sandbox", sandboxId];
 }
 
-export function historyKey(organizationId: string, repoId: string): ActorKey {
-  return ["org", organizationId, "repository", repoId, "history"];
+export function auditLogKey(organizationId: string): ActorKey {
+  return ["org", organizationId, "audit-log"];
 }
diff --git a/foundry/packages/client/src/mock-app.ts b/foundry/packages/client/src/mock-app.ts
index 0fa6fc7..00fd9ca 100644
--- a/foundry/packages/client/src/mock-app.ts
+++ b/foundry/packages/client/src/mock-app.ts
@@ -1,4 +1,8 @@
-import type { WorkbenchModelId } from "@sandbox-agent/foundry-shared";
+import { DEFAULT_WORKSPACE_MODEL_GROUPS, DEFAULT_WORKSPACE_MODEL_ID, type WorkspaceModelId } from "@sandbox-agent/foundry-shared";
+
+const claudeModels = DEFAULT_WORKSPACE_MODEL_GROUPS.find((group) => group.agentKind === "Claude")?.models ?? [];
+const CLAUDE_SECONDARY_MODEL_ID = claudeModels[1]?.id ?? claudeModels[0]?.id ?? DEFAULT_WORKSPACE_MODEL_ID;
+const CLAUDE_TERTIARY_MODEL_ID = claudeModels[2]?.id ?? CLAUDE_SECONDARY_MODEL_ID;
 import { injectMockLatency } from "./mock/latency.js";
 import rivetDevFixture from "../../../scripts/data/rivet-dev.json" with { type: "json" };
@@ -16,6 +20,7 @@ export interface MockFoundryUser {
   githubLogin: string;
   roleLabel: string;
   eligibleOrganizationIds: string[];
+  defaultModel: WorkspaceModelId;
 }
 
 export interface MockFoundryOrganizationMember {
@@ -61,7 +66,6 @@ export interface MockFoundryOrganizationSettings {
   slug: string;
   primaryDomain: string;
   seatAccrualMode: "first_prompt";
-  defaultModel: WorkbenchModelId;
   autoImportRepos: boolean;
 }
 
@@ -111,6 +115,7 @@ export interface MockFoundryAppClient {
   skipStarterRepo(): Promise;
   starStarterRepo(organizationId: string): Promise;
   selectOrganization(organizationId: string): Promise;
+  setDefaultModel(model: WorkspaceModelId): Promise;
   updateOrganizationProfile(input: UpdateMockOrganizationProfileInput): Promise;
   triggerGithubSync(organizationId: string): Promise;
   completeHostedCheckout(organizationId: string, planId: MockBillingPlanId): Promise;
@@ -180,7 +185,6 @@ function buildRivetOrganization(): MockFoundryOrganization {
     slug: "rivet",
     primaryDomain: "rivet.dev",
     seatAccrualMode: "first_prompt",
-    defaultModel: "gpt-5.3-codex",
     autoImportRepos: true,
   },
   github: {
@@ -233,6 +237,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       githubLogin: "nathan",
       roleLabel: "Founder",
       eligibleOrganizationIds: ["personal-nathan", "acme", "rivet"],
+      defaultModel: DEFAULT_WORKSPACE_MODEL_ID,
     },
     {
       id: "user-maya",
@@ -241,6 +246,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       githubLogin: "maya",
       roleLabel: "Staff Engineer",
       eligibleOrganizationIds: ["acme"],
+      defaultModel: CLAUDE_SECONDARY_MODEL_ID,
     },
     {
       id: "user-jamie",
@@ -249,6 +255,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       githubLogin: "jamie",
       roleLabel: "Platform Lead",
       eligibleOrganizationIds: ["personal-jamie", "rivet"],
+      defaultModel: CLAUDE_TERTIARY_MODEL_ID,
     },
   ],
   organizations: [
@@ -261,7 +268,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       slug: "nathan",
       primaryDomain: "personal",
       seatAccrualMode: "first_prompt",
-      defaultModel: "claude-sonnet-4",
       autoImportRepos: true,
     },
     github: {
@@ -297,7 +303,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       slug: "acme",
       primaryDomain: "acme.dev",
       seatAccrualMode: "first_prompt",
-      defaultModel: "claude-sonnet-4",
       autoImportRepos: true,
     },
     github: {
@@ -342,7 +347,6 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
       slug: "jamie",
       primaryDomain: "personal",
       seatAccrualMode: "first_prompt",
-      defaultModel: "claude-opus-4",
       autoImportRepos: true,
     },
     github: {
@@ -538,6 +542,18 @@ class MockFoundryAppStore implements MockFoundryAppClient {
     }
   }
 
+  async setDefaultModel(model: WorkspaceModelId): Promise {
+    await this.injectAsyncLatency();
+    const currentUserId = this.snapshot.auth.currentUserId;
+    if (!currentUserId) {
+      throw new Error("No signed-in mock user");
+    }
+    this.updateSnapshot((current) => ({
+      ...current,
+      users: current.users.map((user) => (user.id === currentUserId ? { ...user, defaultModel: model } : user)),
+    }));
+  }
+
   async updateOrganizationProfile(input: UpdateMockOrganizationProfileInput): Promise {
     await this.injectAsyncLatency();
     this.requireOrganization(input.organizationId);
diff --git a/foundry/packages/client/src/mock/backend-client.ts b/foundry/packages/client/src/mock/backend-client.ts
index 011192d..191f68c 100644
--- a/foundry/packages/client/src/mock/backend-client.ts
+++ b/foundry/packages/client/src/mock/backend-client.ts
@@ -6,25 +6,26 @@ import type {
   SessionEvent,
   TaskRecord,
   TaskSummary,
-  TaskWorkbenchChangeModelInput,
-  TaskWorkbenchCreateTaskInput,
-  TaskWorkbenchCreateTaskResponse,
-  TaskWorkbenchDiffInput,
-  TaskWorkbenchRenameInput,
-  TaskWorkbenchRenameSessionInput,
-  TaskWorkbenchSelectInput,
-  TaskWorkbenchSetSessionUnreadInput,
-  TaskWorkbenchSendMessageInput,
-  TaskWorkbenchSnapshot,
-  TaskWorkbenchSessionInput,
-  TaskWorkbenchUpdateDraftInput,
+  TaskWorkspaceChangeModelInput,
+  TaskWorkspaceCreateTaskInput,
+  TaskWorkspaceCreateTaskResponse,
+  TaskWorkspaceDiffInput,
+  TaskWorkspaceRenameInput,
+  TaskWorkspaceRenameSessionInput,
+  TaskWorkspaceSelectInput,
+  TaskWorkspaceSetSessionUnreadInput,
+  TaskWorkspaceSendMessageInput,
+  TaskWorkspaceSnapshot,
+  TaskWorkspaceSessionInput,
+  TaskWorkspaceUpdateDraftInput,
   TaskEvent,
-  WorkbenchSessionDetail,
-  WorkbenchTaskDetail,
-  WorkbenchTaskSummary,
+  WorkspaceSessionDetail,
+  WorkspaceModelGroup,
+  WorkspaceTaskDetail,
+  WorkspaceTaskSummary,
   OrganizationEvent,
   OrganizationSummarySnapshot,
-  HistoryEvent,
+  AuditLogEvent as HistoryEvent,
   HistoryQueryInput,
   SandboxProviderId,
   RepoOverview,
@@ -32,9 +33,10 @@ import type {
   StarSandboxAgentRepoResult,
   SwitchResult,
 } from "@sandbox-agent/foundry-shared";
+import { DEFAULT_WORKSPACE_MODEL_GROUPS } from "@sandbox-agent/foundry-shared";
 import type { ProcessCreateRequest, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent";
 import type { ActorConn, BackendClient, SandboxProcessRecord, SandboxSessionEventRecord, SandboxSessionRecord } from "../backend-client.js";
-import { getSharedMockWorkbenchClient } from "./workbench-client.js";
+import { getSharedMockWorkspaceClient } from "./workspace-client.js";
 
 interface MockProcessRecord extends SandboxProcessRecord {
   logText: string;
@@ -89,7 +91,7 @@ function toTaskStatus(status: TaskRecord["status"], archived: boolean): TaskReco
 }
 
 export function createMockBackendClient(defaultOrganizationId = "default"): BackendClient {
-  const workbench = getSharedMockWorkbenchClient();
+  const workspace = getSharedMockWorkspaceClient();
   const listenersBySandboxId = new Map void>>();
   const processesBySandboxId = new Map();
   const connectionListeners = new Map void>>();
@@ -97,7 +99,7 @@
   let nextProcessId = 1;
 
   const requireTask = (taskId: string) => {
-    const task = workbench.getSnapshot().tasks.find((candidate) => candidate.id === taskId);
+    const task = workspace.getSnapshot().tasks.find((candidate) => candidate.id === taskId);
     if (!task) {
       throw new Error(`Unknown mock task ${taskId}`);
     }
@@ -164,7 +166,7 @@
     async dispose(): Promise {},
   });
 
-  const buildTaskSummary = (task: TaskWorkbenchSnapshot["tasks"][number]): WorkbenchTaskSummary => ({
+  const buildTaskSummary = (task: TaskWorkspaceSnapshot["tasks"][number]): WorkspaceTaskSummary => ({
     id: task.id,
     repoId: task.repoId,
     title: task.title,
@@ -173,6 +175,7 @@
     updatedAtMs: task.updatedAtMs,
     branch: task.branch,
     pullRequest: task.pullRequest,
+    activeSessionId: task.activeSessionId ?? task.sessions[0]?.id ?? null,
     sessionsSummary: task.sessions.map((tab) => ({
       id: tab.id,
       sessionId: tab.sessionId,
@@ -185,18 +188,13 @@
       unread: tab.unread,
       created: tab.created,
     })),
+    primaryUserLogin: null,
+    primaryUserAvatarUrl: null,
   });
 
-  const buildTaskDetail = (task: TaskWorkbenchSnapshot["tasks"][number]): WorkbenchTaskDetail => ({
+  const buildTaskDetail = (task: TaskWorkspaceSnapshot["tasks"][number]): WorkspaceTaskDetail => ({
     ...buildTaskSummary(task),
     task: task.title,
-    agentType: task.sessions[0]?.agent === "Codex" ? "codex" : "claude",
-    runtimeStatus: toTaskStatus(task.status === "archived" ? "archived" : "running", task.status === "archived"),
-    statusMessage: task.status === "archived" ? "archived" : "mock sandbox ready",
-    activeSessionId: task.sessions[0]?.sessionId ?? null,
-    diffStat: task.fileChanges.length > 0 ? `+${task.fileChanges.length}/-${task.fileChanges.length}` : "+0/-0",
-    prUrl: task.pullRequest ?
`https://example.test/pr/${task.pullRequest.number}` : null, - reviewStatus: null, fileChanges: task.fileChanges, diffs: task.diffs, fileTree: task.fileTree, @@ -206,12 +204,13 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back sandboxProviderId: "local", sandboxId: task.id, cwd: mockCwd(task.repoName, task.id), + url: null, }, ], activeSandboxId: task.id, }); - const buildSessionDetail = (task: TaskWorkbenchSnapshot["tasks"][number], sessionId: string): WorkbenchSessionDetail => { + const buildSessionDetail = (task: TaskWorkspaceSnapshot["tasks"][number], sessionId: string): WorkspaceSessionDetail => { const tab = task.sessions.find((candidate) => candidate.id === sessionId); if (!tab) { throw new Error(`Unknown mock session ${sessionId} for task ${task.id}`); @@ -232,10 +231,24 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }; const buildOrganizationSummary = (): OrganizationSummarySnapshot => { - const snapshot = workbench.getSnapshot(); + const snapshot = workspace.getSnapshot(); const taskSummaries = snapshot.tasks.map(buildTaskSummary); return { organizationId: defaultOrganizationId, + github: { + connectedAccount: "mock", + installationStatus: "connected", + syncStatus: "synced", + importedRepoCount: snapshot.repos.length, + lastSyncLabel: "Synced just now", + lastSyncAt: nowMs(), + lastWebhookAt: null, + lastWebhookEvent: "", + syncGeneration: 1, + syncPhase: null, + processedRepositoryCount: snapshot.repos.length, + totalRepositoryCount: snapshot.repos.length, + }, repos: snapshot.repos.map((repo) => { const repoTasks = taskSummaries.filter((task) => task.repoId === repo.id); return { @@ -246,7 +259,6 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }; }), taskSummaries, - openPullRequests: [], }; }; @@ -256,20 +268,16 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back 
`sandbox:${organizationId}:${sandboxProviderId}:${sandboxId}`; const emitOrganizationSnapshot = (): void => { - const summary = buildOrganizationSummary(); - const latestTask = [...summary.taskSummaries].sort((left, right) => right.updatedAtMs - left.updatedAtMs)[0] ?? null; - if (latestTask) { - emitConnectionEvent(organizationScope(defaultOrganizationId), "organizationUpdated", { - type: "taskSummaryUpdated", - taskSummary: latestTask, - } satisfies OrganizationEvent); - } + emitConnectionEvent(organizationScope(defaultOrganizationId), "organizationUpdated", { + type: "organizationUpdated", + snapshot: buildOrganizationSummary(), + } satisfies OrganizationEvent); }; const emitTaskUpdate = (taskId: string): void => { const task = requireTask(taskId); emitConnectionEvent(taskScope(defaultOrganizationId, task.repoId, task.id), "taskUpdated", { - type: "taskDetailUpdated", + type: "taskUpdated", detail: buildTaskDetail(task), } satisfies TaskEvent); }; @@ -303,9 +311,8 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back task: task.title, sandboxProviderId: "local", status: toTaskStatus(archived ? "archived" : "running", archived), - statusMessage: archived ? "archived" : "mock sandbox ready", + pullRequest: null, activeSandboxId: task.id, - activeSessionId: task.sessions[0]?.sessionId ?? null, sandboxes: [ { sandboxId: task.id, @@ -317,17 +324,6 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back updatedAt: task.updatedAtMs, }, ], - agentType: task.sessions[0]?.agent === "Codex" ? "codex" : "claude", - prSubmitted: Boolean(task.pullRequest), - diffStat: task.fileChanges.length > 0 ? `+${task.fileChanges.length}/-${task.fileChanges.length}` : "+0/-0", - prUrl: task.pullRequest ? `https://example.test/pr/${task.pullRequest.number}` : null, - prAuthor: task.pullRequest ? 
"mock" : null, - ciStatus: null, - reviewStatus: null, - reviewer: null, - conflictsWithMain: "0", - hasUnpushed: task.fileChanges.length > 0 ? "1" : "0", - parentBranch: null, createdAt: task.updatedAtMs, updatedAt: task.updatedAtMs, }; @@ -400,6 +396,10 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back return unsupportedAppSnapshot(); }, + async setAppDefaultModel(): Promise { + return unsupportedAppSnapshot(); + }, + async updateAppOrganizationProfile(): Promise { return unsupportedAppSnapshot(); }, @@ -433,7 +433,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }, async listRepos(_organizationId: string): Promise { - return workbench.getSnapshot().repos.map((repo) => ({ + return workspace.getSnapshot().repos.map((repo) => ({ organizationId: defaultOrganizationId, repoId: repo.id, remoteUrl: mockRepoRemote(repo.label), @@ -447,7 +447,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }, async listTasks(_organizationId: string, repoId?: string): Promise { - return workbench + return workspace .getSnapshot() .tasks.filter((task) => !repoId || task.repoId === repoId) .map((task) => ({ @@ -457,6 +457,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back branchName: task.branch, title: task.title, status: task.status === "archived" ? 
"archived" : "running", + pullRequest: null, updatedAt: task.updatedAtMs, })); }, @@ -464,7 +465,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back async getRepoOverview(_organizationId: string, _repoId: string): Promise { notSupported("getRepoOverview"); }, - async getTask(_organizationId: string, taskId: string): Promise { + async getTask(_organizationId: string, _repoId: string, taskId: string): Promise { return buildTaskRecord(taskId); }, @@ -472,7 +473,7 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back return []; }, - async switchTask(_organizationId: string, taskId: string): Promise { + async switchTask(_organizationId: string, _repoId: string, taskId: string): Promise { return { organizationId: defaultOrganizationId, taskId, @@ -481,14 +482,14 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back }; }, - async attachTask(_organizationId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> { + async attachTask(_organizationId: string, _repoId: string, taskId: string): Promise<{ target: string; sessionId: string | null }> { return { target: `mock://${taskId}`, sessionId: requireTask(taskId).sessions[0]?.sessionId ?? 
null, }; }, - async runAction(_organizationId: string, _taskId: string): Promise { + async runAction(_organizationId: string, _repoId: string, _taskId: string): Promise { notSupported("runAction"); }, @@ -637,28 +638,32 @@ export function createMockBackendClient(defaultOrganizationId = "default"): Back return { endpoint: "mock://terminal-unavailable" }; }, + async getSandboxWorkspaceModelGroups(_organizationId: string, _sandboxProviderId: SandboxProviderId, _sandboxId: string): Promise { + return DEFAULT_WORKSPACE_MODEL_GROUPS; + }, + async getOrganizationSummary(): Promise { return buildOrganizationSummary(); }, - async getTaskDetail(_organizationId: string, _repoId: string, taskId: string): Promise { + async getTaskDetail(_organizationId: string, _repoId: string, taskId: string): Promise { return buildTaskDetail(requireTask(taskId)); }, - async getSessionDetail(_organizationId: string, _repoId: string, taskId: string, sessionId: string): Promise { + async getSessionDetail(_organizationId: string, _repoId: string, taskId: string, sessionId: string): Promise { return buildSessionDetail(requireTask(taskId), sessionId); }, - async getWorkbench(): Promise { - return workbench.getSnapshot(); + async getWorkspace(): Promise { + return workspace.getSnapshot(); }, - subscribeWorkbench(_organizationId: string, listener: () => void): () => void { - return workbench.subscribe(listener); + subscribeWorkspace(_organizationId: string, listener: () => void): () => void { + return workspace.subscribe(listener); }, - async createWorkbenchTask(_organizationId: string, input: TaskWorkbenchCreateTaskInput): Promise { - const created = await workbench.createTask(input); + async createWorkspaceTask(_organizationId: string, input: TaskWorkspaceCreateTaskInput): Promise { + const created = await workspace.createTask(input); emitOrganizationSnapshot(); emitTaskUpdate(created.taskId); if (created.sessionId) { @@ -667,99 +672,104 @@ export function 
createMockBackendClient(defaultOrganizationId = "default"): Back return created; }, - async markWorkbenchUnread(_organizationId: string, input: TaskWorkbenchSelectInput): Promise { - await workbench.markTaskUnread(input); + async markWorkspaceUnread(_organizationId: string, input: TaskWorkspaceSelectInput): Promise { + await workspace.markTaskUnread(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); }, - async renameWorkbenchTask(_organizationId: string, input: TaskWorkbenchRenameInput): Promise { - await workbench.renameTask(input); + async renameWorkspaceTask(_organizationId: string, input: TaskWorkspaceRenameInput): Promise { + await workspace.renameTask(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); }, - async renameWorkbenchBranch(_organizationId: string, input: TaskWorkbenchRenameInput): Promise { - await workbench.renameBranch(input); - emitOrganizationSnapshot(); - emitTaskUpdate(input.taskId); - }, - - async createWorkbenchSession(_organizationId: string, input: TaskWorkbenchSelectInput & { model?: string }): Promise<{ sessionId: string }> { - const created = await workbench.addSession(input); + async createWorkspaceSession(_organizationId: string, input: TaskWorkspaceSelectInput & { model?: string }): Promise<{ sessionId: string }> { + const created = await workspace.addSession(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, created.sessionId); return created; }, - async renameWorkbenchSession(_organizationId: string, input: TaskWorkbenchRenameSessionInput): Promise { - await workbench.renameSession(input); + async renameWorkspaceSession(_organizationId: string, input: TaskWorkspaceRenameSessionInput): Promise { + await workspace.renameSession(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async setWorkbenchSessionUnread(_organizationId: string, input: TaskWorkbenchSetSessionUnreadInput): Promise { - 
await workbench.setSessionUnread(input); + async selectWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise { + await workspace.selectSession(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async updateWorkbenchDraft(_organizationId: string, input: TaskWorkbenchUpdateDraftInput): Promise { - await workbench.updateDraft(input); + async setWorkspaceSessionUnread(_organizationId: string, input: TaskWorkspaceSetSessionUnreadInput): Promise { + await workspace.setSessionUnread(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async changeWorkbenchModel(_organizationId: string, input: TaskWorkbenchChangeModelInput): Promise { - await workbench.changeModel(input); + async updateWorkspaceDraft(_organizationId: string, input: TaskWorkspaceUpdateDraftInput): Promise { + await workspace.updateDraft(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async sendWorkbenchMessage(_organizationId: string, input: TaskWorkbenchSendMessageInput): Promise { - await workbench.sendMessage(input); + async changeWorkspaceModel(_organizationId: string, input: TaskWorkspaceChangeModelInput): Promise { + await workspace.changeModel(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async stopWorkbenchSession(_organizationId: string, input: TaskWorkbenchSessionInput): Promise { - await workbench.stopAgent(input); + async sendWorkspaceMessage(_organizationId: string, input: TaskWorkspaceSendMessageInput): Promise { + await workspace.sendMessage(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); emitSessionUpdate(input.taskId, input.sessionId); }, - async closeWorkbenchSession(_organizationId: string, input: TaskWorkbenchSessionInput): Promise { - await 
workbench.closeSession(input); + async stopWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise { + await workspace.stopAgent(input); + emitOrganizationSnapshot(); + emitTaskUpdate(input.taskId); + emitSessionUpdate(input.taskId, input.sessionId); + }, + + async closeWorkspaceSession(_organizationId: string, input: TaskWorkspaceSessionInput): Promise { + await workspace.closeSession(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); }, - async publishWorkbenchPr(_organizationId: string, input: TaskWorkbenchSelectInput): Promise { - await workbench.publishPr(input); + async publishWorkspacePr(_organizationId: string, input: TaskWorkspaceSelectInput): Promise { + await workspace.publishPr(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); }, - async revertWorkbenchFile(_organizationId: string, input: TaskWorkbenchDiffInput): Promise { - await workbench.revertFile(input); + async changeWorkspaceTaskOwner( + _organizationId: string, + input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string }, + ): Promise { + await workspace.changeOwner(input); emitOrganizationSnapshot(); emitTaskUpdate(input.taskId); }, - async reloadGithubOrganization(): Promise {}, + async revertWorkspaceFile(_organizationId: string, input: TaskWorkspaceDiffInput): Promise { + await workspace.revertFile(input); + emitOrganizationSnapshot(); + emitTaskUpdate(input.taskId); + }, - async reloadGithubPullRequests(): Promise {}, - - async reloadGithubRepository(): Promise {}, - - async reloadGithubPullRequest(): Promise {}, + async adminReloadGithubOrganization(): Promise {}, + async adminReloadGithubRepository(): Promise {}, async health(): Promise<{ ok: true }> { return { ok: true }; diff --git a/foundry/packages/client/src/mock/workbench-client.ts b/foundry/packages/client/src/mock/workspace-client.ts similarity index 74% rename from foundry/packages/client/src/mock/workbench-client.ts 
rename to foundry/packages/client/src/mock/workspace-client.ts index fbed2d0..7983e0f 100644 --- a/foundry/packages/client/src/mock/workbench-client.ts +++ b/foundry/packages/client/src/mock/workspace-client.ts @@ -1,33 +1,34 @@ import { MODEL_GROUPS, buildInitialMockLayoutViewModel, - groupWorkbenchRepositories, + groupWorkspaceRepositories, nowMs, providerAgent, randomReply, removeFileTreePath, slugify, uid, -} from "../workbench-model.js"; +} from "../workspace-model.js"; +import { DEFAULT_WORKSPACE_MODEL_ID, workspaceAgentForModel } from "@sandbox-agent/foundry-shared"; import type { - TaskWorkbenchAddSessionResponse, - TaskWorkbenchChangeModelInput, - TaskWorkbenchCreateTaskInput, - TaskWorkbenchCreateTaskResponse, - TaskWorkbenchDiffInput, - TaskWorkbenchRenameInput, - TaskWorkbenchRenameSessionInput, - TaskWorkbenchSelectInput, - TaskWorkbenchSetSessionUnreadInput, - TaskWorkbenchSendMessageInput, - TaskWorkbenchSnapshot, - TaskWorkbenchSessionInput, - TaskWorkbenchUpdateDraftInput, - WorkbenchSession as AgentSession, - WorkbenchTask as Task, - WorkbenchTranscriptEvent as TranscriptEvent, + TaskWorkspaceAddSessionResponse, + TaskWorkspaceChangeModelInput, + TaskWorkspaceCreateTaskInput, + TaskWorkspaceCreateTaskResponse, + TaskWorkspaceDiffInput, + TaskWorkspaceRenameInput, + TaskWorkspaceRenameSessionInput, + TaskWorkspaceSelectInput, + TaskWorkspaceSetSessionUnreadInput, + TaskWorkspaceSendMessageInput, + TaskWorkspaceSnapshot, + TaskWorkspaceSessionInput, + TaskWorkspaceUpdateDraftInput, + WorkspaceSession as AgentSession, + WorkspaceTask as Task, + WorkspaceTranscriptEvent as TranscriptEvent, } from "@sandbox-agent/foundry-shared"; -import type { TaskWorkbenchClient } from "../workbench-client.js"; +import type { TaskWorkspaceClient } from "../workspace-client.js"; function buildTranscriptEvent(params: { sessionId: string; @@ -47,12 +48,12 @@ function buildTranscriptEvent(params: { }; } -class MockWorkbenchStore implements TaskWorkbenchClient { +class 
MockWorkspaceStore implements TaskWorkspaceClient { private snapshot = buildInitialMockLayoutViewModel(); private listeners = new Set<() => void>(); private pendingTimers = new Map>(); - getSnapshot(): TaskWorkbenchSnapshot { + getSnapshot(): TaskWorkspaceSnapshot { return this.snapshot; } @@ -63,7 +64,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { }; } - async createTask(input: TaskWorkbenchCreateTaskInput): Promise { + async createTask(input: TaskWorkspaceCreateTaskInput): Promise { const id = uid(); const sessionId = `session-${id}`; const repo = this.snapshot.repos.find((candidate) => candidate.id === input.repoId); @@ -74,20 +75,19 @@ class MockWorkbenchStore implements TaskWorkbenchClient { id, repoId: repo.id, title: input.title?.trim() || "New Task", - status: "new", + status: "init_enqueue_provision", repoName: repo.label, updatedAtMs: nowMs(), branch: input.branch?.trim() || null, pullRequest: null, + activeSessionId: sessionId, sessions: [ { id: sessionId, sessionId: sessionId, sessionName: "Session 1", - agent: providerAgent( - MODEL_GROUPS.find((group) => group.models.some((model) => model.id === (input.model ?? "claude-sonnet-4")))?.provider ?? "Claude", - ), - model: input.model ?? "claude-sonnet-4", + agent: workspaceAgentForModel(input.model ?? DEFAULT_WORKSPACE_MODEL_ID, MODEL_GROUPS), + model: input.model ?? DEFAULT_WORKSPACE_MODEL_ID, status: "idle", thinkingSinceMs: null, unread: false, @@ -109,7 +109,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { return { taskId: id, sessionId }; } - async markTaskUnread(input: TaskWorkbenchSelectInput): Promise { + async markTaskUnread(input: TaskWorkspaceSelectInput): Promise { this.updateTask(input.taskId, (task) => { const targetSession = task.sessions[task.sessions.length - 1] ?? 
null; if (!targetSession) { @@ -123,7 +123,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { }); } - async renameTask(input: TaskWorkbenchRenameInput): Promise { + async renameTask(input: TaskWorkspaceRenameInput): Promise { const value = input.value.trim(); if (!value) { throw new Error(`Cannot rename task ${input.taskId} to an empty title`); @@ -131,28 +131,32 @@ class MockWorkbenchStore implements TaskWorkbenchClient { this.updateTask(input.taskId, (task) => ({ ...task, title: value, updatedAtMs: nowMs() })); } - async renameBranch(input: TaskWorkbenchRenameInput): Promise { - const value = input.value.trim(); - if (!value) { - throw new Error(`Cannot rename branch for task ${input.taskId} to an empty value`); - } - this.updateTask(input.taskId, (task) => ({ ...task, branch: value, updatedAtMs: nowMs() })); - } - - async archiveTask(input: TaskWorkbenchSelectInput): Promise { + async archiveTask(input: TaskWorkspaceSelectInput): Promise { this.updateTask(input.taskId, (task) => ({ ...task, status: "archived", updatedAtMs: nowMs() })); } - async publishPr(input: TaskWorkbenchSelectInput): Promise { + async publishPr(input: TaskWorkspaceSelectInput): Promise { const nextPrNumber = Math.max(0, ...this.snapshot.tasks.map((task) => task.pullRequest?.number ?? 0)) + 1; this.updateTask(input.taskId, (task) => ({ ...task, updatedAtMs: nowMs(), - pullRequest: { number: nextPrNumber, status: "ready" }, + pullRequest: { + number: nextPrNumber, + status: "ready", + title: task.title, + state: "open", + url: `https://example.test/pr/${nextPrNumber}`, + headRefName: task.branch ?? 
`task/${task.id}`, + baseRefName: "main", + repoFullName: task.repoName, + authorLogin: "mock", + isDraft: false, + updatedAtMs: nowMs(), + }, })); } - async revertFile(input: TaskWorkbenchDiffInput): Promise { + async revertFile(input: TaskWorkspaceDiffInput): Promise { this.updateTask(input.taskId, (task) => { const file = task.fileChanges.find((entry) => entry.path === input.path); const nextDiffs = { ...task.diffs }; @@ -167,7 +171,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { }); } - async updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise { + async updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise { this.assertSession(input.taskId, input.sessionId); this.updateTask(input.taskId, (task) => ({ ...task, @@ -187,7 +191,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { })); } - async sendMessage(input: TaskWorkbenchSendMessageInput): Promise { + async sendMessage(input: TaskWorkspaceSendMessageInput): Promise { const text = input.text.trim(); if (!text) { throw new Error(`Cannot send an empty mock prompt for task ${input.taskId}`); @@ -197,7 +201,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { const startedAtMs = nowMs(); this.updateTask(input.taskId, (currentTask) => { - const isFirstOnTask = currentTask.status === "new"; + const isFirstOnTask = String(currentTask.status).startsWith("init_"); const newTitle = isFirstOnTask ? (text.length > 50 ? `${text.slice(0, 47)}...` : text) : currentTask.title; const newBranch = isFirstOnTask ? 
`feat/${slugify(newTitle)}` : currentTask.branch; const userMessageLines = [text, ...input.attachments.map((attachment) => `@ ${attachment.filePath}:${attachment.lineNumber}`)]; @@ -288,7 +292,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { this.pendingTimers.set(input.sessionId, timer); } - async stopAgent(input: TaskWorkbenchSessionInput): Promise { + async stopAgent(input: TaskWorkspaceSessionInput): Promise { this.assertSession(input.taskId, input.sessionId); const existing = this.pendingTimers.get(input.sessionId); if (existing) { @@ -311,14 +315,22 @@ class MockWorkbenchStore implements TaskWorkbenchClient { }); } - async setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise { + async selectSession(input: TaskWorkspaceSessionInput): Promise { + this.assertSession(input.taskId, input.sessionId); + this.updateTask(input.taskId, (currentTask) => ({ + ...currentTask, + activeSessionId: input.sessionId, + })); + } + + async setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise { this.updateTask(input.taskId, (currentTask) => ({ ...currentTask, sessions: currentTask.sessions.map((candidate) => (candidate.id === input.sessionId ? 
{ ...candidate, unread: input.unread } : candidate)), })); } - async renameSession(input: TaskWorkbenchRenameSessionInput): Promise { + async renameSession(input: TaskWorkspaceRenameSessionInput): Promise { const title = input.title.trim(); if (!title) { throw new Error(`Cannot rename session ${input.sessionId} to an empty title`); @@ -329,7 +341,7 @@ class MockWorkbenchStore implements TaskWorkbenchClient { })); } - async closeSession(input: TaskWorkbenchSessionInput): Promise { + async closeSession(input: TaskWorkspaceSessionInput): Promise { this.updateTask(input.taskId, (currentTask) => { if (currentTask.sessions.length <= 1) { return currentTask; @@ -337,12 +349,16 @@ class MockWorkbenchStore implements TaskWorkbenchClient { return { ...currentTask, + activeSessionId: + currentTask.activeSessionId === input.sessionId + ? (currentTask.sessions.find((candidate) => candidate.id !== input.sessionId)?.id ?? null) + : currentTask.activeSessionId, sessions: currentTask.sessions.filter((candidate) => candidate.id !== input.sessionId), }; }); } - async addSession(input: TaskWorkbenchSelectInput): Promise { + async addSession(input: TaskWorkspaceSelectInput): Promise { this.assertTask(input.taskId); const nextSessionId = uid(); const nextSession: AgentSession = { @@ -350,8 +366,8 @@ class MockWorkbenchStore implements TaskWorkbenchClient { sessionId: nextSessionId, sandboxSessionId: null, sessionName: `Session ${this.requireTask(input.taskId).sessions.length + 1}`, - agent: "Claude", - model: "claude-sonnet-4", + agent: workspaceAgentForModel(DEFAULT_WORKSPACE_MODEL_ID, MODEL_GROUPS), + model: DEFAULT_WORKSPACE_MODEL_ID, status: "idle", thinkingSinceMs: null, unread: false, @@ -363,12 +379,13 @@ class MockWorkbenchStore implements TaskWorkbenchClient { this.updateTask(input.taskId, (currentTask) => ({ ...currentTask, updatedAtMs: nowMs(), + activeSessionId: nextSession.id, sessions: [...currentTask.sessions, nextSession], })); return { sessionId: nextSession.id }; } - 
async changeModel(input: TaskWorkbenchChangeModelInput): Promise { + async changeModel(input: TaskWorkspaceChangeModelInput): Promise { const group = MODEL_GROUPS.find((candidate) => candidate.models.some((entry) => entry.id === input.model)); if (!group) { throw new Error(`Unable to resolve model provider for ${input.model}`); @@ -377,16 +394,24 @@ class MockWorkbenchStore implements TaskWorkbenchClient { this.updateTask(input.taskId, (currentTask) => ({ ...currentTask, sessions: currentTask.sessions.map((candidate) => - candidate.id === input.sessionId ? { ...candidate, model: input.model, agent: providerAgent(group.provider) } : candidate, + candidate.id === input.sessionId ? { ...candidate, model: input.model, agent: workspaceAgentForModel(input.model, MODEL_GROUPS) } : candidate, ), })); } - private updateState(updater: (current: TaskWorkbenchSnapshot) => TaskWorkbenchSnapshot): void { + async changeOwner(input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string }): Promise { + this.updateTask(input.taskId, (currentTask) => ({ + ...currentTask, + primaryUserLogin: input.targetUserName, + primaryUserAvatarUrl: null, + })); + } + + private updateState(updater: (current: TaskWorkspaceSnapshot) => TaskWorkspaceSnapshot): void { const nextSnapshot = updater(this.snapshot); this.snapshot = { ...nextSnapshot, - repositories: groupWorkbenchRepositories(nextSnapshot.repos, nextSnapshot.tasks), + repositories: groupWorkspaceRepositories(nextSnapshot.repos, nextSnapshot.tasks), }; this.notify(); } @@ -436,11 +461,11 @@ function candidateEventIndex(task: Task, sessionId: string): number { return (session?.transcript.length ?? 
0) + 1; }

-let sharedMockWorkbenchClient: TaskWorkbenchClient | null = null;
+let sharedMockWorkspaceClient: TaskWorkspaceClient | null = null;

-export function getSharedMockWorkbenchClient(): TaskWorkbenchClient {
-  if (!sharedMockWorkbenchClient) {
-    sharedMockWorkbenchClient = new MockWorkbenchStore();
+export function getSharedMockWorkspaceClient(): TaskWorkspaceClient {
+  if (!sharedMockWorkspaceClient) {
+    sharedMockWorkspaceClient = new MockWorkspaceStore();
   }
-  return sharedMockWorkbenchClient;
+  return sharedMockWorkspaceClient;
 }
diff --git a/foundry/packages/client/src/remote/app-client.ts b/foundry/packages/client/src/remote/app-client.ts
index 6daa2c5..f1cb908 100644
--- a/foundry/packages/client/src/remote/app-client.ts
+++ b/foundry/packages/client/src/remote/app-client.ts
@@ -1,4 +1,4 @@
-import type { FoundryAppSnapshot, FoundryBillingPlanId, UpdateFoundryOrganizationProfileInput } from "@sandbox-agent/foundry-shared";
+import type { FoundryAppSnapshot, FoundryBillingPlanId, UpdateFoundryOrganizationProfileInput, WorkspaceModelId } from "@sandbox-agent/foundry-shared";
 import type { BackendClient } from "../backend-client.js";
 import type { FoundryAppClient } from "../app-client.js";
 
@@ -72,6 +72,11 @@
     this.notify();
   }
 
+  async setDefaultModel(model: WorkspaceModelId): Promise<void> {
+    this.snapshot = await this.backend.setAppDefaultModel(model);
+    this.notify();
+  }
+
   async updateOrganizationProfile(input: UpdateFoundryOrganizationProfileInput): Promise<void> {
     this.snapshot = await this.backend.updateAppOrganizationProfile(input);
     this.notify();
diff --git a/foundry/packages/client/src/remote/workbench-client.ts b/foundry/packages/client/src/remote/workbench-client.ts
deleted file mode 100644
index 0dcbecb..0000000
--- a/foundry/packages/client/src/remote/workbench-client.ts
+++ /dev/null
@@ -1,198 +0,0 @@
-import type {
-  TaskWorkbenchAddSessionResponse,
-  TaskWorkbenchChangeModelInput,
-  TaskWorkbenchCreateTaskInput,
-  TaskWorkbenchCreateTaskResponse,
-  TaskWorkbenchDiffInput,
-  TaskWorkbenchRenameInput,
-  TaskWorkbenchRenameSessionInput,
-  TaskWorkbenchSelectInput,
-  TaskWorkbenchSetSessionUnreadInput,
-  TaskWorkbenchSendMessageInput,
-  TaskWorkbenchSnapshot,
-  TaskWorkbenchSessionInput,
-  TaskWorkbenchUpdateDraftInput,
-} from "@sandbox-agent/foundry-shared";
-import type { BackendClient } from "../backend-client.js";
-import { groupWorkbenchRepositories } from "../workbench-model.js";
-import type { TaskWorkbenchClient } from "../workbench-client.js";
-
-export interface RemoteWorkbenchClientOptions {
-  backend: BackendClient;
-  organizationId: string;
-}
-
-class RemoteWorkbenchStore implements TaskWorkbenchClient {
-  private readonly backend: BackendClient;
-  private readonly organizationId: string;
-  private snapshot: TaskWorkbenchSnapshot;
-  private readonly listeners = new Set<() => void>();
-  private unsubscribeWorkbench: (() => void) | null = null;
-  private refreshPromise: Promise<void> | null = null;
-  private refreshRetryTimeout: ReturnType<typeof setTimeout> | null = null;
-
-  constructor(options: RemoteWorkbenchClientOptions) {
-    this.backend = options.backend;
-    this.organizationId = options.organizationId;
-    this.snapshot = {
-      organizationId: options.organizationId,
-      repos: [],
-      repositories: [],
-      tasks: [],
-    };
-  }
-
-  getSnapshot(): TaskWorkbenchSnapshot {
-    return this.snapshot;
-  }
-
-  subscribe(listener: () => void): () => void {
-    this.listeners.add(listener);
-    this.ensureStarted();
-    return () => {
-      this.listeners.delete(listener);
-      if (this.listeners.size === 0 && this.refreshRetryTimeout) {
-        clearTimeout(this.refreshRetryTimeout);
-        this.refreshRetryTimeout = null;
-      }
-      if (this.listeners.size === 0 && this.unsubscribeWorkbench) {
-        this.unsubscribeWorkbench();
-        this.unsubscribeWorkbench = null;
-      }
-    };
-  }
-
-  async createTask(input: TaskWorkbenchCreateTaskInput): Promise<TaskWorkbenchCreateTaskResponse> {
-    const created = await this.backend.createWorkbenchTask(this.organizationId, input);
-    await this.refresh();
-    return created;
-  }
-
-  async markTaskUnread(input: TaskWorkbenchSelectInput): Promise<void> {
-    await this.backend.markWorkbenchUnread(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async renameTask(input: TaskWorkbenchRenameInput): Promise<void> {
-    await this.backend.renameWorkbenchTask(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async renameBranch(input: TaskWorkbenchRenameInput): Promise<void> {
-    await this.backend.renameWorkbenchBranch(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async archiveTask(input: TaskWorkbenchSelectInput): Promise<void> {
-    await this.backend.runAction(this.organizationId, input.taskId, "archive");
-    await this.refresh();
-  }
-
-  async publishPr(input: TaskWorkbenchSelectInput): Promise<void> {
-    await this.backend.publishWorkbenchPr(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async revertFile(input: TaskWorkbenchDiffInput): Promise<void> {
-    await this.backend.revertWorkbenchFile(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise<void> {
-    await this.backend.updateWorkbenchDraft(this.organizationId, input);
-    // Skip refresh — the server broadcast will trigger it, and the frontend
-    // holds local draft state to avoid the round-trip overwriting user input.
-  }
-
-  async sendMessage(input: TaskWorkbenchSendMessageInput): Promise<void> {
-    await this.backend.sendWorkbenchMessage(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async stopAgent(input: TaskWorkbenchSessionInput): Promise<void> {
-    await this.backend.stopWorkbenchSession(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise<void> {
-    await this.backend.setWorkbenchSessionUnread(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async renameSession(input: TaskWorkbenchRenameSessionInput): Promise<void> {
-    await this.backend.renameWorkbenchSession(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async closeSession(input: TaskWorkbenchSessionInput): Promise<void> {
-    await this.backend.closeWorkbenchSession(this.organizationId, input);
-    await this.refresh();
-  }
-
-  async addSession(input: TaskWorkbenchSelectInput): Promise<TaskWorkbenchAddSessionResponse> {
-    const created = await this.backend.createWorkbenchSession(this.organizationId, input);
-    await this.refresh();
-    return created;
-  }
-
-  async changeModel(input: TaskWorkbenchChangeModelInput): Promise<void> {
-    await this.backend.changeWorkbenchModel(this.organizationId, input);
-    await this.refresh();
-  }
-
-  private ensureStarted(): void {
-    if (!this.unsubscribeWorkbench) {
-      this.unsubscribeWorkbench = this.backend.subscribeWorkbench(this.organizationId, () => {
-        void this.refresh().catch(() => {
-          this.scheduleRefreshRetry();
-        });
-      });
-    }
-    void this.refresh().catch(() => {
-      this.scheduleRefreshRetry();
-    });
-  }
-
-  private scheduleRefreshRetry(): void {
-    if (this.refreshRetryTimeout || this.listeners.size === 0) {
-      return;
-    }
-
-    this.refreshRetryTimeout = setTimeout(() => {
-      this.refreshRetryTimeout = null;
-      void this.refresh().catch(() => {
-        this.scheduleRefreshRetry();
-      });
-    }, 1_000);
-  }
-
-  private async refresh(): Promise<void> {
-    if (this.refreshPromise) {
-      await this.refreshPromise;
-      return;
-    }
-
-    this.refreshPromise = (async () => {
-      const nextSnapshot = await this.backend.getWorkbench(this.organizationId);
-      if (this.refreshRetryTimeout) {
-        clearTimeout(this.refreshRetryTimeout);
-        this.refreshRetryTimeout = null;
-      }
-      this.snapshot = {
-        ...nextSnapshot,
-        repositories: nextSnapshot.repositories ?? groupWorkbenchRepositories(nextSnapshot.repos, nextSnapshot.tasks),
-      };
-      for (const listener of [...this.listeners]) {
-        listener();
-      }
-    })().finally(() => {
-      this.refreshPromise = null;
-    });
-
-    await this.refreshPromise;
-  }
-}
-
-export function createRemoteWorkbenchClient(options: RemoteWorkbenchClientOptions): TaskWorkbenchClient {
-  return new RemoteWorkbenchStore(options);
-}
diff --git a/foundry/packages/client/src/remote/workspace-client.ts b/foundry/packages/client/src/remote/workspace-client.ts
new file mode 100644
index 0000000..2a11f51
--- /dev/null
+++ b/foundry/packages/client/src/remote/workspace-client.ts
@@ -0,0 +1,204 @@
+import type {
+  TaskWorkspaceAddSessionResponse,
+  TaskWorkspaceChangeModelInput,
+  TaskWorkspaceChangeOwnerInput,
+  TaskWorkspaceCreateTaskInput,
+  TaskWorkspaceCreateTaskResponse,
+  TaskWorkspaceDiffInput,
+  TaskWorkspaceRenameInput,
+  TaskWorkspaceRenameSessionInput,
+  TaskWorkspaceSelectInput,
+  TaskWorkspaceSetSessionUnreadInput,
+  TaskWorkspaceSendMessageInput,
+  TaskWorkspaceSnapshot,
+  TaskWorkspaceSessionInput,
+  TaskWorkspaceUpdateDraftInput,
+} from "@sandbox-agent/foundry-shared";
+import type { BackendClient } from "../backend-client.js";
+import { groupWorkspaceRepositories } from "../workspace-model.js";
+import type { TaskWorkspaceClient } from "../workspace-client.js";
+
+export interface RemoteWorkspaceClientOptions {
+  backend: BackendClient;
+  organizationId: string;
+}
+
+class RemoteWorkspaceStore implements TaskWorkspaceClient {
+  private readonly backend: BackendClient;
+  private readonly organizationId: string;
+  private snapshot: TaskWorkspaceSnapshot;
+  private readonly listeners = new Set<() => void>();
+  private unsubscribeWorkspace: (() => void) | null = null;
+  private refreshPromise: Promise<void> | null = null;
+  private refreshRetryTimeout: ReturnType<typeof setTimeout> | null = null;
+
+  constructor(options: RemoteWorkspaceClientOptions) {
+    this.backend = options.backend;
+    this.organizationId = options.organizationId;
+    this.snapshot = {
+      organizationId: options.organizationId,
+      repos: [],
+      repositories: [],
+      tasks: [],
+    };
+  }
+
+  getSnapshot(): TaskWorkspaceSnapshot {
+    return this.snapshot;
+  }
+
+  subscribe(listener: () => void): () => void {
+    this.listeners.add(listener);
+    this.ensureStarted();
+    return () => {
+      this.listeners.delete(listener);
+      if (this.listeners.size === 0 && this.refreshRetryTimeout) {
+        clearTimeout(this.refreshRetryTimeout);
+        this.refreshRetryTimeout = null;
+      }
+      if (this.listeners.size === 0 && this.unsubscribeWorkspace) {
+        this.unsubscribeWorkspace();
+        this.unsubscribeWorkspace = null;
+      }
+    };
+  }
+
+  async createTask(input: TaskWorkspaceCreateTaskInput): Promise<TaskWorkspaceCreateTaskResponse> {
+    const created = await this.backend.createWorkspaceTask(this.organizationId, input);
+    await this.refresh();
+    return created;
+  }
+
+  async markTaskUnread(input: TaskWorkspaceSelectInput): Promise<void> {
+    await this.backend.markWorkspaceUnread(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async renameTask(input: TaskWorkspaceRenameInput): Promise<void> {
+    await this.backend.renameWorkspaceTask(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async archiveTask(input: TaskWorkspaceSelectInput): Promise<void> {
+    await this.backend.runAction(this.organizationId, input.repoId, input.taskId, "archive");
+    await this.refresh();
+  }
+
+  async publishPr(input: TaskWorkspaceSelectInput): Promise<void> {
+    await this.backend.publishWorkspacePr(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async revertFile(input: TaskWorkspaceDiffInput): Promise<void> {
+    await this.backend.revertWorkspaceFile(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise<void> {
+    await this.backend.updateWorkspaceDraft(this.organizationId, input);
+    // Skip refresh — the server broadcast will trigger it, and the frontend
+    // holds local draft state to avoid the round-trip overwriting user input.
+  }
+
+  async sendMessage(input: TaskWorkspaceSendMessageInput): Promise<void> {
+    await this.backend.sendWorkspaceMessage(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async stopAgent(input: TaskWorkspaceSessionInput): Promise<void> {
+    await this.backend.stopWorkspaceSession(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async selectSession(input: TaskWorkspaceSessionInput): Promise<void> {
+    await this.backend.selectWorkspaceSession(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise<void> {
+    await this.backend.setWorkspaceSessionUnread(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async renameSession(input: TaskWorkspaceRenameSessionInput): Promise<void> {
+    await this.backend.renameWorkspaceSession(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async closeSession(input: TaskWorkspaceSessionInput): Promise<void> {
+    await this.backend.closeWorkspaceSession(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async addSession(input: TaskWorkspaceSelectInput): Promise<TaskWorkspaceAddSessionResponse> {
+    const created = await this.backend.createWorkspaceSession(this.organizationId, input);
+    await this.refresh();
+    return created;
+  }
+
+  async changeModel(input: TaskWorkspaceChangeModelInput): Promise<void> {
+    await this.backend.changeWorkspaceModel(this.organizationId, input);
+    await this.refresh();
+  }
+
+  async changeOwner(input: TaskWorkspaceChangeOwnerInput): Promise<void> {
+    await this.backend.changeWorkspaceTaskOwner(this.organizationId, input);
+    await this.refresh();
+  }
+
+  private ensureStarted(): void {
+    if (!this.unsubscribeWorkspace) {
+      this.unsubscribeWorkspace = this.backend.subscribeWorkspace(this.organizationId, () => {
+        void this.refresh().catch(() => {
+          this.scheduleRefreshRetry();
+        });
+      });
+    }
+    void this.refresh().catch(() => {
+      this.scheduleRefreshRetry();
+    });
+  }
+
+  private scheduleRefreshRetry(): void {
+    if (this.refreshRetryTimeout || this.listeners.size === 0) {
+      return;
+    }
+
+    this.refreshRetryTimeout = setTimeout(() => {
+      this.refreshRetryTimeout = null;
+      void this.refresh().catch(() => {
+        this.scheduleRefreshRetry();
+      });
+    }, 1_000);
+  }
+
+  private async refresh(): Promise<void> {
+    if (this.refreshPromise) {
+      await this.refreshPromise;
+      return;
+    }
+
+    this.refreshPromise = (async () => {
+      const nextSnapshot = await this.backend.getWorkspace(this.organizationId);
+      if (this.refreshRetryTimeout) {
+        clearTimeout(this.refreshRetryTimeout);
+        this.refreshRetryTimeout = null;
+      }
+      this.snapshot = {
+        ...nextSnapshot,
+        repositories: nextSnapshot.repositories ?? groupWorkspaceRepositories(nextSnapshot.repos, nextSnapshot.tasks),
+      };
+      for (const listener of [...this.listeners]) {
+        listener();
+      }
+    })().finally(() => {
+      this.refreshPromise = null;
+    });
+
+    await this.refreshPromise;
+  }
+}
+
+export function createRemoteWorkspaceClient(options: RemoteWorkspaceClientOptions): TaskWorkspaceClient {
+  return new RemoteWorkspaceStore(options);
+}
diff --git a/foundry/packages/client/src/subscription/remote-manager.ts b/foundry/packages/client/src/subscription/remote-manager.ts
index 8cb2864..ae774c6 100644
--- a/foundry/packages/client/src/subscription/remote-manager.ts
+++ b/foundry/packages/client/src/subscription/remote-manager.ts
@@ -4,6 +4,11 @@ import { topicDefinitions, type TopicData, type TopicDefinition, type TopicKey,
 
 const GRACE_PERIOD_MS = 30_000;
 
+/** Initial retry delay in ms. */
+const RETRY_BASE_MS = 1_000;
+/** Maximum retry delay in ms. */
+const RETRY_MAX_MS = 30_000;
+
 /**
  * Remote implementation of SubscriptionManager.
  * Each cache entry owns one actor connection plus one materialized snapshot.
@@ -80,8 +85,12 @@ class TopicEntry {
   private unsubscribeEvent: (() => void) | null = null;
   private unsubscribeError: (() => void) | null = null;
   private teardownTimer: ReturnType<typeof setTimeout> | null = null;
+  private retryTimer: ReturnType<typeof setTimeout> | null = null;
+  private retryAttempt = 0;
   private startPromise: Promise<void> | null = null;
+  private eventPromise: Promise<void> = Promise.resolve();
   private started = false;
+  private disposed = false;
 
   constructor(
     private readonly topicKey: TopicKey,
@@ -135,7 +144,9 @@
   }
 
   dispose(): void {
+    this.disposed = true;
     this.cancelTeardown();
+    this.cancelRetry();
     this.unsubscribeEvent?.();
     this.unsubscribeError?.();
     if (this.conn) {
@@ -147,6 +158,55 @@
     this.error = null;
     this.lastRefreshAt = null;
     this.started = false;
+    this.retryAttempt = 0;
+  }
+
+  private cancelRetry(): void {
+    if (this.retryTimer) {
+      clearTimeout(this.retryTimer);
+      this.retryTimer = null;
+    }
+  }
+
+  /**
+   * Schedules a retry with exponential backoff. Cleans up any existing
+   * connection state before reconnecting.
+   */
+  private scheduleRetry(): void {
+    if (this.disposed || this.listenerCount === 0) {
+      return;
+    }
+
+    const delay = Math.min(RETRY_BASE_MS * 2 ** this.retryAttempt, RETRY_MAX_MS);
+    this.retryAttempt++;
+
+    this.retryTimer = setTimeout(() => {
+      this.retryTimer = null;
+      if (this.disposed || this.listenerCount === 0) {
+        return;
+      }
+
+      // Tear down the old connection before retrying
+      this.cleanupConnection();
+      this.started = false;
+      this.startPromise = this.start().finally(() => {
+        this.startPromise = null;
+      });
+    }, delay);
+  }
+
+  /**
+   * Cleans up connection resources without resetting data/status/retry state.
+   */
+  private cleanupConnection(): void {
+    this.unsubscribeEvent?.();
+    this.unsubscribeError?.();
+    this.unsubscribeEvent = null;
+    this.unsubscribeError = null;
+    if (this.conn) {
+      void this.conn.dispose();
+    }
+    this.conn = null;
   }
 
   private async start(): Promise<void> {
@@ -157,31 +217,56 @@
     try {
       this.conn = await this.definition.connect(this.backend, this.params);
       this.unsubscribeEvent = this.conn.on(this.definition.event, (event: TEvent) => {
-        if (this.data === undefined) {
-          return;
-        }
-        this.data = this.definition.applyEvent(this.data, event);
-        this.lastRefreshAt = Date.now();
-        this.notify();
+        void this.applyEvent(event);
       });
       this.unsubscribeError = this.conn.onError((error: unknown) => {
         this.status = "error";
         this.error = error instanceof Error ? error : new Error(String(error));
         this.notify();
+        this.scheduleRetry();
       });
       this.data = await this.definition.fetchInitial(this.backend, this.params);
       this.status = "connected";
       this.lastRefreshAt = Date.now();
       this.started = true;
+      this.retryAttempt = 0;
       this.notify();
     } catch (error) {
       this.status = "error";
       this.error = error instanceof Error ? error : new Error(String(error));
       this.started = false;
       this.notify();
+      this.scheduleRetry();
     }
   }
 
+  private applyEvent(event: TEvent): Promise<void> {
+    this.eventPromise = this.eventPromise
+      .then(async () => {
+        if (!this.started || this.data === undefined) {
+          return;
+        }
+
+        const nextData = await this.definition.applyEvent(this.backend, this.params, this.data, event);
+        if (!this.started) {
+          return;
+        }
+
+        this.data = nextData;
+        this.status = "connected";
+        this.error = null;
+        this.lastRefreshAt = Date.now();
+        this.notify();
+      })
+      .catch((error) => {
+        this.status = "error";
+        this.error = error instanceof Error ? error : new Error(String(error));
+        this.notify();
+      });
+
+    return this.eventPromise;
+  }
+
   private notify(): void {
     for (const listener of [...this.listeners]) {
       listener();
diff --git a/foundry/packages/client/src/subscription/topics.ts b/foundry/packages/client/src/subscription/topics.ts
index f6a0acc..bbda118 100644
--- a/foundry/packages/client/src/subscription/topics.ts
+++ b/foundry/packages/client/src/subscription/topics.ts
@@ -5,8 +5,8 @@ import type {
   SandboxProcessesEvent,
   SessionEvent,
   TaskEvent,
-  WorkbenchSessionDetail,
-  WorkbenchTaskDetail,
+  WorkspaceSessionDetail,
+  WorkspaceTaskDetail,
   OrganizationEvent,
   OrganizationSummarySnapshot,
 } from "@sandbox-agent/foundry-shared";
@@ -16,15 +16,15 @@ import type { ActorConn, BackendClient, SandboxProcessRecord } from "../backend-
  * Topic definitions for the subscription manager.
  *
  * Each topic describes one actor connection plus one materialized read model.
- * Events always carry full replacement payloads for the changed entity so the
- * client can replace cached state directly instead of reconstructing patches.
+ * Some topics can apply broadcast payloads directly, while others refetch
+ * through BackendClient so auth-scoped state stays user-specific.
  */
 export interface TopicDefinition<TParams, TData, TEvent> {
   key: (params: TParams) => string;
   event: string;
   connect: (backend: BackendClient, params: TParams) => Promise<ActorConn>;
   fetchInitial: (backend: BackendClient, params: TParams) => Promise<TData>;
-  applyEvent: (current: TData, event: TEvent) => TData;
+  applyEvent: (backend: BackendClient, params: TParams, current: TData, event: TEvent) => Promise<TData> | TData;
 }
 
 export interface AppTopicParams {}
@@ -48,23 +48,13 @@ export interface SandboxProcessesTopicParams {
   sandboxId: string;
 }
 
-function upsertById<T extends { id: string }>(items: T[], nextItem: T, sort: (left: T, right: T) => number): T[] {
-  const filtered = items.filter((item) => item.id !== nextItem.id);
-  return [...filtered, nextItem].sort(sort);
-}
-
-function upsertByPrId<T extends { prId: string }>(items: T[], nextItem: T, sort: (left: T, right: T) => number): T[] {
-  const filtered = items.filter((item) => item.prId !== nextItem.prId);
-  return [...filtered, nextItem].sort(sort);
-}
-
 export const topicDefinitions = {
   app: {
     key: () => "app",
     event: "appUpdated",
     connect: (backend: BackendClient, _params: AppTopicParams) => backend.connectOrganization("app"),
     fetchInitial: (backend: BackendClient, _params: AppTopicParams) => backend.getAppSnapshot(),
-    applyEvent: (_current: FoundryAppSnapshot, event: AppEvent) => event.snapshot,
+    applyEvent: (_backend: BackendClient, _params: AppTopicParams, _current: FoundryAppSnapshot, event: AppEvent) => event.snapshot,
   } satisfies TopicDefinition<AppTopicParams, FoundryAppSnapshot, AppEvent>,
 
   organization: {
@@ -72,41 +62,8 @@
     event: "organizationUpdated",
     connect: (backend: BackendClient, params: OrganizationTopicParams) => backend.connectOrganization(params.organizationId),
     fetchInitial: (backend: BackendClient, params: OrganizationTopicParams) => backend.getOrganizationSummary(params.organizationId),
-    applyEvent: (current: OrganizationSummarySnapshot, event: OrganizationEvent) => {
-      switch (event.type) {
-        case "taskSummaryUpdated":
-          return {
-            ...current,
-            taskSummaries: upsertById(current.taskSummaries, event.taskSummary, (left, right) => right.updatedAtMs - left.updatedAtMs),
-          };
-        case "taskRemoved":
-          return {
-            ...current,
-            taskSummaries: current.taskSummaries.filter((task) => task.id !== event.taskId),
-          };
-        case "repoAdded":
-        case "repoUpdated":
-          return {
-            ...current,
-            repos: upsertById(current.repos, event.repo, (left, right) => right.latestActivityMs - left.latestActivityMs),
-          };
-        case "repoRemoved":
-          return {
-            ...current,
-            repos: current.repos.filter((repo) => repo.id !== event.repoId),
-          };
-        case "pullRequestUpdated":
-          return {
-            ...current,
-            openPullRequests: upsertByPrId(current.openPullRequests, event.pullRequest, (left, right) => right.updatedAtMs - left.updatedAtMs),
-          };
-        case "pullRequestRemoved":
-          return {
-            ...current,
-            openPullRequests: current.openPullRequests.filter((pullRequest) => pullRequest.prId !== event.prId),
-          };
-      }
-    },
+    applyEvent: (_backend: BackendClient, _params: OrganizationTopicParams, _current: OrganizationSummarySnapshot, event: OrganizationEvent) =>
+      event.snapshot,
   } satisfies TopicDefinition<OrganizationTopicParams, OrganizationSummarySnapshot, OrganizationEvent>,
 
   task: {
@@ -114,8 +71,9 @@
     event: "taskUpdated",
     connect: (backend: BackendClient, params: TaskTopicParams) => backend.connectTask(params.organizationId, params.repoId, params.taskId),
     fetchInitial: (backend: BackendClient, params: TaskTopicParams) => backend.getTaskDetail(params.organizationId, params.repoId, params.taskId),
-    applyEvent: (_current: WorkbenchTaskDetail, event: TaskEvent) => event.detail,
-  } satisfies TopicDefinition<TaskTopicParams, WorkbenchTaskDetail, TaskEvent>,
+    applyEvent: (backend: BackendClient, params: TaskTopicParams, _current: WorkspaceTaskDetail, _event: TaskEvent) =>
+      backend.getTaskDetail(params.organizationId, params.repoId, params.taskId),
+  } satisfies TopicDefinition<TaskTopicParams, WorkspaceTaskDetail, TaskEvent>,
 
   session: {
     key: (params: SessionTopicParams) => `session:${params.organizationId}:${params.taskId}:${params.sessionId}`,
@@ -123,13 +81,13 @@
     connect: (backend: BackendClient, params: SessionTopicParams) => backend.connectTask(params.organizationId, params.repoId, params.taskId),
     fetchInitial: (backend: BackendClient, params: SessionTopicParams) =>
       backend.getSessionDetail(params.organizationId, params.repoId, params.taskId, params.sessionId),
-    applyEvent: (current: WorkbenchSessionDetail, event: SessionEvent) => {
-      if (event.session.sessionId !== current.sessionId) {
+    applyEvent: async (backend: BackendClient, params: SessionTopicParams, current: WorkspaceSessionDetail, event: SessionEvent) => {
+      if (event.session.sessionId !== params.sessionId) {
         return current;
       }
-      return event.session;
+      return await backend.getSessionDetail(params.organizationId, params.repoId, params.taskId, params.sessionId);
     },
-  } satisfies TopicDefinition<SessionTopicParams, WorkbenchSessionDetail, SessionEvent>,
+  } satisfies TopicDefinition<SessionTopicParams, WorkspaceSessionDetail, SessionEvent>,
 
   sandboxProcesses: {
     key: (params: SandboxProcessesTopicParams) => `sandbox:${params.organizationId}:${params.sandboxProviderId}:${params.sandboxId}`,
@@ -138,7 +96,8 @@
       backend.connectSandbox(params.organizationId, params.sandboxProviderId, params.sandboxId),
     fetchInitial: async (backend: BackendClient, params: SandboxProcessesTopicParams) =>
      (await backend.listSandboxProcesses(params.organizationId, params.sandboxProviderId, params.sandboxId)).processes,
-    applyEvent: (_current: SandboxProcessRecord[], event: SandboxProcessesEvent) => event.processes,
+    applyEvent: (_backend: BackendClient, _params: SandboxProcessesTopicParams, _current: SandboxProcessRecord[], event: SandboxProcessesEvent) =>
+      event.processes,
   } satisfies TopicDefinition<SandboxProcessesTopicParams, SandboxProcessRecord[], SandboxProcessesEvent>,
 } as const;
diff --git a/foundry/packages/client/src/view-model.ts b/foundry/packages/client/src/view-model.ts
index c30ff2a..bd7a98c 100644
--- a/foundry/packages/client/src/view-model.ts
+++ b/foundry/packages/client/src/view-model.ts
@@ -65,7 +65,7 @@ export function filterTasks(rows: TaskRecord[], query: string): TaskRecord[] {
   }
 
   return rows.filter((row) => {
-    const fields = [row.branchName ?? "", row.title ?? "", row.taskId, row.task, row.prAuthor ?? "", row.reviewer ?? ""];
+    const fields = [row.branchName ?? "", row.title ?? "", row.taskId, row.task];
     return fields.some((field) => fuzzyMatch(field, q));
   });
 }
diff --git a/foundry/packages/client/src/workbench-client.ts b/foundry/packages/client/src/workbench-client.ts
deleted file mode 100644
index c317649..0000000
--- a/foundry/packages/client/src/workbench-client.ts
+++ /dev/null
@@ -1,64 +0,0 @@
-import type {
-  TaskWorkbenchAddSessionResponse,
-  TaskWorkbenchChangeModelInput,
-  TaskWorkbenchCreateTaskInput,
-  TaskWorkbenchCreateTaskResponse,
-  TaskWorkbenchDiffInput,
-  TaskWorkbenchRenameInput,
-  TaskWorkbenchRenameSessionInput,
-  TaskWorkbenchSelectInput,
-  TaskWorkbenchSetSessionUnreadInput,
-  TaskWorkbenchSendMessageInput,
-  TaskWorkbenchSnapshot,
-  TaskWorkbenchSessionInput,
-  TaskWorkbenchUpdateDraftInput,
-} from "@sandbox-agent/foundry-shared";
-import type { BackendClient } from "./backend-client.js";
-import { getSharedMockWorkbenchClient } from "./mock/workbench-client.js";
-import { createRemoteWorkbenchClient } from "./remote/workbench-client.js";
-
-export type TaskWorkbenchClientMode = "mock" | "remote";
-
-export interface CreateTaskWorkbenchClientOptions {
-  mode: TaskWorkbenchClientMode;
-  backend?: BackendClient;
-  organizationId?: string;
-}
-
-export interface TaskWorkbenchClient {
-  getSnapshot(): TaskWorkbenchSnapshot;
-  subscribe(listener: () => void): () => void;
-  createTask(input: TaskWorkbenchCreateTaskInput): Promise<TaskWorkbenchCreateTaskResponse>;
-  markTaskUnread(input: TaskWorkbenchSelectInput): Promise<void>;
-  renameTask(input: TaskWorkbenchRenameInput): Promise<void>;
-  renameBranch(input: TaskWorkbenchRenameInput): Promise<void>;
-  archiveTask(input: TaskWorkbenchSelectInput): Promise<void>;
-  publishPr(input: TaskWorkbenchSelectInput): Promise<void>;
-  revertFile(input: TaskWorkbenchDiffInput): Promise<void>;
-  updateDraft(input: TaskWorkbenchUpdateDraftInput): Promise<void>;
-  sendMessage(input: TaskWorkbenchSendMessageInput): Promise<void>;
-  stopAgent(input: TaskWorkbenchSessionInput): Promise<void>;
-  setSessionUnread(input: TaskWorkbenchSetSessionUnreadInput): Promise<void>;
-  renameSession(input: TaskWorkbenchRenameSessionInput): Promise<void>;
-  closeSession(input: TaskWorkbenchSessionInput): Promise<void>;
-  addSession(input: TaskWorkbenchSelectInput): Promise<TaskWorkbenchAddSessionResponse>;
-  changeModel(input: TaskWorkbenchChangeModelInput): Promise<void>;
-}
-
-export function createTaskWorkbenchClient(options: CreateTaskWorkbenchClientOptions): TaskWorkbenchClient {
-  if (options.mode === "mock") {
-    return getSharedMockWorkbenchClient();
-  }
-
-  if (!options.backend) {
-    throw new Error("Remote task workbench client requires a backend client");
-  }
-  if (!options.organizationId) {
-    throw new Error("Remote task workbench client requires a organization id");
-  }
-
-  return createRemoteWorkbenchClient({
-    backend: options.backend,
-    organizationId: options.organizationId,
-  });
-}
diff --git a/foundry/packages/client/src/workspace-client.ts b/foundry/packages/client/src/workspace-client.ts
new file mode 100644
index 0000000..6662352
--- /dev/null
+++ b/foundry/packages/client/src/workspace-client.ts
@@ -0,0 +1,66 @@
+import type {
+  TaskWorkspaceAddSessionResponse,
+  TaskWorkspaceChangeModelInput,
+  TaskWorkspaceChangeOwnerInput,
+  TaskWorkspaceCreateTaskInput,
+  TaskWorkspaceCreateTaskResponse,
+  TaskWorkspaceDiffInput,
+  TaskWorkspaceRenameInput,
+  TaskWorkspaceRenameSessionInput,
+  TaskWorkspaceSelectInput,
+  TaskWorkspaceSetSessionUnreadInput,
+  TaskWorkspaceSendMessageInput,
+  TaskWorkspaceSnapshot,
+  TaskWorkspaceSessionInput,
+  TaskWorkspaceUpdateDraftInput,
+} from "@sandbox-agent/foundry-shared";
+import type { BackendClient } from "./backend-client.js";
+import { getSharedMockWorkspaceClient } from "./mock/workspace-client.js";
+import { createRemoteWorkspaceClient } from "./remote/workspace-client.js";
+
+export type TaskWorkspaceClientMode = "mock" | "remote";
+
+export interface CreateTaskWorkspaceClientOptions {
+  mode: TaskWorkspaceClientMode;
+  backend?: BackendClient;
+  organizationId?: string;
+}
+
+export interface TaskWorkspaceClient {
+  getSnapshot(): TaskWorkspaceSnapshot;
+  subscribe(listener: () => void): () => void;
+  createTask(input: TaskWorkspaceCreateTaskInput): Promise<TaskWorkspaceCreateTaskResponse>;
+  markTaskUnread(input: TaskWorkspaceSelectInput): Promise<void>;
+  renameTask(input: TaskWorkspaceRenameInput): Promise<void>;
+  archiveTask(input: TaskWorkspaceSelectInput): Promise<void>;
+  publishPr(input: TaskWorkspaceSelectInput): Promise<void>;
+  revertFile(input: TaskWorkspaceDiffInput): Promise<void>;
+  updateDraft(input: TaskWorkspaceUpdateDraftInput): Promise<void>;
+  sendMessage(input: TaskWorkspaceSendMessageInput): Promise<void>;
+  stopAgent(input: TaskWorkspaceSessionInput): Promise<void>;
+  selectSession(input: TaskWorkspaceSessionInput): Promise<void>;
+  setSessionUnread(input: TaskWorkspaceSetSessionUnreadInput): Promise<void>;
+  renameSession(input: TaskWorkspaceRenameSessionInput): Promise<void>;
+  closeSession(input: TaskWorkspaceSessionInput): Promise<void>;
+  addSession(input: TaskWorkspaceSelectInput): Promise<TaskWorkspaceAddSessionResponse>;
+  changeModel(input: TaskWorkspaceChangeModelInput): Promise<void>;
+  changeOwner(input: TaskWorkspaceChangeOwnerInput): Promise<void>;
+}
+
+export function createTaskWorkspaceClient(options: CreateTaskWorkspaceClientOptions): TaskWorkspaceClient {
+  if (options.mode === "mock") {
+    return getSharedMockWorkspaceClient();
+  }
+
+  if (!options.backend) {
+    throw new Error("Remote task workspace client requires a backend client");
+  }
+  if (!options.organizationId) {
+    throw new Error("Remote task workspace client requires a organization id");
+  }
+
+  return createRemoteWorkspaceClient({
+    backend: options.backend,
+    organizationId: options.organizationId,
+  });
+}
diff --git a/foundry/packages/client/src/workbench-model.ts b/foundry/packages/client/src/workspace-model.ts
similarity index 90%
rename from foundry/packages/client/src/workbench-model.ts
rename to foundry/packages/client/src/workspace-model.ts
index afe9e8b..290794b 100644
--- a/foundry/packages/client/src/workbench-model.ts
+++ b/foundry/packages/client/src/workspace-model.ts
@@ -1,40 +1,28 @@
+import {
+  DEFAULT_WORKSPACE_MODEL_ID,
+  DEFAULT_WORKSPACE_MODEL_GROUPS as SharedModelGroups,
+  workspaceModelLabel as sharedWorkspaceModelLabel,
+  workspaceProviderAgent as sharedWorkspaceProviderAgent,
+} from "@sandbox-agent/foundry-shared";
 import type {
-  WorkbenchAgentKind as AgentKind,
-  WorkbenchSession as AgentSession,
-  WorkbenchDiffLineKind as DiffLineKind,
-  WorkbenchFileTreeNode as FileTreeNode,
-  WorkbenchTask as Task,
-  TaskWorkbenchSnapshot,
-  WorkbenchHistoryEvent as HistoryEvent,
-  WorkbenchModelGroup as ModelGroup,
-  WorkbenchModelId as ModelId,
-  WorkbenchParsedDiffLine as ParsedDiffLine,
-  WorkbenchRepositorySection,
-  WorkbenchRepo,
-  WorkbenchTranscriptEvent as TranscriptEvent,
+  WorkspaceAgentKind as AgentKind,
+  WorkspaceSession as AgentSession,
+  WorkspaceDiffLineKind as DiffLineKind,
+  WorkspaceFileTreeNode as FileTreeNode,
+  WorkspaceTask as Task,
+  TaskWorkspaceSnapshot,
+  WorkspaceHistoryEvent as HistoryEvent,
+  WorkspaceModelGroup as ModelGroup,
+  WorkspaceModelId as ModelId,
+  WorkspaceParsedDiffLine as ParsedDiffLine,
+  WorkspaceRepositorySection,
+  WorkspaceRepo,
+  WorkspaceTranscriptEvent as TranscriptEvent,
 } from "@sandbox-agent/foundry-shared";
 import rivetDevFixture from "../../../scripts/data/rivet-dev.json" with { type: "json" };
 
-export const MODEL_GROUPS: ModelGroup[] = [
-  {
-    provider: "Claude",
-    models: [
-      { id: "claude-sonnet-4", label: "Sonnet 4" },
-      { id: "claude-opus-4", label: "Opus 4" },
-    ],
-  },
-  {
-    provider: "OpenAI",
-    models: [
-      { id: "gpt-5.3-codex", label: "GPT-5.3 Codex" },
-      { id: "gpt-5.4", label: "GPT-5.4" },
-      { id: "gpt-5.2-codex", label: "GPT-5.2 Codex" },
-      { id: "gpt-5.1-codex-max", label: "GPT-5.1 Codex Max" },
-      { id: "gpt-5.2", label: "GPT-5.2" },
-      { id: "gpt-5.1-codex-mini", label: "GPT-5.1 Codex Mini" },
-    ],
-  },
-];
+export const MODEL_GROUPS: ModelGroup[] = SharedModelGroups;
+export const DEFAULT_MODEL_ID: ModelId = DEFAULT_WORKSPACE_MODEL_ID;
 
 const MOCK_REPLIES = [
   "Got it. I'll work on that now. Let me start by examining the relevant files...",
@@ -73,15 +61,11 @@ export function formatMessageDuration(durationMs: number): string {
 }
 
 export function modelLabel(id: ModelId): string {
-  const group = MODEL_GROUPS.find((candidate) => candidate.models.some((model) => model.id === id));
-  const model = group?.models.find((candidate) => candidate.id === id);
-  return model && group ? `${group.provider} ${model.label}` : id;
+  return sharedWorkspaceModelLabel(id, MODEL_GROUPS);
 }
 
 export function providerAgent(provider: string): AgentKind {
-  if (provider === "Claude") return "Claude";
-  if (provider === "OpenAI") return "Codex";
-  return "Cursor";
+  return sharedWorkspaceProviderAgent(provider);
 }
 
 export function slugify(text: string): string {
@@ -204,6 +188,29 @@ export function buildHistoryEvents(sessions: AgentSession[]): HistoryEvent[] {
     .sort((left, right) => messageOrder(left.messageId) - messageOrder(right.messageId));
 }
 
+function buildPullRequestSummary(params: {
+  number: number;
+  title: string;
+  branch: string;
+  repoName: string;
+  updatedAtMs: number;
+  status: "ready" | "draft";
+}) {
+  return {
+    number: params.number,
+    status: params.status,
+    title: params.title,
+    state: "open",
+    url: `https://github.com/${params.repoName}/pull/${params.number}`,
+    headRefName: params.branch,
+    baseRefName: "main",
+    repoFullName: params.repoName,
+    authorLogin: "mock",
+    isDraft: params.status === "draft",
+    updatedAtMs: params.updatedAtMs,
+  };
+}
+
 function transcriptFromLegacyMessages(sessionId: string, messages: LegacyMessage[]): TranscriptEvent[] {
   return messages.map((message, index) => ({
     id: message.id,
@@ -315,14 +322,21 @@ export function buildInitialTasks(): Task[] {
       repoName: "rivet-dev/sandbox-agent",
       updatedAtMs: minutesAgo(8),
       branch: "NathanFlurry/pi-bootstrap-fix",
-      pullRequest: { number: 227, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 227,
+        title: "Normalize Pi ACP bootstrap payloads",
+        branch: "NathanFlurry/pi-bootstrap-fix",
+        repoName: "rivet-dev/sandbox-agent",
+        updatedAtMs: minutesAgo(8),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t1",
          sessionId: "t1",
          sessionName: "Pi payload fix",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -484,14 +498,21 @@
       repoName: "rivet-dev/sandbox-agent",
       updatedAtMs: minutesAgo(3),
       branch: "feat/builtin-agent-skills",
-      pullRequest: { number: 223, status: "draft" },
+      pullRequest: buildPullRequestSummary({
+        number: 223,
+        title: "Auto-inject builtin agent skills at startup",
+        branch: "feat/builtin-agent-skills",
+        repoName: "rivet-dev/sandbox-agent",
+        updatedAtMs: minutesAgo(3),
+        status: "draft",
+      }),
       sessions: [
         {
          id: "t3",
          sessionId: "t3",
          sessionName: "Skills injection",
          agent: "Claude",
-         model: "claude-opus-4",
+         model: "opus",
          status: "running",
          thinkingSinceMs: NOW_MS - 45_000,
          unread: false,
@@ -584,14 +605,21 @@
       repoName: "rivet-dev/sandbox-agent",
       updatedAtMs: minutesAgo(45),
       branch: "hooks-example",
-      pullRequest: { number: 225, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 225,
+        title: "Add hooks example for Claude, Codex, and OpenCode",
+        branch: "hooks-example",
+        repoName: "rivet-dev/sandbox-agent",
+        updatedAtMs: minutesAgo(45),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t4",
          sessionId: "t4",
          sessionName: "Example docs",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -659,14 +687,21 @@
       repoName: "rivet-dev/rivet",
       updatedAtMs: minutesAgo(15),
       branch: "actor-reschedule-endpoint",
-      pullRequest: { number: 4400, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 4400,
+        title: "Add actor reschedule endpoint",
+        branch: "actor-reschedule-endpoint",
+        repoName: "rivet-dev/rivet",
+        updatedAtMs: minutesAgo(15),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t5",
          sessionId: "t5",
          sessionName: "Reschedule API",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -793,14 +828,21 @@
       repoName: "rivet-dev/rivet",
       updatedAtMs: minutesAgo(35),
       branch: "feat/dynamic-actors",
-      pullRequest: { number: 4395, status: "draft" },
+      pullRequest: buildPullRequestSummary({
+        number: 4395,
+        title: "Dynamic actors",
+        branch: "feat/dynamic-actors",
+        repoName: "rivet-dev/rivet",
+        updatedAtMs: minutesAgo(35),
+        status: "draft",
+      }),
       sessions: [
         {
          id: "t6",
          sessionId: "t6",
          sessionName: "Dynamic actors impl",
          agent: "Claude",
-         model: "claude-opus-4",
+         model: "opus",
          status: "idle",
          thinkingSinceMs: null,
          unread: true,
@@ -850,14 +892,21 @@
       repoName: "rivet-dev/vbare",
       updatedAtMs: minutesAgo(25),
       branch: "fix-use-full-cloud-run-pool-name",
-      pullRequest: { number: 235, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 235,
+        title: "Use full cloud run pool name for routing",
+        branch: "fix-use-full-cloud-run-pool-name",
+        repoName: "rivet-dev/vbare",
+        updatedAtMs: minutesAgo(25),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t7",
          sessionId: "t7",
          sessionName: "Pool routing fix",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -959,14 +1008,21 @@
       repoName: "rivet-dev/skills",
       updatedAtMs: minutesAgo(50),
       branch: "fix-guard-support-https-targets",
-      pullRequest: { number: 125, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 125,
+        title: "Route compute gateway path correctly",
+        branch: "fix-guard-support-https-targets",
+        repoName: "rivet-dev/skills",
+        updatedAtMs: minutesAgo(50),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t8",
          sessionId: "t8",
          sessionName: "Guard routing",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -1073,14 +1129,21 @@
       repoName: "rivet-dev/skills",
       updatedAtMs: minutesAgo(2 * 24 * 60),
       branch: "chore-move-compute-gateway-to",
-      pullRequest: { number: 123, status: "ready" },
+      pullRequest: buildPullRequestSummary({
+        number: 123,
+        title: "Move compute gateway to guard",
+        branch: "chore-move-compute-gateway-to",
+        repoName: "rivet-dev/skills",
+        updatedAtMs: minutesAgo(2 * 24 * 60),
+        status: "ready",
+      }),
       sessions: [
         {
          id: "t9",
          sessionId: "t9",
          sessionName: "Gateway migration",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "idle",
          thinkingSinceMs: null,
          unread: false,
@@ -1166,8 +1229,6 @@
       repoId: "sandbox-agent",
       title: "Fix broken auth middleware (error demo)",
       status: "error",
-      runtimeStatus: "error",
-      statusMessage: "session:error",
       repoName: "rivet-dev/sandbox-agent",
       updatedAtMs: minutesAgo(2),
       branch: "fix/auth-middleware",
@@ -1178,7 +1239,7 @@
          sessionId: "status-error-session",
          sessionName: "Auth fix",
          agent: "Claude",
-         model: "claude-sonnet-4",
+         model: "sonnet",
          status: "error",
          thinkingSinceMs: null,
          unread: false,
@@ -1197,9 +1258,7 @@
       id: "status-provisioning",
       repoId: "sandbox-agent",
       title: "Add rate limiting to API gateway (provisioning demo)",
-      status: "new",
-      runtimeStatus: "init_enqueue_provision",
-      statusMessage: "Queueing sandbox provisioning.",
+      status: "init_enqueue_provision",
       repoName: "rivet-dev/sandbox-agent",
       updatedAtMs: minutesAgo(0),
       branch: null,
@@ -1211,7 +1270,7 @@ export function
buildInitialTasks(): Task[] { sandboxSessionId: null, sessionName: "Session 1", agent: "Claude", - model: "claude-sonnet-4", + model: "sonnet", status: "pending_provision", thinkingSinceMs: null, unread: false, @@ -1259,7 +1318,6 @@ export function buildInitialTasks(): Task[] { repoId: "sandbox-agent", title: "Refactor WebSocket handler (running demo)", status: "running", - runtimeStatus: "running", repoName: "rivet-dev/sandbox-agent", updatedAtMs: minutesAgo(1), branch: "refactor/ws-handler", @@ -1300,7 +1358,7 @@ export function buildInitialTasks(): Task[] { * Uses real public repos so the mock sidebar matches what an actual rivet-dev * organization would show after a GitHub sync. */ -function buildMockRepos(): WorkbenchRepo[] { +function buildMockRepos(): WorkspaceRepo[] { return rivetDevFixture.repos.map((r) => ({ id: repoIdFromFullName(r.fullName), label: r.fullName, @@ -1313,55 +1371,19 @@ function repoIdFromFullName(fullName: string): string { return parts[parts.length - 1] ?? fullName; } -/** - * Build task entries from open PR fixture data. - * Maps to the backend's PR sync behavior (RepositoryPrSyncActor) where PRs - * appear as first-class sidebar items even without an associated task. - * Each open PR gets a lightweight task entry so it shows in the sidebar. - */ -function buildPrTasks(): Task[] { - // Collect branch names already claimed by hand-written tasks so we don't duplicate - const existingBranches = new Set( - buildInitialTasks() - .map((t) => t.branch) - .filter(Boolean), - ); - - return rivetDevFixture.openPullRequests - .filter((pr) => !existingBranches.has(pr.headRefName)) - .map((pr) => { - const repoId = repoIdFromFullName(pr.repoFullName); - return { - id: `pr-${repoId}-${pr.number}`, - repoId, - title: pr.title, - status: "idle" as const, - repoName: pr.repoFullName, - updatedAtMs: new Date(pr.updatedAt).getTime(), - branch: pr.headRefName, - pullRequest: { number: pr.number, status: pr.draft ? 
("draft" as const) : ("ready" as const) }, - sessions: [], - fileChanges: [], - diffs: {}, - fileTree: [], - minutesUsed: 0, - }; - }); -} - -export function buildInitialMockLayoutViewModel(): TaskWorkbenchSnapshot { +export function buildInitialMockLayoutViewModel(): TaskWorkspaceSnapshot { const repos = buildMockRepos(); - const tasks = [...buildInitialTasks(), ...buildPrTasks()]; + const tasks = buildInitialTasks(); return { organizationId: "default", repos, - repositories: groupWorkbenchRepositories(repos, tasks), + repositories: groupWorkspaceRepositories(repos, tasks), tasks, }; } -export function groupWorkbenchRepositories(repos: WorkbenchRepo[], tasks: Task[]): WorkbenchRepositorySection[] { - const grouped = new Map(); +export function groupWorkspaceRepositories(repos: WorkspaceRepo[], tasks: Task[]): WorkspaceRepositorySection[] { + const grouped = new Map(); for (const repo of repos) { grouped.set(repo.id, { diff --git a/foundry/packages/client/test/e2e/full-integration-e2e.test.ts b/foundry/packages/client/test/e2e/full-integration-e2e.test.ts index 8446892..21eaf6b 100644 --- a/foundry/packages/client/test/e2e/full-integration-e2e.test.ts +++ b/foundry/packages/client/test/e2e/full-integration-e2e.test.ts @@ -1,6 +1,6 @@ import { randomUUID } from "node:crypto"; import { describe, expect, it } from "vitest"; -import type { HistoryEvent, RepoOverview } from "@sandbox-agent/foundry-shared"; +import type { AuditLogEvent as HistoryEvent, RepoOverview } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; import { requireImportedRepo } from "./helpers.js"; @@ -132,11 +132,11 @@ describe("e2e(client): full integration stack workflow", () => { 90_000, 1_000, async () => client.getRepoOverview(organizationId, repo.repoId), - (value) => value.branches.some((row) => row.branchName === seededBranch), + (value) => value.branches.some((row: RepoOverview["branches"][number]) => row.branchName === seededBranch), ); 
const postActionOverview = await client.getRepoOverview(organizationId, repo.repoId); - const seededRow = postActionOverview.branches.find((row) => row.branchName === seededBranch); + const seededRow = postActionOverview.branches.find((row: RepoOverview["branches"][number]) => row.branchName === seededBranch); expect(Boolean(seededRow)).toBe(true); expect(postActionOverview.fetchedAt).toBeGreaterThanOrEqual(overview.fetchedAt); } finally { diff --git a/foundry/packages/client/test/e2e/github-pr-e2e.test.ts b/foundry/packages/client/test/e2e/github-pr-e2e.test.ts index 83101fb..89dd638 100644 --- a/foundry/packages/client/test/e2e/github-pr-e2e.test.ts +++ b/foundry/packages/client/test/e2e/github-pr-e2e.test.ts @@ -1,5 +1,5 @@ import { describe, expect, it } from "vitest"; -import type { TaskRecord, HistoryEvent } from "@sandbox-agent/foundry-shared"; +import type { AuditLogEvent as HistoryEvent, TaskRecord } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; import { requireImportedRepo } from "./helpers.js"; @@ -80,9 +80,10 @@ function parseHistoryPayload(event: HistoryEvent): Record<string, unknown> { } } -async function debugDump(client: ReturnType<typeof createBackendClient>, organizationId: string, taskId: string): Promise<string> { +async function debugDump(client: ReturnType<typeof createBackendClient>, organizationId: string, repoId: string, taskId: string): Promise<string> { try { - const task = await client.getTask(organizationId, taskId); + const task = await client.getTask(organizationId, repoId, taskId); + const detail = await client.getTaskDetail(organizationId, repoId, taskId).catch(() => null); const history = await client.listHistory({ organizationId, taskId, limit: 80 }).catch(() => []); const historySummary = history .slice(0, 20) @@ -90,10 +91,11 @@ async function debugDump(client: ReturnType, organiz .join("\n"); let sessionEventsSummary = ""; - if (task.activeSandboxId && task.activeSessionId) { + const activeSessionId = detail?.activeSessionId ?? 
null; + if (task.activeSandboxId && activeSessionId) { const events = await client .listSandboxSessionEvents(organizationId, task.sandboxProviderId, task.activeSandboxId, { - sessionId: task.activeSessionId, + sessionId: activeSessionId, limit: 50, }) .then((r) => r.items) @@ -109,13 +111,11 @@ async function debugDump(client: ReturnType, organiz JSON.stringify( { status: task.status, - statusMessage: task.statusMessage, title: task.title, branchName: task.branchName, activeSandboxId: task.activeSandboxId, - activeSessionId: task.activeSessionId, - prUrl: task.prUrl, - prSubmitted: task.prSubmitted, + activeSessionId, + pullRequestUrl: detail?.pullRequest?.url ?? null, }, null, 2, @@ -189,7 +189,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { // Cold local sandbox startup can exceed a few minutes on first run. 8 * 60_000, 1_000, - async () => client.getTask(organizationId, created.taskId), + async () => client.getTask(organizationId, repo.repoId, created.taskId), (h) => Boolean(h.title && h.branchName && h.activeSandboxId), (h) => { if (h.status !== lastStatus) { @@ -200,18 +200,18 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? 
err.message : String(err)}\n${dump}`); }); branchName = namedAndProvisioned.branchName!; sandboxId = namedAndProvisioned.activeSandboxId!; - const withSession = await poll( + const withSession = await poll<Awaited<ReturnType<typeof client.getTaskDetail>>>( "task to create active session", 3 * 60_000, 1_500, - async () => client.getTask(organizationId, created.taskId), + async () => client.getTaskDetail(organizationId, repo.repoId, created.taskId), (h) => Boolean(h.activeSessionId), (h) => { if (h.status === "error") { @@ -219,7 +219,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); @@ -231,14 +231,14 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { 2_000, async () => ( - await client.listSandboxSessionEvents(organizationId, withSession.sandboxProviderId, sandboxId!, { + await client.listSandboxSessionEvents(organizationId, namedAndProvisioned.sandboxProviderId, sandboxId!, { sessionId: sessionId!, limit: 40, }) ).items, (events) => events.length > 0, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? 
err.message : String(err)}\n${dump}`); }); @@ -246,7 +246,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { "task to reach idle state", 8 * 60_000, 2_000, - async () => client.getTask(organizationId, created.taskId), + async () => client.getTask(organizationId, repo.repoId, created.taskId), (h) => h.status === "idle", (h) => { if (h.status === "error") { @@ -254,7 +254,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { } }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); @@ -266,7 +266,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { (events) => events.some((e) => e.kind === "task.pr_created"), ) .catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }) .then((events) => events.find((e) => e.kind === "task.pr_created")!); @@ -287,16 +287,16 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { expect(prFiles.some((f) => f.filename === expectedFile)).toBe(true); // Close the task and assert the sandbox is released (stopped). 
- await client.runAction(organizationId, created.taskId, "archive"); + await client.runAction(organizationId, repo.repoId, created.taskId, "archive"); - await poll( + await poll<Awaited<ReturnType<typeof client.getTaskDetail>>>( "task to become archived (session released)", 60_000, 1_000, - async () => client.getTask(organizationId, created.taskId), + async () => client.getTaskDetail(organizationId, repo.repoId, created.taskId), (h) => h.status === "archived" && h.activeSessionId === null, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); throw new Error(`${err instanceof Error ? err.message : String(err)}\n${dump}`); }); @@ -311,7 +311,7 @@ describe("e2e: backend -> sandbox-agent -> git -> PR", () => { return st.includes("destroyed") || st.includes("stopped") || st.includes("suspended") || st.includes("paused"); }, ).catch(async (err) => { - const dump = await debugDump(client, organizationId, created.taskId); + const dump = await debugDump(client, organizationId, repo.repoId, created.taskId); const state = await client.sandboxProviderState(organizationId, "local", sandboxId!).catch(() => null); throw new Error(`${err instanceof Error ? err.message : String(err)}\n` + `sandbox state: ${state ? 
state.state : "unknown"}\n` + `${dump}`); }); diff --git a/foundry/packages/client/test/e2e/workbench-e2e.test.ts b/foundry/packages/client/test/e2e/workspace-e2e.test.ts similarity index 78% rename from foundry/packages/client/test/e2e/workbench-e2e.test.ts rename to foundry/packages/client/test/e2e/workspace-e2e.test.ts index 5442795..1de2065 100644 --- a/foundry/packages/client/test/e2e/workbench-e2e.test.ts +++ b/foundry/packages/client/test/e2e/workspace-e2e.test.ts @@ -1,5 +1,5 @@ import { describe, expect, it } from "vitest"; -import type { TaskWorkbenchSnapshot, WorkbenchSession, WorkbenchTask, WorkbenchModelId, WorkbenchTranscriptEvent } from "@sandbox-agent/foundry-shared"; +import type { TaskWorkspaceSnapshot, WorkspaceSession, WorkspaceTask, WorkspaceModelId, WorkspaceTranscriptEvent } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; import { requireImportedRepo } from "./helpers.js"; @@ -13,21 +13,9 @@ function requiredEnv(name: string): string { return value; } -function workbenchModelEnv(name: string, fallback: WorkbenchModelId): WorkbenchModelId { +function workspaceModelEnv(name: string, fallback: WorkspaceModelId): WorkspaceModelId { const value = process.env[name]?.trim(); - switch (value) { - case "claude-sonnet-4": - case "claude-opus-4": - case "gpt-5.3-codex": - case "gpt-5.4": - case "gpt-5.2-codex": - case "gpt-5.1-codex-max": - case "gpt-5.2": - case "gpt-5.1-codex-mini": - return value; - default: - return fallback; - } + return value && value.length > 0 ? 
value : fallback; } async function sleep(ms: number): Promise<void> { @@ -50,7 +38,7 @@ async function poll(label: string, timeoutMs: number, intervalMs: number, fn: } } -function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTask { +function findTask(snapshot: TaskWorkspaceSnapshot, taskId: string): WorkspaceTask { const task = snapshot.tasks.find((candidate) => candidate.id === taskId); if (!task) { throw new Error(`task ${taskId} missing from snapshot`); @@ -58,7 +46,7 @@ function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTas return task; } -function findTab(task: WorkbenchTask, sessionId: string): WorkbenchSession { +function findTab(task: WorkspaceTask, sessionId: string): WorkspaceSession { const tab = task.sessions.find((candidate) => candidate.id === sessionId); if (!tab) { throw new Error(`tab ${sessionId} missing from task ${task.id}`); @@ -66,7 +54,7 @@ function findTab(task: WorkbenchTask, sessionId: string): WorkbenchSession { return tab; } -function extractEventText(event: WorkbenchTranscriptEvent): string { +function extractEventText(event: WorkspaceTranscriptEvent): string { const payload = event.payload; if (!payload || typeof payload !== "object") { return String(payload ?? 
""); @@ -127,7 +115,7 @@ function extractEventText(event: WorkbenchTranscriptEvent): string { return JSON.stringify(payload); } -function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], expectedText: string): boolean { +function transcriptIncludesAgentText(transcript: WorkspaceTranscriptEvent[], expectedText: string): boolean { return transcript .filter((event) => event.sender === "agent") .map((event) => extractEventText(event)) @@ -135,15 +123,15 @@ function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], exp .includes(expectedText); } -describe("e2e(client): workbench flows", () => { +describe("e2e(client): workspace flows", () => { it.skipIf(!RUN_WORKBENCH_E2E)( - "creates a task from an imported repo, adds sessions, exchanges messages, and manages workbench state", + "creates a task from an imported repo, adds sessions, exchanges messages, and manages workspace state", { timeout: 20 * 60_000 }, async () => { const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; const organizationId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); - const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); + const model = workspaceModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); const runId = `wb-${Date.now().toString(36)}`; const expectedFile = `${runId}.txt`; const expectedInitialReply = `WORKBENCH_READY_${runId}`; @@ -155,9 +143,9 @@ describe("e2e(client): workbench flows", () => { }); const repo = await requireImportedRepo(client, organizationId, repoRemote); - const created = await client.createWorkbenchTask(organizationId, { + const created = await client.createWorkspaceTask(organizationId, { repoId: repo.repoId, - title: `Workbench E2E ${runId}`, + title: `Workspace E2E ${runId}`, branch: `e2e/${runId}`, model, task: `Reply with exactly: ${expectedInitialReply}`, @@ -167,7 +155,7 @@ describe("e2e(client): workbench flows", () => 
{ "task provisioning", 12 * 60_000, 2_000, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => task.branch === `e2e/${runId}` && task.sessions.length > 0, ); @@ -177,7 +165,7 @@ describe("e2e(client): workbench flows", () => { "initial agent response", 12 * 60_000, 2_000, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => { const tab = findTab(task, primaryTab.id); return task.status === "idle" && tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedInitialReply); @@ -187,28 +175,33 @@ describe("e2e(client): workbench flows", () => { expect(findTab(initialCompleted, primaryTab.id).sessionId).toBeTruthy(); expect(transcriptIncludesAgentText(findTab(initialCompleted, primaryTab.id).transcript, expectedInitialReply)).toBe(true); - await client.renameWorkbenchTask(organizationId, { + await client.renameWorkspaceTask(organizationId, { + repoId: repo.repoId, taskId: created.taskId, - value: `Workbench E2E ${runId} Renamed`, + value: `Workspace E2E ${runId} Renamed`, }); - await client.renameWorkbenchSession(organizationId, { + await client.renameWorkspaceSession(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: primaryTab.id, title: "Primary Session", }); - const secondTab = await client.createWorkbenchSession(organizationId, { + const secondTab = await client.createWorkspaceSession(organizationId, { + repoId: repo.repoId, taskId: created.taskId, model, }); - await client.renameWorkbenchSession(organizationId, { + await client.renameWorkspaceSession(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: secondTab.sessionId, title: "Follow-up Session", }); - await client.updateWorkbenchDraft(organizationId, { + await 
client.updateWorkspaceDraft(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: secondTab.sessionId, text: [ @@ -226,11 +219,12 @@ describe("e2e(client): workbench flows", () => { ], }); - const drafted = findTask(await client.getWorkbench(organizationId), created.taskId); + const drafted = findTask(await client.getWorkspace(organizationId), created.taskId); expect(findTab(drafted, secondTab.sessionId).draft.text).toContain(expectedReply); expect(findTab(drafted, secondTab.sessionId).draft.attachments).toHaveLength(1); - await client.sendWorkbenchMessage(organizationId, { + await client.sendWorkspaceMessage(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: secondTab.sessionId, text: [ @@ -252,7 +246,7 @@ describe("e2e(client): workbench flows", () => { "follow-up session response", 10 * 60_000, 2_000, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => { const tab = findTab(task, secondTab.sessionId); return ( @@ -265,17 +259,19 @@ describe("e2e(client): workbench flows", () => { expect(transcriptIncludesAgentText(secondTranscript, expectedReply)).toBe(true); expect(withSecondReply.fileChanges.some((file) => file.path === expectedFile)).toBe(true); - await client.setWorkbenchSessionUnread(organizationId, { + await client.setWorkspaceSessionUnread(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: secondTab.sessionId, unread: false, }); - await client.markWorkbenchUnread(organizationId, { taskId: created.taskId }); + await client.markWorkspaceUnread(organizationId, { repoId: repo.repoId, taskId: created.taskId }); - const unreadSnapshot = findTask(await client.getWorkbench(organizationId), created.taskId); + const unreadSnapshot = findTask(await client.getWorkspace(organizationId), created.taskId); expect(unreadSnapshot.sessions.some((tab) => tab.unread)).toBe(true); - 
await client.closeWorkbenchSession(organizationId, { + await client.closeWorkspaceSession(organizationId, { + repoId: repo.repoId, taskId: created.taskId, sessionId: secondTab.sessionId, }); @@ -284,26 +280,27 @@ describe("e2e(client): workbench flows", () => { "secondary session closed", 30_000, 1_000, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => !task.sessions.some((tab) => tab.id === secondTab.sessionId), ); expect(closedSnapshot.sessions).toHaveLength(1); - await client.revertWorkbenchFile(organizationId, { + await client.revertWorkspaceFile(organizationId, { + repoId: repo.repoId, taskId: created.taskId, path: expectedFile, }); const revertedSnapshot = await poll( - "file revert reflected in workbench", + "file revert reflected in workspace", 30_000, 1_000, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => !task.fileChanges.some((file) => file.path === expectedFile), ); expect(revertedSnapshot.fileChanges.some((file) => file.path === expectedFile)).toBe(false); - expect(revertedSnapshot.title).toBe(`Workbench E2E ${runId} Renamed`); + expect(revertedSnapshot.title).toBe(`Workspace E2E ${runId} Renamed`); expect(findTab(revertedSnapshot, primaryTab.id).sessionName).toBe("Primary Session"); }, ); diff --git a/foundry/packages/client/test/e2e/workbench-load-e2e.test.ts b/foundry/packages/client/test/e2e/workspace-load-e2e.test.ts similarity index 85% rename from foundry/packages/client/test/e2e/workbench-load-e2e.test.ts rename to foundry/packages/client/test/e2e/workspace-load-e2e.test.ts index b358b80..f9fc244 100644 --- a/foundry/packages/client/test/e2e/workbench-load-e2e.test.ts +++ b/foundry/packages/client/test/e2e/workspace-load-e2e.test.ts @@ -1,11 +1,11 @@ import { describe, expect, it } from 
"vitest"; import { createFoundryLogger, - type TaskWorkbenchSnapshot, - type WorkbenchSession, - type WorkbenchTask, - type WorkbenchModelId, - type WorkbenchTranscriptEvent, + type TaskWorkspaceSnapshot, + type WorkspaceSession, + type WorkspaceTask, + type WorkspaceModelId, + type WorkspaceTranscriptEvent, } from "@sandbox-agent/foundry-shared"; import { createBackendClient } from "../../src/backend-client.js"; import { requireImportedRepo } from "./helpers.js"; @@ -14,7 +14,7 @@ const RUN_WORKBENCH_LOAD_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E = const logger = createFoundryLogger({ service: "foundry-client-e2e", bindings: { - suite: "workbench-load", + suite: "workspace-load", }, }); @@ -26,21 +26,9 @@ function requiredEnv(name: string): string { return value; } -function workbenchModelEnv(name: string, fallback: WorkbenchModelId): WorkbenchModelId { +function workspaceModelEnv(name: string, fallback: WorkspaceModelId): WorkspaceModelId { const value = process.env[name]?.trim(); - switch (value) { - case "claude-sonnet-4": - case "claude-opus-4": - case "gpt-5.3-codex": - case "gpt-5.4": - case "gpt-5.2-codex": - case "gpt-5.1-codex-max": - case "gpt-5.2": - case "gpt-5.1-codex-mini": - return value; - default: - return fallback; - } + return value && value.length > 0 ? 
value : fallback; } function intEnv(name: string, fallback: number): number { @@ -72,7 +60,7 @@ async function poll(label: string, timeoutMs: number, intervalMs: number, fn: } } -function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTask { +function findTask(snapshot: TaskWorkspaceSnapshot, taskId: string): WorkspaceTask { const task = snapshot.tasks.find((candidate) => candidate.id === taskId); if (!task) { throw new Error(`task ${taskId} missing from snapshot`); @@ -80,7 +68,7 @@ function findTask(snapshot: TaskWorkbenchSnapshot, taskId: string): WorkbenchTas return task; } -function findTab(task: WorkbenchTask, sessionId: string): WorkbenchSession { +function findTab(task: WorkspaceTask, sessionId: string): WorkspaceSession { const tab = task.sessions.find((candidate) => candidate.id === sessionId); if (!tab) { throw new Error(`tab ${sessionId} missing from task ${task.id}`); @@ -88,7 +76,7 @@ function findTab(task: WorkbenchTask, sessionId: string): WorkbenchSession { return tab; } -function extractEventText(event: WorkbenchTranscriptEvent): string { +function extractEventText(event: WorkspaceTranscriptEvent): string { const payload = event.payload; if (!payload || typeof payload !== "object") { return String(payload ?? ""); @@ -138,7 +126,7 @@ function extractEventText(event: WorkbenchTranscriptEvent): string { return typeof envelope.method === "string" ? 
envelope.method : JSON.stringify(payload); } -function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], expectedText: string): boolean { +function transcriptIncludesAgentText(transcript: WorkspaceTranscriptEvent[], expectedText: string): boolean { return transcript .filter((event) => event.sender === "agent") .map((event) => extractEventText(event)) @@ -150,7 +138,7 @@ function average(values: number[]): number { return values.reduce((sum, value) => sum + value, 0) / Math.max(values.length, 1); } -async function measureWorkbenchSnapshot( +async function measureWorkspaceSnapshot( client: ReturnType<typeof createBackendClient>, organizationId: string, iterations: number, ): Promise<{ transcriptEventCount: number; }> { const durations: number[] = []; - let snapshot: TaskWorkbenchSnapshot | null = null; + let snapshot: TaskWorkspaceSnapshot | null = null; for (let index = 0; index < iterations; index += 1) { const startedAt = performance.now(); - snapshot = await client.getWorkbench(organizationId); + snapshot = await client.getWorkspace(organizationId); durations.push(performance.now() - startedAt); } @@ -191,12 +179,12 @@ }; } -describe("e2e(client): workbench load", () => { +describe("e2e(client): workspace load", () => { it.skipIf(!RUN_WORKBENCH_LOAD_E2E)("runs a simple sequential load profile against the real backend", { timeout: 30 * 60_000 }, async () => { const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/v1/rivet"; const organizationId = process.env.HF_E2E_WORKSPACE?.trim() || "default"; const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO"); - const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); + const model = workspaceModelEnv("HF_E2E_MODEL", "gpt-5.3-codex"); const taskCount = intEnv("HF_LOAD_TASK_COUNT", 3); const extraSessionCount = intEnv("HF_LOAD_EXTRA_SESSION_COUNT", 2); const pollIntervalMs = 
intEnv("HF_LOAD_POLL_INTERVAL_MS", 2_000); @@ -220,16 +208,16 @@ describe("e2e(client): workbench load", () => { transcriptEventCount: number; }> = []; - snapshotSeries.push(await measureWorkbenchSnapshot(client, organizationId, 2)); + snapshotSeries.push(await measureWorkspaceSnapshot(client, organizationId, 2)); for (let taskIndex = 0; taskIndex < taskCount; taskIndex += 1) { const runId = `load-${taskIndex}-${Date.now().toString(36)}`; const initialReply = `LOAD_INIT_${runId}`; const createStartedAt = performance.now(); - const created = await client.createWorkbenchTask(organizationId, { + const created = await client.createWorkspaceTask(organizationId, { repoId: repo.repoId, - title: `Workbench Load ${runId}`, + title: `Workspace Load ${runId}`, branch: `load/${runId}`, model, task: `Reply with exactly: ${initialReply}`, @@ -241,7 +229,7 @@ describe("e2e(client): workbench load", () => { `task ${runId} provisioning`, 12 * 60_000, pollIntervalMs, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => { const tab = task.sessions[0]; return Boolean(tab && task.status === "idle" && tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, initialReply)); @@ -256,13 +244,15 @@ describe("e2e(client): workbench load", () => { for (let sessionIndex = 0; sessionIndex < extraSessionCount; sessionIndex += 1) { const expectedReply = `LOAD_REPLY_${runId}_${sessionIndex}`; const createSessionStartedAt = performance.now(); - const createdSession = await client.createWorkbenchSession(organizationId, { + const createdSession = await client.createWorkspaceSession(organizationId, { + repoId: repo.repoId, taskId: created.taskId, model, }); createSessionLatencies.push(performance.now() - createSessionStartedAt); - await client.sendWorkbenchMessage(organizationId, { + await client.sendWorkspaceMessage(organizationId, { + repoId: repo.repoId, taskId: 
created.taskId, sessionId: createdSession.sessionId, text: `Run pwd in the repo, then reply with exactly: ${expectedReply}`, @@ -274,7 +264,7 @@ describe("e2e(client): workbench load", () => { `task ${runId} session ${sessionIndex} reply`, 10 * 60_000, pollIntervalMs, - async () => findTask(await client.getWorkbench(organizationId), created.taskId), + async () => findTask(await client.getWorkspace(organizationId), created.taskId), (task) => { const tab = findTab(task, createdSession.sessionId); return tab.status === "idle" && transcriptIncludesAgentText(tab.transcript, expectedReply); @@ -285,14 +275,14 @@ describe("e2e(client): workbench load", () => { expect(transcriptIncludesAgentText(findTab(withReply, createdSession.sessionId).transcript, expectedReply)).toBe(true); } - const snapshotMetrics = await measureWorkbenchSnapshot(client, organizationId, 3); + const snapshotMetrics = await measureWorkspaceSnapshot(client, organizationId, 3); snapshotSeries.push(snapshotMetrics); logger.info( { taskIndex: taskIndex + 1, ...snapshotMetrics, }, - "workbench_load_snapshot", + "workspace_load_snapshot", ); } @@ -314,7 +304,7 @@ describe("e2e(client): workbench load", () => { snapshotTranscriptFinalCount: lastSnapshot.transcriptEventCount, }; - logger.info(summary, "workbench_load_summary"); + logger.info(summary, "workspace_load_summary"); expect(createTaskLatencies.length).toBe(taskCount); expect(provisionLatencies.length).toBe(taskCount); diff --git a/foundry/packages/client/test/keys.test.ts b/foundry/packages/client/test/keys.test.ts index 9bd6477..6b93ec1 100644 --- a/foundry/packages/client/test/keys.test.ts +++ b/foundry/packages/client/test/keys.test.ts @@ -1,15 +1,9 @@ import { describe, expect, it } from "vitest"; -import { historyKey, organizationKey, repositoryKey, taskKey, taskSandboxKey } from "../src/keys.js"; +import { auditLogKey, organizationKey, taskKey, taskSandboxKey } from "../src/keys.js"; describe("actor keys", () => { it("prefixes every key with 
organization namespace", () => { - const keys = [ - organizationKey("default"), - repositoryKey("default", "repo"), - taskKey("default", "repo", "task"), - taskSandboxKey("default", "sbx"), - historyKey("default", "repo"), - ]; + const keys = [organizationKey("default"), taskKey("default", "repo", "task"), taskSandboxKey("default", "sbx"), auditLogKey("default")]; for (const key of keys) { expect(key[0]).toBe("org"); diff --git a/foundry/packages/client/test/subscription-manager.test.ts b/foundry/packages/client/test/subscription-manager.test.ts index 9908113..f0a29c2 100644 --- a/foundry/packages/client/test/subscription-manager.test.ts +++ b/foundry/packages/client/test/subscription-manager.test.ts @@ -50,6 +50,20 @@ class FakeActorConn implements ActorConn { function organizationSnapshot(): OrganizationSummarySnapshot { return { organizationId: "org-1", + github: { + connectedAccount: "octocat", + installationStatus: "connected", + syncStatus: "synced", + importedRepoCount: 1, + lastSyncLabel: "Synced just now", + lastSyncAt: 10, + lastWebhookAt: null, + lastWebhookEvent: "", + syncGeneration: 1, + syncPhase: null, + processedRepositoryCount: 1, + totalRepositoryCount: 1, + }, repos: [{ id: "repo-1", label: "repo-1", taskCount: 1, latestActivityMs: 10 }], taskSummaries: [ { @@ -61,10 +75,12 @@ function organizationSnapshot(): OrganizationSummarySnapshot { updatedAtMs: 10, branch: "main", pullRequest: null, + activeSessionId: null, sessionsSummary: [], + primaryUserLogin: null, + primaryUserAvatarUrl: null, }, ], - openPullRequests: [], }; } @@ -115,20 +131,46 @@ describe("RemoteSubscriptionManager", () => { ]); conn.emit("organizationUpdated", { - type: "taskSummaryUpdated", - taskSummary: { - id: "task-1", - repoId: "repo-1", - title: "Updated task", - status: "running", - repoName: "repo-1", - updatedAtMs: 20, - branch: "feature/live", - pullRequest: null, - sessionsSummary: [], + type: "organizationUpdated", + snapshot: { + organizationId: "org-1", + github: 
{ + connectedAccount: "octocat", + installationStatus: "connected", + syncStatus: "syncing", + importedRepoCount: 1, + lastSyncLabel: "Syncing repositories...", + lastSyncAt: 10, + lastWebhookAt: null, + lastWebhookEvent: "", + syncGeneration: 2, + syncPhase: "syncing_branches", + processedRepositoryCount: 1, + totalRepositoryCount: 3, + }, + repos: [], + taskSummaries: [ + { + id: "task-1", + repoId: "repo-1", + title: "Updated task", + status: "running", + repoName: "repo-1", + updatedAtMs: 20, + branch: "feature/live", + pullRequest: null, + activeSessionId: null, + sessionsSummary: [], + primaryUserLogin: null, + primaryUserAvatarUrl: null, + }, + ], }, } satisfies OrganizationEvent); + // applyEvent chains onto an internal promise — flush the microtask queue + await flushAsyncWork(); + expect(manager.getSnapshot("organization", params)?.taskSummaries[0]?.title).toBe("Updated task"); expect(listenerA).toHaveBeenCalled(); expect(listenerB).toHaveBeenCalled(); diff --git a/foundry/packages/client/test/view-model.test.ts b/foundry/packages/client/test/view-model.test.ts index b494135..d418c2f 100644 --- a/foundry/packages/client/test/view-model.test.ts +++ b/foundry/packages/client/test/view-model.test.ts @@ -12,9 +12,8 @@ const sample: TaskRecord = { task: "Do test", sandboxProviderId: "local", status: "running", - statusMessage: null, activeSandboxId: "sandbox-1", - activeSessionId: "session-1", + pullRequest: null, sandboxes: [ { sandboxId: "sandbox-1", @@ -26,17 +25,6 @@ const sample: TaskRecord = { updatedAt: 1, }, ], - agentType: null, - prSubmitted: false, - diffStat: null, - prUrl: null, - prAuthor: null, - ciStatus: null, - reviewStatus: null, - reviewer: null, - conflictsWithMain: null, - hasUnpushed: null, - parentBranch: null, createdAt: 1, updatedAt: 1, }; diff --git a/foundry/packages/client/tsconfig.build.json b/foundry/packages/client/tsconfig.build.json new file mode 100644 index 0000000..35bcdb2 --- /dev/null +++ 
b/foundry/packages/client/tsconfig.build.json @@ -0,0 +1,6 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "ignoreDeprecations": "6.0" + } +} diff --git a/foundry/packages/frontend/src/components/dev-panel.tsx b/foundry/packages/frontend/src/components/dev-panel.tsx index 56907ff..947331e 100644 --- a/foundry/packages/frontend/src/components/dev-panel.tsx +++ b/foundry/packages/frontend/src/components/dev-panel.tsx @@ -6,11 +6,10 @@ import { subscriptionManager } from "../lib/subscription"; import type { FoundryAppSnapshot, FoundryOrganization, - TaskStatus, - TaskWorkbenchSnapshot, - WorkbenchSandboxSummary, - WorkbenchSessionSummary, - WorkbenchTaskStatus, + TaskWorkspaceSnapshot, + WorkspaceSandboxSummary, + WorkspaceSessionSummary, + WorkspaceTaskStatus, } from "@sandbox-agent/foundry-shared"; import { useSubscription } from "@sandbox-agent/foundry-client"; import type { DebugSubscriptionTopic } from "@sandbox-agent/foundry-client"; @@ -18,7 +17,7 @@ import { describeTaskState } from "../features/tasks/status"; interface DevPanelProps { organizationId: string; - snapshot: TaskWorkbenchSnapshot; + snapshot: TaskWorkspaceSnapshot; organization?: FoundryOrganization | null; focusedTask?: DevPanelFocusedTask | null; } @@ -27,14 +26,12 @@ export interface DevPanelFocusedTask { id: string; repoId: string; title: string | null; - status: WorkbenchTaskStatus; - runtimeStatus?: TaskStatus | null; - statusMessage?: string | null; + status: WorkspaceTaskStatus; branch?: string | null; activeSandboxId?: string | null; activeSessionId?: string | null; - sandboxes?: WorkbenchSandboxSummary[]; - sessions?: WorkbenchSessionSummary[]; + sandboxes?: WorkspaceSandboxSummary[]; + sessions?: WorkspaceSessionSummary[]; } interface TopicInfo { @@ -80,7 +77,7 @@ function timeAgo(ts: number | null): string { } function statusColor(status: string, t: ReturnType): string { - if (status === "new" || status.startsWith("init_") || status.startsWith("archive_") || 
status.startsWith("kill_") || status.startsWith("pending_")) { + if (status.startsWith("init_") || status.startsWith("archive_") || status.startsWith("kill_") || status.startsWith("pending_")) { return t.statusWarning; } switch (status) { @@ -159,14 +156,16 @@ export const DevPanel = memo(function DevPanel({ organizationId, snapshot, organ }, [now]); const appState = useSubscription(subscriptionManager, "app", {}); + const organizationState = useSubscription(subscriptionManager, "organization", { organizationId }); const appSnapshot: FoundryAppSnapshot | null = appState.data ?? null; + const liveGithub = organizationState.data?.github ?? organization?.github ?? null; const repos = snapshot.repos ?? []; const tasks = snapshot.tasks ?? []; const prCount = tasks.filter((task) => task.pullRequest != null).length; - const focusedTaskStatus = focusedTask?.runtimeStatus ?? focusedTask?.status ?? null; - const focusedTaskState = describeTaskState(focusedTaskStatus, focusedTask?.statusMessage ?? null); - const lastWebhookAt = organization?.github.lastWebhookAt ?? null; + const focusedTaskStatus = focusedTask?.status ?? null; + const focusedTaskState = describeTaskState(focusedTaskStatus); + const lastWebhookAt = liveGithub?.lastWebhookAt ?? null; const hasRecentWebhook = lastWebhookAt != null && now - lastWebhookAt < 5 * 60_000; const totalOrgs = appSnapshot?.organizations.length ?? 0; const authStatus = appSnapshot?.auth.status ?? "unknown"; @@ -442,7 +441,7 @@ export const DevPanel = memo(function DevPanel({ organizationId, snapshot, organ {/* GitHub */}
- {organization ? ( + {liveGithub ? (
App Install - - {organization.github.installationStatus.replace(/_/g, " ")} + + {liveGithub.installationStatus.replace(/_/g, " ")}
@@ -465,15 +464,13 @@ export const DevPanel = memo(function DevPanel({ organizationId, snapshot, organ width: "5px", height: "5px", borderRadius: "50%", - backgroundColor: syncStatusColor(organization.github.syncStatus, t), + backgroundColor: syncStatusColor(liveGithub.syncStatus, t), flexShrink: 0, })} /> Sync - {organization.github.syncStatus} - {organization.github.lastSyncAt != null && ( - {timeAgo(organization.github.lastSyncAt)} - )} + {liveGithub.syncStatus} + {liveGithub.lastSyncAt != null && {timeAgo(liveGithub.lastSyncAt)}}
Webhook {lastWebhookAt != null ? ( - {organization.github.lastWebhookEvent} · {timeAgo(lastWebhookAt)} + {liveGithub.lastWebhookEvent} · {timeAgo(lastWebhookAt)} ) : ( never received )}
- - + + +
- {organization.github.connectedAccount && ( -
@{organization.github.connectedAccount}
- )} - {organization.github.lastSyncLabel && ( -
last sync: {organization.github.lastSyncLabel}
+ {liveGithub.connectedAccount &&
@{liveGithub.connectedAccount}
} + {liveGithub.lastSyncLabel &&
last sync: {liveGithub.lastSyncLabel}
} + {liveGithub.syncPhase && ( +
+ phase: {liveGithub.syncPhase.replace(/^syncing_/, "").replace(/_/g, " ")} ({liveGithub.processedRepositoryCount}/ + {liveGithub.totalRepositoryCount}) +
)}
 				) : (
diff --git a/foundry/packages/frontend/src/components/mock-layout.tsx b/foundry/packages/frontend/src/components/mock-layout.tsx
index 1ff4d35..4089e01 100644
--- a/foundry/packages/frontend/src/components/mock-layout.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout.tsx
@@ -1,14 +1,17 @@
 import { memo, useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState, type PointerEvent as ReactPointerEvent } from "react";
+import { useQuery } from "@tanstack/react-query";
 import { useNavigate } from "@tanstack/react-router";
 import { useStyletron } from "baseui";
 import {
+	DEFAULT_WORKSPACE_MODEL_GROUPS,
+	DEFAULT_WORKSPACE_MODEL_ID,
 	createErrorContext,
 	type FoundryOrganization,
-	type TaskWorkbenchSnapshot,
-	type WorkbenchOpenPrSummary,
-	type WorkbenchSessionSummary,
-	type WorkbenchTaskDetail,
-	type WorkbenchTaskSummary,
+	type TaskWorkspaceSnapshot,
+	type WorkspaceModelGroup,
+	type WorkspaceSessionSummary,
+	type WorkspaceTaskDetail,
+	type WorkspaceTaskSummary,
 } from "@sandbox-agent/foundry-shared";
 import { useSubscription } from "@sandbox-agent/foundry-client";
 
@@ -39,7 +42,7 @@ import {
 	type Message,
 	type ModelId,
 } from "./mock-layout/view-model";
-import { activeMockOrganization, useMockAppSnapshot } from "../lib/mock-app";
+import { activeMockOrganization, activeMockUser, getMockOrganizationById, useMockAppClient, useMockAppSnapshot } from "../lib/mock-app";
 import { backendClient } from "../lib/backend";
 import { subscriptionManager } from "../lib/subscription";
 import { describeTaskState, isProvisioningTaskStatus } from "../features/tasks/status";
@@ -77,29 +80,38 @@ function sanitizeActiveSessionId(task: Task, sessionId: string | null | undefine
 	return openDiffs.length > 0 ? diffTabId(openDiffs[openDiffs.length - 1]!) : lastAgentSessionId;
 }
 
-function githubInstallationWarningTitle(organization: FoundryOrganization): string {
-	return organization.github.installationStatus === "install_required" ? "GitHub App not installed" : "GitHub App needs reconnection";
+type GithubStatusView = Pick<
+	FoundryOrganization["github"],
+	"connectedAccount" | "installationStatus" | "syncStatus" | "importedRepoCount" | "lastSyncLabel"
+> & {
+	syncPhase?: string | null;
+	processedRepositoryCount?: number;
+	totalRepositoryCount?: number;
+};
+
+function githubInstallationWarningTitle(github: GithubStatusView): string {
+	return github.installationStatus === "install_required" ? "GitHub App not installed" : "GitHub App needs reconnection";
 }
 
-function githubInstallationWarningDetail(organization: FoundryOrganization): string {
-	const statusDetail = organization.github.lastSyncLabel.trim();
+function githubInstallationWarningDetail(github: GithubStatusView): string {
+	const statusDetail = github.lastSyncLabel.trim();
 	const requirementDetail =
-		organization.github.installationStatus === "install_required"
+		github.installationStatus === "install_required"
 			? "Webhooks are required for Foundry to function. Repo sync and PR updates will not work until the GitHub App is installed for this organization."
 			: "Webhook delivery is unavailable. Repo sync and PR updates will not work until the GitHub App is reconnected.";
 	return statusDetail ? `${requirementDetail} ${statusDetail}.` : requirementDetail;
 }
 
 function GithubInstallationWarning({
-	organization,
+	github,
 	css,
 	t,
 }: {
-	organization: FoundryOrganization;
+	github: GithubStatusView;
 	css: ReturnType[0];
 	t: ReturnType;
 }) {
-	if (organization.github.installationStatus === "connected") {
+	if (github.installationStatus === "connected") {
 		return null;
 	}
 
@@ -123,15 +135,15 @@
 		>
-			{githubInstallationWarningTitle(organization)}
-			{githubInstallationWarningDetail(organization)}
+			{githubInstallationWarningTitle(github)}
+			{githubInstallationWarningDetail(github)}
 	);
 }
 
 function toSessionModel(
-	summary: WorkbenchSessionSummary,
+	summary: WorkspaceSessionSummary,
 	sessionDetail?: { draft: Task["sessions"][number]["draft"]; transcript: Task["sessions"][number]["transcript"] },
 ): Task["sessions"][number] {
 	return {
@@ -155,8 +167,8 @@
 }
 
 function toTaskModel(
-	summary: WorkbenchTaskSummary,
-	detail?: WorkbenchTaskDetail,
+	summary: WorkspaceTaskSummary,
+	detail?: WorkspaceTaskDetail,
 	sessionCache?: Map,
 ): Task {
 	const sessions = detail?.sessionsSummary ?? summary.sessionsSummary;
@@ -164,53 +176,21 @@ function toTaskModel(
 		id: summary.id,
 		repoId: summary.repoId,
 		title: detail?.title ?? summary.title,
-		status: detail?.runtimeStatus ?? detail?.status ?? summary.status,
-		runtimeStatus: detail?.runtimeStatus,
-		statusMessage: detail?.statusMessage ?? null,
+		status: detail?.status ?? summary.status,
 		repoName: detail?.repoName ?? summary.repoName,
 		updatedAtMs: detail?.updatedAtMs ?? summary.updatedAtMs,
 		branch: detail?.branch ?? summary.branch,
 		pullRequest: detail?.pullRequest ?? summary.pullRequest,
+		activeSessionId: detail?.activeSessionId ?? summary.activeSessionId ?? null,
 		sessions: sessions.map((session) => toSessionModel(session, sessionCache?.get(session.id))),
 		fileChanges: detail?.fileChanges ?? [],
 		diffs: detail?.diffs ?? {},
 		fileTree: detail?.fileTree ?? [],
 		minutesUsed: detail?.minutesUsed ?? 0,
+		sandboxes: detail?.sandboxes ?? [],
 		activeSandboxId: detail?.activeSandboxId ?? null,
-	};
-}
-
-const OPEN_PR_TASK_PREFIX = "pr:";
-
-function openPrTaskId(prId: string): string {
-	return `${OPEN_PR_TASK_PREFIX}${prId}`;
-}
-
-function isOpenPrTaskId(taskId: string): boolean {
-	return taskId.startsWith(OPEN_PR_TASK_PREFIX);
-}
-
-function toOpenPrTaskModel(pullRequest: WorkbenchOpenPrSummary): Task {
-	return {
-		id: openPrTaskId(pullRequest.prId),
-		repoId: pullRequest.repoId,
-		title: pullRequest.title,
-		status: "new",
-		runtimeStatus: undefined,
-		statusMessage: pullRequest.authorLogin ? `@${pullRequest.authorLogin}` : null,
-		repoName: pullRequest.repoFullName,
-		updatedAtMs: pullRequest.updatedAtMs,
-		branch: pullRequest.headRefName,
-		pullRequest: {
-			number: pullRequest.number,
-			status: pullRequest.isDraft ? "draft" : "ready",
-		},
-		sessions: [],
-		fileChanges: [],
-		diffs: {},
-		fileTree: [],
-		minutesUsed: 0,
-		activeSandboxId: null,
+		primaryUserLogin: detail?.primaryUserLogin ?? summary.primaryUserLogin ?? null,
+		primaryUserAvatarUrl: detail?.primaryUserAvatarUrl ?? summary.primaryUserAvatarUrl ?? null,
 	};
 }
 
@@ -230,18 +210,41 @@ function sessionStateMessage(tab: Task["sessions"][number] | null | undefined):
 	return null;
 }
 
-function groupRepositories(repos: Array<{ id: string; label: string }>, tasks: Task[]) {
+function groupRepositories(
+	repos: Array<{ id: string; label: string }>,
+	tasks: Task[],
+	openPullRequests?: Array<{
+		repoId: string;
+		repoFullName: string;
+		number: number;
+		title: string;
+		state: string;
+		url: string;
+		headRefName: string;
+		authorLogin: string | null;
+		isDraft: boolean;
+	}>,
+) {
 	return repos
 		.map((repo) => ({
 			id: repo.id,
 			label: repo.label,
 			updatedAtMs: tasks.filter((task) => task.repoId === repo.id).reduce((latest, task) => Math.max(latest, task.updatedAtMs), 0),
 			tasks: tasks.filter((task) => task.repoId === repo.id).sort((left, right) => right.updatedAtMs - left.updatedAtMs),
+			pullRequests: (openPullRequests ?? []).filter((pr) => pr.repoId === repo.id),
 		}))
-		.filter((repo) => repo.tasks.length > 0);
+		.sort((a, b) => {
+			// Repos with tasks first, then repos with PRs, then alphabetical
+			const aHasActivity = a.tasks.length > 0 || a.pullRequests.length > 0;
+			const bHasActivity = b.tasks.length > 0 || b.pullRequests.length > 0;
+			if (aHasActivity && !bHasActivity) return -1;
+			if (!aHasActivity && bHasActivity) return 1;
+			if (a.updatedAtMs !== b.updatedAtMs) return b.updatedAtMs - a.updatedAtMs;
+			return a.label.localeCompare(b.label);
+		});
 }
 
-interface WorkbenchActions {
+interface WorkspaceActions {
 	createTask(input: {
 		repoId: string;
 		task: string;
@@ -250,28 +253,27 @@
 		onBranch?: string;
 		model?: ModelId;
 	}): Promise<{ taskId: string; sessionId?: string }>;
-	markTaskUnread(input: { taskId: string }): Promise;
-	renameTask(input: { taskId: string; value: string }): Promise;
-	renameBranch(input: { taskId: string; value: string }): Promise;
-	archiveTask(input: { taskId: string }): Promise;
-	publishPr(input: { taskId: string }): Promise;
-	revertFile(input: { taskId: string; path: string }): Promise;
-	updateDraft(input: { taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise;
-	sendMessage(input: { taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise;
-	stopAgent(input: { taskId: string; sessionId: string }): Promise;
-	setSessionUnread(input: { taskId: string; sessionId: string; unread: boolean }): Promise;
-	renameSession(input: { taskId: string; sessionId: string; title: string }): Promise;
-	closeSession(input: { taskId: string; sessionId: string }): Promise;
-	addSession(input: { taskId: string; model?: string }): Promise<{ sessionId: string }>;
-	changeModel(input: { taskId: string; sessionId: string; model: ModelId }): Promise;
-	reloadGithubOrganization(): Promise;
-	reloadGithubPullRequests(): Promise;
-	reloadGithubRepository(repoId: string): Promise;
-	reloadGithubPullRequest(repoId: string, prNumber: number): Promise;
+	markTaskUnread(input: { repoId: string; taskId: string }): Promise;
+	renameTask(input: { repoId: string; taskId: string; value: string }): Promise;
+	archiveTask(input: { repoId: string; taskId: string }): Promise;
+	publishPr(input: { repoId: string; taskId: string }): Promise;
+	revertFile(input: { repoId: string; taskId: string; path: string }): Promise;
+	updateDraft(input: { repoId: string; taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise;
+	sendMessage(input: { repoId: string; taskId: string; sessionId: string; text: string; attachments: LineAttachment[] }): Promise;
+	stopAgent(input: { repoId: string; taskId: string; sessionId: string }): Promise;
+	selectSession(input: { repoId: string; taskId: string; sessionId: string }): Promise;
+	setSessionUnread(input: { repoId: string; taskId: string; sessionId: string; unread: boolean }): Promise;
+	renameSession(input: { repoId: string; taskId: string; sessionId: string; title: string }): Promise;
+	closeSession(input: { repoId: string; taskId: string; sessionId: string }): Promise;
+	addSession(input: { repoId: string; taskId: string; model?: string }): Promise<{ sessionId: string }>;
+	changeModel(input: { repoId: string; taskId: string; sessionId: string; model: ModelId }): Promise;
+	changeOwner(input: { repoId: string; taskId: string; targetUserId: string; targetUserName: string; targetUserEmail: string }): Promise;
+	adminReloadGithubOrganization(): Promise;
+	adminReloadGithubRepository(repoId: string): Promise;
 }
 
 const TranscriptPanel = memo(function TranscriptPanel({
-	taskWorkbenchClient,
+	taskWorkspaceClient,
 	task,
 	hasSandbox,
 	activeSessionId,
@@ -288,9 +290,10 @@ const TranscriptPanel = memo(function TranscriptPanel({
 	rightSidebarCollapsed,
 	onToggleRightSidebar,
 	selectedSessionHydrating = false,
+	modelGroups,
 	onNavigateToUsage,
 }: {
-	taskWorkbenchClient: WorkbenchActions;
+	taskWorkspaceClient: WorkspaceActions;
 	task: Task;
 	hasSandbox: boolean;
 	activeSessionId: string | null;
@@ -307,11 +310,15 @@
 	rightSidebarCollapsed?: boolean;
 	onToggleRightSidebar?: () => void;
 	selectedSessionHydrating?: boolean;
+	modelGroups: WorkspaceModelGroup[];
 	onNavigateToUsage?: () => void;
 }) {
 	const t = useFoundryTokens();
-	const [defaultModel, setDefaultModel] = useState("claude-sonnet-4");
-	const [editingField, setEditingField] = useState<"title" | "branch" | null>(null);
+	const appSnapshot = useMockAppSnapshot();
+	const appClient = useMockAppClient();
+	const currentUser = activeMockUser(appSnapshot);
+	const defaultModel = currentUser?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID;
+	const [editingField, setEditingField] = useState<"title" | null>(null);
 	const [editValue, setEditValue] = useState("");
 	const [editingSessionId, setEditingSessionId] = useState(null);
 	const [editingSessionName, setEditingSessionName] = useState("");
@@ -333,9 +340,8 @@ const TranscriptPanel = memo(function TranscriptPanel({
 	const isTerminal = task.status === "archived";
 	const historyEvents = useMemo(() => buildHistoryEvents(task.sessions), [task.sessions]);
 	const activeMessages = useMemo(() => buildDisplayMessages(activeAgentSession), [activeAgentSession]);
-	const taskRuntimeStatus = task.runtimeStatus ?? task.status;
-	const taskState = describeTaskState(taskRuntimeStatus, task.statusMessage ?? null);
-	const taskProvisioning = isProvisioningTaskStatus(taskRuntimeStatus);
+	const taskState = describeTaskState(task.status);
+	const taskProvisioning = isProvisioningTaskStatus(task.status);
 	const taskProvisioningMessage = taskState.detail;
 	const activeSessionMessage = sessionStateMessage(activeAgentSession);
 	const showPendingSessionState =
@@ -344,16 +350,17 @@
 		(activeAgentSession.status === "pending_provision" || activeAgentSession.status === "pending_session_create" || activeAgentSession.status === "error") &&
 		activeMessages.length === 0;
 	const serverDraft = promptSession?.draft.text ?? "";
-	const serverAttachments = promptSession?.draft.attachments ?? [];
+	const serverAttachments = promptSession?.draft.attachments;
+	const serverAttachmentsJson = JSON.stringify(serverAttachments ?? []);
 
 	// Sync server → local only when user hasn't typed recently (3s cooldown)
 	const DRAFT_SYNC_COOLDOWN_MS = 3_000;
 	useEffect(() => {
 		if (Date.now() - lastEditTimeRef.current > DRAFT_SYNC_COOLDOWN_MS) {
 			setLocalDraft(serverDraft);
-			setLocalAttachments(serverAttachments);
+			setLocalAttachments(serverAttachments ?? []);
 		}
-	}, [serverDraft, serverAttachments]);
+	}, [serverDraft, serverAttachmentsJson]);
 
 	// Reset local draft immediately on session/task switch
 	useEffect(() => {
@@ -436,14 +443,15 @@ const TranscriptPanel = memo(function TranscriptPanel({
 			return;
 		}
 
-		void taskWorkbenchClient.setSessionUnread({
+		void taskWorkspaceClient.setSessionUnread({
+			repoId: task.repoId,
 			taskId: task.id,
 			sessionId: activeAgentSession.id,
 			unread: false,
 		});
 	}, [activeAgentSession?.id, activeAgentSession?.unread, task.id]);
 
-	const startEditingField = useCallback((field: "title" | "branch", value: string) => {
+	const startEditingField = useCallback((field: "title", value: string) => {
 		setEditingField(field);
 		setEditValue(value);
 	}, []);
@@ -453,18 +461,14 @@ const TranscriptPanel = memo(function TranscriptPanel({
 	}, []);
 
 	const commitEditingField = useCallback(
-		(field: "title" | "branch") => {
+		(field: "title") => {
 			const value = editValue.trim();
 			if (!value) {
 				setEditingField(null);
 				return;
 			}
 
-			if (field === "title") {
-				void taskWorkbenchClient.renameTask({ taskId: task.id, value });
-			} else {
-				void taskWorkbenchClient.renameBranch({ taskId: task.id, value });
-			}
+			void taskWorkspaceClient.renameTask({ repoId: task.repoId, taskId: task.id, value });
 			setEditingField(null);
 		},
 		[editValue, task.id],
@@ -474,7 +478,8 @@ const TranscriptPanel = memo(function TranscriptPanel({
 
 	const flushDraft = useCallback(
 		(text: string, nextAttachments: LineAttachment[], sessionId: string) => {
-			void taskWorkbenchClient.updateDraft({
+			void taskWorkspaceClient.updateDraft({
+				repoId: task.repoId,
 				taskId: task.id,
 				sessionId,
 				text,
@@ -535,7 +540,8 @@ const TranscriptPanel = memo(function TranscriptPanel({
 		onSetActiveSessionId(promptSession.id);
 		onSetLastAgentSessionId(promptSession.id);
 
-		void taskWorkbenchClient.sendMessage({
+		void taskWorkspaceClient.sendMessage({
+			repoId: task.repoId,
 			taskId: task.id,
 			sessionId: promptSession.id,
 			text,
@@ -548,7 +554,8 @@ const TranscriptPanel = memo(function TranscriptPanel({
 			return;
 		}
 
-		void taskWorkbenchClient.stopAgent({
+		void taskWorkspaceClient.stopAgent({
+			repoId: task.repoId,
 			taskId: task.id,
 			sessionId: promptSession.id,
 		});
@@ -560,9 +567,15 @@
 
 			if (!isDiffTab(sessionId)) {
 				onSetLastAgentSessionId(sessionId);
+				void taskWorkspaceClient.selectSession({
+					repoId: task.repoId,
+					taskId: task.id,
+					sessionId,
+				});
 				const session = task.sessions.find((candidate) => candidate.id === sessionId);
 				if (session?.unread) {
-					void taskWorkbenchClient.setSessionUnread({
+					void taskWorkspaceClient.setSessionUnread({
+						repoId: task.repoId,
 						taskId: task.id,
 						sessionId,
 						unread: false,
@@ -571,14 +584,14 @@
 				onSyncRouteSession(task.id, sessionId);
 			}
 		},
-		[task.id, task.sessions, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
+		[task.id, task.repoId, task.sessions, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
 	);
 
 	const setSessionUnread = useCallback(
 		(sessionId: string, unread: boolean) => {
-			void taskWorkbenchClient.setSessionUnread({ taskId: task.id, sessionId, unread });
+			void taskWorkspaceClient.setSessionUnread({ repoId: task.repoId, taskId: task.id, sessionId, unread });
 		},
-		[task.id],
+		[task.id, task.repoId],
 	);
 
 	const startRenamingSession = useCallback(
@@ -610,7 +623,8 @@ const TranscriptPanel = memo(function TranscriptPanel({
 			return;
 		}
 
-		void taskWorkbenchClient.renameSession({
+		void taskWorkspaceClient.renameSession({
+			repoId: task.repoId,
 			taskId: task.id,
 			sessionId: editingSessionId,
 			title: trimmedName,
@@ -631,9 +645,9 @@
 			}
 
 			onSyncRouteSession(task.id, nextSessionId);
-			void taskWorkbenchClient.closeSession({ taskId: task.id, sessionId });
+			void taskWorkspaceClient.closeSession({ repoId: task.repoId, taskId: task.id, sessionId });
 		},
-		[activeSessionId, task.id, task.sessions, lastAgentSessionId, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
+		[activeSessionId, task.id, task.repoId, task.sessions, lastAgentSessionId, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession],
	);
 
 	const closeDiffTab = useCallback(
@@ -651,12 +665,12 @@
 	const addSession = useCallback(() => {
 		void (async () => {
-			const { sessionId } = await taskWorkbenchClient.addSession({ taskId: task.id });
+			const { sessionId } = await taskWorkspaceClient.addSession({ repoId: task.repoId, taskId: task.id });
 			onSetLastAgentSessionId(sessionId);
 			onSetActiveSessionId(sessionId);
 			onSyncRouteSession(task.id, sessionId);
 		})();
-	}, [task.id, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession]);
+	}, [task.id, task.repoId, onSetActiveSessionId, onSetLastAgentSessionId, onSyncRouteSession]);
 
 	const changeModel = useCallback(
 		(model: ModelId) => {
@@ -664,7 +678,8 @@
 			throw new Error(`Unable to change model for task ${task.id} without an active prompt session`);
 		}
 
-			void taskWorkbenchClient.changeModel({
+			void taskWorkspaceClient.changeModel({
+				repoId: task.repoId,
 				taskId: task.id,
 				sessionId: promptSession.id,
 				model,
@@ -939,7 +954,7 @@ const TranscriptPanel = memo(function TranscriptPanel({
 						messageRefs={messageRefs}
 						historyEvents={historyEvents}
 						onSelectHistoryEvent={jumpToHistoryEvent}
-						targetMessageId={pendingHistoryTarget && activeSessionId === pendingHistoryTarget.sessionId ? pendingHistoryTarget.messageId : null}
+						targetMessageId={pendingHistoryTarget && activeAgentSession?.id === pendingHistoryTarget.sessionId ? pendingHistoryTarget.messageId : null}
 						onTargetMessageResolved={() => setPendingHistoryTarget(null)}
 						copiedMessageId={copiedMessageId}
 						onCopyMessage={(message) => {
@@ -958,6 +973,7 @@
 						textareaRef={textareaRef}
 						placeholder={!promptSession.created ? "Describe your task..." : "Send a message..."}
 						attachments={attachments}
+						modelGroups={modelGroups}
 						defaultModel={defaultModel}
 						model={promptSession.model}
 						isRunning={promptSession.status === "running"}
@@ -966,7 +982,9 @@
 						onStop={stopAgent}
 						onRemoveAttachment={removeAttachment}
 						onChangeModel={changeModel}
-						onSetDefaultModel={setDefaultModel}
+						onSetDefaultModel={(model) => {
+							void appClient.setDefaultModel(model);
+						}}
 					/>
 				) : null}
 
@@ -1055,6 +1073,8 @@ const RightRail = memo(function RightRail({
 	onArchive,
 	onRevertFile,
 	onPublishPr,
+	onChangeOwner,
+	members,
 	onToggleSidebar,
 }: {
 	organizationId: string;
@@ -1064,6 +1084,8 @@
 	onArchive: () => void;
 	onRevertFile: (path: string) => void;
 	onPublishPr: () => void;
+	onChangeOwner: (member: { id: string; name: string; email: string }) => void;
+	members: Array<{ id: string; name: string; email: string }>;
 	onToggleSidebar?: () => void;
 }) {
 	const [css] = useStyletron();
@@ -1156,6 +1178,8 @@
 					onArchive={onArchive}
 					onRevertFile={onRevertFile}
 					onPublishPr={onPublishPr}
+					onChangeOwner={onChangeOwner}
+					members={members}
 					onToggleSidebar={onToggleSidebar}
 				/>
 
@@ -1280,45 +1304,38 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
 	const [css] = useStyletron();
 	const t = useFoundryTokens();
 	const navigate = useNavigate();
-	const taskWorkbenchClient = useMemo(
+	const taskWorkspaceClient = useMemo(
 		() => ({
-			createTask: (input) => backendClient.createWorkbenchTask(organizationId, input),
-			markTaskUnread: (input) => backendClient.markWorkbenchUnread(organizationId, input),
-			renameTask: (input) => backendClient.renameWorkbenchTask(organizationId, input),
-			renameBranch: (input) => backendClient.renameWorkbenchBranch(organizationId, input),
-			archiveTask: async (input) => backendClient.runAction(organizationId, input.taskId, "archive"),
-			publishPr: (input) => backendClient.publishWorkbenchPr(organizationId, input),
-			revertFile: (input) => backendClient.revertWorkbenchFile(organizationId, input),
-			updateDraft: (input) => backendClient.updateWorkbenchDraft(organizationId, input),
-			sendMessage: (input) => backendClient.sendWorkbenchMessage(organizationId, input),
-			stopAgent: (input) => backendClient.stopWorkbenchSession(organizationId, input),
-			setSessionUnread: (input) => backendClient.setWorkbenchSessionUnread(organizationId, input),
-			renameSession: (input) => backendClient.renameWorkbenchSession(organizationId, input),
-			closeSession: (input) => backendClient.closeWorkbenchSession(organizationId, input),
-			addSession: (input) => backendClient.createWorkbenchSession(organizationId, input),
-			changeModel: (input) => backendClient.changeWorkbenchModel(organizationId, input),
-			reloadGithubOrganization: () => backendClient.reloadGithubOrganization(organizationId),
-			reloadGithubPullRequests: () => backendClient.reloadGithubPullRequests(organizationId),
-			reloadGithubRepository: (repoId) => backendClient.reloadGithubRepository(organizationId, repoId),
-			reloadGithubPullRequest: (repoId, prNumber) => backendClient.reloadGithubPullRequest(organizationId, repoId, prNumber),
+			createTask: (input) => backendClient.createWorkspaceTask(organizationId, input),
+			markTaskUnread: (input) => backendClient.markWorkspaceUnread(organizationId, input),
+			renameTask: (input) => backendClient.renameWorkspaceTask(organizationId, input),
+			archiveTask: async (input) => backendClient.runAction(organizationId, input.repoId, input.taskId, "archive"),
+			publishPr: (input) => backendClient.publishWorkspacePr(organizationId, input),
+			revertFile: (input) => backendClient.revertWorkspaceFile(organizationId, input),
+			updateDraft: (input) => backendClient.updateWorkspaceDraft(organizationId, input),
+			sendMessage: (input) => backendClient.sendWorkspaceMessage(organizationId, input),
+			stopAgent: (input) => backendClient.stopWorkspaceSession(organizationId, input),
+			selectSession: (input) => backendClient.selectWorkspaceSession(organizationId, input),
+			setSessionUnread: (input) => backendClient.setWorkspaceSessionUnread(organizationId, input),
+			renameSession: (input) => backendClient.renameWorkspaceSession(organizationId, input),
+			closeSession: (input) => backendClient.closeWorkspaceSession(organizationId, input),
+			addSession: (input) => backendClient.createWorkspaceSession(organizationId, input),
+			changeModel: (input) => backendClient.changeWorkspaceModel(organizationId, input),
+			changeOwner: (input) => backendClient.changeWorkspaceTaskOwner(organizationId, input),
+			adminReloadGithubOrganization: () => backendClient.adminReloadGithubOrganization(organizationId),
+			adminReloadGithubRepository: (repoId) => backendClient.adminReloadGithubRepository(organizationId, repoId),
 		}),
 		[organizationId],
 	);
 
 	const organizationState = useSubscription(subscriptionManager, "organization", { organizationId });
-	const organizationRepos = organizationState.data?.repos ?? [];
-	const taskSummaries = organizationState.data?.taskSummaries ?? [];
-	const openPullRequests = organizationState.data?.openPullRequests ?? [];
-	const openPullRequestsByTaskId = useMemo(
-		() => new Map(openPullRequests.map((pullRequest) => [openPrTaskId(pullRequest.prId), pullRequest])),
-		[openPullRequests],
-	);
-	const selectedOpenPullRequest = useMemo(
-		() => (selectedTaskId ? (openPullRequestsByTaskId.get(selectedTaskId) ?? null) : null),
-		[openPullRequestsByTaskId, selectedTaskId],
-	);
+	const organizationReposData = organizationState.data?.repos;
+	const taskSummariesData = organizationState.data?.taskSummaries;
+	const openPullRequestsData = organizationState.data?.openPullRequests;
+	const organizationRepos = organizationReposData ?? [];
+	const taskSummaries = taskSummariesData ?? [];
 	const selectedTaskSummary = useMemo(
 		() => taskSummaries.find((task) => task.id === selectedTaskId) ??
taskSummaries[0] ?? null, - [selectedTaskId, taskSummaries], + [selectedTaskId, taskSummariesData], ); const taskState = useSubscription( subscriptionManager, @@ -1359,6 +1376,20 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } : null, ); const hasSandbox = Boolean(activeSandbox) && sandboxState.status !== "error"; + const modelGroupsQuery = useQuery({ + queryKey: ["mock-layout", "workspace-model-groups", organizationId, activeSandbox?.sandboxProviderId ?? "", activeSandbox?.sandboxId ?? ""], + enabled: Boolean(activeSandbox?.sandboxId), + staleTime: 30_000, + refetchOnWindowFocus: false, + queryFn: async () => { + if (!activeSandbox) { + throw new Error("Cannot load workspace model groups without an active sandbox."); + } + + return await backendClient.getSandboxWorkspaceModelGroups(organizationId, activeSandbox.sandboxProviderId, activeSandbox.sandboxId); + }, + }); + const modelGroups = modelGroupsQuery.data && modelGroupsQuery.data.length > 0 ? modelGroupsQuery.data : DEFAULT_WORKSPACE_MODEL_GROUPS; const tasks = useMemo(() => { const sessionCache = new Map(); if (selectedTaskSummary && taskState.data) { @@ -1383,12 +1414,14 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } const hydratedTasks = taskSummaries.map((summary) => summary.id === selectedTaskSummary?.id ? 
toTaskModel(summary, taskState.data, sessionCache) : toTaskModel(summary), ); - const openPrTasks = openPullRequests.map((pullRequest) => toOpenPrTaskModel(pullRequest)); - return [...hydratedTasks, ...openPrTasks].sort((left, right) => right.updatedAtMs - left.updatedAtMs); - }, [openPullRequests, selectedTaskSummary, selectedSessionId, sessionState.data, taskState.data, taskSummaries, organizationId]); - const rawRepositories = useMemo(() => groupRepositories(organizationRepos, tasks), [tasks, organizationRepos]); + return hydratedTasks.sort((left, right) => right.updatedAtMs - left.updatedAtMs); + }, [selectedTaskSummary, selectedSessionId, sessionState.data, taskState.data, taskSummariesData, organizationId]); + const openPullRequests = openPullRequestsData ?? []; + const rawRepositories = useMemo(() => groupRepositories(organizationRepos, tasks, openPullRequests), [tasks, organizationReposData, openPullRequestsData]); const appSnapshot = useMockAppSnapshot(); + const currentUser = activeMockUser(appSnapshot); const activeOrg = activeMockOrganization(appSnapshot); + const liveGithub = organizationState.data?.github ?? activeOrg?.github ?? 
null; const navigateToUsage = useCallback(() => { if (activeOrg) { void navigate({ to: "/organizations/$organizationId/billing" as never, params: { organizationId: activeOrg.id } as never }); @@ -1413,11 +1446,9 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } const leftWidthRef = useRef(leftWidth); const rightWidthRef = useRef(rightWidth); const autoCreatingSessionForTaskRef = useRef>(new Set()); - const resolvingOpenPullRequestsRef = useRef>(new Set()); const [leftSidebarOpen, setLeftSidebarOpen] = useState(true); const [rightSidebarOpen, setRightSidebarOpen] = useState(true); const [leftSidebarPeeking, setLeftSidebarPeeking] = useState(false); - const [materializingOpenPrId, setMaterializingOpenPrId] = useState(null); const showDevPanel = useDevPanel(); const peekTimeoutRef = useRef | null>(null); @@ -1484,80 +1515,17 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } }, []); const activeTask = useMemo(() => { - const realTasks = tasks.filter((task) => !isOpenPrTaskId(task.id)); - if (selectedOpenPullRequest) { - return null; - } if (selectedTaskId) { - return realTasks.find((task) => task.id === selectedTaskId) ?? realTasks[0] ?? null; + return tasks.find((task) => task.id === selectedTaskId) ?? tasks[0] ?? null; } - return realTasks[0] ?? 
null; - }, [selectedOpenPullRequest, selectedTaskId, tasks]); - - const materializeOpenPullRequest = useCallback( - async (pullRequest: WorkbenchOpenPrSummary) => { - if (resolvingOpenPullRequestsRef.current.has(pullRequest.prId)) { - return; - } - - resolvingOpenPullRequestsRef.current.add(pullRequest.prId); - setMaterializingOpenPrId(pullRequest.prId); - - try { - const { taskId, sessionId } = await taskWorkbenchClient.createTask({ - repoId: pullRequest.repoId, - task: `Continue work on GitHub PR #${pullRequest.number}: ${pullRequest.title}`, - model: "gpt-5.3-codex", - title: pullRequest.title, - onBranch: pullRequest.headRefName, - }); - await navigate({ - to: "/organizations/$organizationId/tasks/$taskId", - params: { - organizationId, - taskId, - }, - search: { sessionId: sessionId ?? undefined }, - replace: true, - }); - } catch (error) { - setMaterializingOpenPrId((current) => (current === pullRequest.prId ? null : current)); - resolvingOpenPullRequestsRef.current.delete(pullRequest.prId); - logger.error( - { - prId: pullRequest.prId, - repoId: pullRequest.repoId, - branchName: pullRequest.headRefName, - ...createErrorContext(error), - }, - "failed_to_materialize_open_pull_request_task", - ); - } - }, - [navigate, taskWorkbenchClient, organizationId], - ); - - useEffect(() => { - if (!selectedOpenPullRequest) { - if (materializingOpenPrId) { - resolvingOpenPullRequestsRef.current.delete(materializingOpenPrId); - } - setMaterializingOpenPrId(null); - return; - } - - void materializeOpenPullRequest(selectedOpenPullRequest); - }, [materializeOpenPullRequest, materializingOpenPrId, selectedOpenPullRequest]); + return tasks[0] ?? 
null; + }, [selectedTaskId, tasks]); useEffect(() => { if (activeTask) { return; } - if (selectedOpenPullRequest || materializingOpenPrId) { - return; - } - const fallbackTaskId = tasks[0]?.id; if (!fallbackTaskId) { return; @@ -1574,11 +1542,13 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } search: { sessionId: fallbackTask?.sessions[0]?.id ?? undefined }, replace: true, }); - }, [activeTask, materializingOpenPrId, navigate, selectedOpenPullRequest, tasks, organizationId]); + }, [activeTask, navigate, tasks, organizationId]); const openDiffs = activeTask ? sanitizeOpenDiffs(activeTask, openDiffsByTask[activeTask.id]) : []; const lastAgentSessionId = activeTask ? sanitizeLastAgentSessionId(activeTask, lastAgentSessionIdByTask[activeTask.id]) : null; - const activeSessionId = activeTask ? sanitizeActiveSessionId(activeTask, activeSessionIdByTask[activeTask.id], openDiffs, lastAgentSessionId) : null; + const activeSessionId = activeTask + ? sanitizeActiveSessionId(activeTask, activeSessionIdByTask[activeTask.id] ?? activeTask.activeSessionId ?? null, openDiffs, lastAgentSessionId) + : null; const selectedSessionHydrating = Boolean( selectedSessionId && activeSessionId === selectedSessionId && sessionState.status === "loading" && !sessionState.data, ); @@ -1635,6 +1605,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } }, [activeTask, lastAgentSessionIdByTask, selectedSessionId, syncRouteSession]); useEffect(() => { + const organizationRepos = organizationReposData ?? 
[]; if (selectedNewTaskRepoId && organizationRepos.some((repo) => repo.id === selectedNewTaskRepoId)) { return; } @@ -1644,7 +1615,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } if (fallbackRepoId !== selectedNewTaskRepoId) { setSelectedNewTaskRepoId(fallbackRepoId); } - }, [activeTask?.repoId, selectedNewTaskRepoId, organizationRepos]); + }, [activeTask?.repoId, selectedNewTaskRepoId, organizationReposData]); useEffect(() => { if (!activeTask) { @@ -1664,7 +1635,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } autoCreatingSessionForTaskRef.current.add(activeTask.id); void (async () => { try { - const { sessionId } = await taskWorkbenchClient.addSession({ taskId: activeTask.id }); + const { sessionId } = await taskWorkspaceClient.addSession({ repoId: activeTask.repoId, taskId: activeTask.id }); syncRouteSession(activeTask.id, sessionId, true); } catch (error) { logger.error( @@ -1672,13 +1643,13 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } taskId: activeTask.id, ...createErrorContext(error), }, - "failed_to_auto_create_workbench_session", + "failed_to_auto_create_workspace_session", ); // Keep the guard in the set on error to prevent retry storms. // The guard is cleared when sessions appear (line above) or the task changes. 
} })(); - }, [activeTask, selectedSessionId, syncRouteSession, taskWorkbenchClient]); + }, [activeTask, selectedSessionId, syncRouteSession, taskWorkspaceClient]); const createTask = useCallback( (overrideRepoId?: string, options?: { title?: string; task?: string; branch?: string; onBranch?: string }) => { @@ -1688,10 +1659,10 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } throw new Error("Cannot create a task without an available repo"); } - const { taskId, sessionId } = await taskWorkbenchClient.createTask({ + const { taskId, sessionId } = await taskWorkspaceClient.createTask({ repoId, task: options?.task ?? "New task", - model: "gpt-5.3-codex", + model: currentUser?.defaultModel ?? DEFAULT_WORKSPACE_MODEL_ID, title: options?.title ?? "New task", ...(options?.branch ? { branch: options.branch } : {}), ...(options?.onBranch ? { onBranch: options.onBranch } : {}), @@ -1706,7 +1677,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } }); })(); }, - [navigate, selectedNewTaskRepoId, taskWorkbenchClient, organizationId], + [currentUser?.defaultModel, navigate, selectedNewTaskRepoId, taskWorkspaceClient, organizationId], ); const openDiffTab = useCallback( @@ -1735,14 +1706,6 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } const selectTask = useCallback( (id: string) => { - if (isOpenPrTaskId(id)) { - const pullRequest = openPullRequestsByTaskId.get(id); - if (!pullRequest) { - return; - } - void materializeOpenPullRequest(pullRequest); - return; - } const task = tasks.find((candidate) => candidate.id === id) ?? null; void navigate({ to: "/organizations/$organizationId/tasks/$taskId", @@ -1753,12 +1716,19 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } search: { sessionId: task?.sessions[0]?.id ?? 
undefined }, }); }, - [materializeOpenPullRequest, navigate, openPullRequestsByTaskId, tasks, organizationId], + [navigate, tasks, organizationId], ); - const markTaskUnread = useCallback((id: string) => { - void taskWorkbenchClient.markTaskUnread({ taskId: id }); - }, []); + const markTaskUnread = useCallback( + (id: string) => { + const task = tasks.find((candidate) => candidate.id === id); + if (!task) { + return; + } + void taskWorkspaceClient.markTaskUnread({ repoId: task.repoId, taskId: id }); + }, + [tasks], + ); const renameTask = useCallback( (id: string) => { @@ -1777,45 +1747,39 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } return; } - void taskWorkbenchClient.renameTask({ taskId: id, value: trimmedTitle }); + void taskWorkspaceClient.renameTask({ repoId: currentTask.repoId, taskId: id, value: trimmedTitle }); }, [tasks], ); - const renameBranch = useCallback( - (id: string) => { - const currentTask = tasks.find((task) => task.id === id); - if (!currentTask) { - throw new Error(`Unable to rename missing task ${id}`); + const changeOwner = useCallback( + (member: { id: string; name: string; email: string }) => { + if (!activeTask) { + throw new Error("Cannot change owner without an active task"); } - - const nextBranch = window.prompt("Rename branch", currentTask.branch ?? 
""); - if (nextBranch === null) { - return; - } - - const trimmedBranch = nextBranch.trim(); - if (!trimmedBranch) { - return; - } - - void taskWorkbenchClient.renameBranch({ taskId: id, value: trimmedBranch }); + void taskWorkspaceClient.changeOwner({ + repoId: activeTask.repoId, + taskId: activeTask.id, + targetUserId: member.id, + targetUserName: member.name, + targetUserEmail: member.email, + }); }, - [tasks], + [activeTask], ); const archiveTask = useCallback(() => { if (!activeTask) { throw new Error("Cannot archive without an active task"); } - void taskWorkbenchClient.archiveTask({ taskId: activeTask.id }); + void taskWorkspaceClient.archiveTask({ repoId: activeTask.repoId, taskId: activeTask.id }); }, [activeTask]); const publishPr = useCallback(() => { if (!activeTask) { throw new Error("Cannot publish PR without an active task"); } - void taskWorkbenchClient.publishPr({ taskId: activeTask.id }); + void taskWorkspaceClient.publishPr({ repoId: activeTask.repoId, taskId: activeTask.id }); }, [activeTask]); const revertFile = useCallback( @@ -1835,7 +1799,8 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } : (current[activeTask.id] ?? 
null), })); - void taskWorkbenchClient.revertFile({ + void taskWorkspaceClient.revertFile({ + repoId: activeTask.repoId, taskId: activeTask.id, path, }); @@ -1912,7 +1877,6 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } }; if (!activeTask) { - const isMaterializingSelectedOpenPr = Boolean(selectedOpenPullRequest) || materializingOpenPrId != null; return ( <> {dragRegion} @@ -1939,14 +1903,11 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } onSelectNewTaskRepo={setSelectedNewTaskRepoId} onMarkUnread={markTaskUnread} onRenameTask={renameTask} - onRenameBranch={renameBranch} onReorderRepositories={reorderRepositories} taskOrderByRepository={taskOrderByRepository} onReorderTasks={reorderTasks} - onReloadOrganization={() => void taskWorkbenchClient.reloadGithubOrganization()} - onReloadPullRequests={() => void taskWorkbenchClient.reloadGithubPullRequests()} - onReloadRepository={(repoId) => void taskWorkbenchClient.reloadGithubRepository(repoId)} - onReloadPullRequest={(repoId, prNumber) => void taskWorkbenchClient.reloadGithubPullRequest(repoId, prNumber)} + onReloadOrganization={() => void taskWorkspaceClient.adminReloadGithubOrganization()} + onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)} onToggleSidebar={() => setLeftSidebarOpen(false)} /> @@ -1988,7 +1949,7 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId } gap: "12px", }} > - {activeOrg?.github.syncStatus === "syncing" || activeOrg?.github.syncStatus === "pending" ? ( + {liveGithub?.syncStatus === "syncing" || liveGithub?.syncStatus === "pending" ? ( <>

Syncing with GitHub

- Importing repos from @{activeOrg.github.connectedAccount || "GitHub"}...
- {activeOrg.github.importedRepoCount > 0 && <> {activeOrg.github.importedRepoCount} repos imported so far.}
+ {liveGithub.lastSyncLabel || `Importing repos from @${liveGithub.connectedAccount || "GitHub"}...`}
+ {(liveGithub.totalRepositoryCount ?? 0) > 0 && (
+ <>
+ {" "}
+ {liveGithub.syncPhase === "syncing_repositories"
+ ? `${liveGithub.importedRepoCount} of ${liveGithub.totalRepositoryCount} repos imported so far.`
+ : `${liveGithub.processedRepositoryCount} of ${liveGithub.totalRepositoryCount} repos processed in ${liveGithub.syncPhase?.replace(/^syncing_/, "").replace(/_/g, " ") ?? "sync"}.`}
+
+ )}

- ) : isMaterializingSelectedOpenPr && selectedOpenPullRequest ? ( - <> - -

Creating task from pull request

-

- Preparing a task for {selectedOpenPullRequest.title} on {selectedOpenPullRequest.headRefName}. -

- - ) : activeOrg?.github.syncStatus === "error" ? ( + ) : liveGithub?.syncStatus === "error" ? ( <>

GitHub sync failed

There was a problem syncing repos from GitHub. Check the dev panel for details.

@@ -2075,11 +2035,11 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
- {activeOrg && }
+ {liveGithub && }
{showDevPanel && (
@@ -2114,14 +2074,11 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
onSelectNewTaskRepo={setSelectedNewTaskRepoId}
onMarkUnread={markTaskUnread}
onRenameTask={renameTask}
- onRenameBranch={renameBranch}
onReorderRepositories={reorderRepositories}
taskOrderByRepository={taskOrderByRepository}
onReorderTasks={reorderTasks}
- onReloadOrganization={() => void taskWorkbenchClient.reloadGithubOrganization()}
- onReloadPullRequests={() => void taskWorkbenchClient.reloadGithubPullRequests()}
- onReloadRepository={(repoId) => void taskWorkbenchClient.reloadGithubRepository(repoId)}
- onReloadPullRequest={(repoId, prNumber) => void taskWorkbenchClient.reloadGithubPullRequest(repoId, prNumber)}
+ onReloadOrganization={() => void taskWorkspaceClient.adminReloadGithubOrganization()}
+ onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)}
onToggleSidebar={() => setLeftSidebarOpen(false)}
/>
@@ -2169,14 +2126,11 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
onSelectNewTaskRepo={setSelectedNewTaskRepoId}
onMarkUnread={markTaskUnread}
onRenameTask={renameTask}
- onRenameBranch={renameBranch}
onReorderRepositories={reorderRepositories}
taskOrderByRepository={taskOrderByRepository}
onReorderTasks={reorderTasks}
- onReloadOrganization={() => void taskWorkbenchClient.reloadGithubOrganization()}
- onReloadPullRequests={() => void taskWorkbenchClient.reloadGithubPullRequests()}
- onReloadRepository={(repoId) => void taskWorkbenchClient.reloadGithubRepository(repoId)}
- onReloadPullRequest={(repoId, prNumber) => void taskWorkbenchClient.reloadGithubPullRequest(repoId, prNumber)}
+ onReloadOrganization={() => void taskWorkspaceClient.adminReloadGithubOrganization()}
+ onReloadRepository={(repoId) => void taskWorkspaceClient.adminReloadGithubRepository(repoId)}
onToggleSidebar={() => {
setLeftSidebarPeeking(false);
setLeftSidebarOpen(true);
@@ -2189,9 +2143,10 @@ export function MockLayout({ organizationId, selectedTaskId, selectedSessionId }
{leftSidebarOpen ? : null}
setRightSidebarOpen(false)} />
- {activeOrg && }
+ {liveGithub && }
{showDevPanel && ( ({
diff --git a/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx b/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx
index ba3f0f3..6ec6ea6 100644
--- a/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/model-picker.tsx
@@ -2,18 +2,21 @@ import { memo, useState } from "react";
import { useStyletron } from "baseui";
import { StatefulPopover, PLACEMENT } from "baseui/popover";
import { ChevronUp, Star } from "lucide-react";
+import { workspaceModelLabel, type WorkspaceModelGroup } from "@sandbox-agent/foundry-shared";
import { useFoundryTokens } from "../../app/theme";
import { AgentIcon } from "./ui";
-import { MODEL_GROUPS, modelLabel, providerAgent, type ModelId } from "./view-model";
+import { type ModelId } from "./view-model";

const ModelPickerContent = memo(function ModelPickerContent({
+ groups,
value,
defaultModel,
onChange,
onSetDefault,
close,
}: {
+ groups: WorkspaceModelGroup[];
value: ModelId;
defaultModel: ModelId;
onChange: (id: ModelId) => void;
@@ -26,7 +29,7 @@ const ModelPickerContent = memo(function ModelPickerContent({
return (
- {MODEL_GROUPS.map((group) => ( + {groups.map((group) => (
void; @@ -137,7 +142,9 @@ export const ModelPicker = memo(function ModelPicker({ }, }, }} - content={({ close }) => } + content={({ close }) => ( + + )} >
diff --git a/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx b/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx
index 08d72ae..b7e27be 100644
--- a/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/prompt-composer.tsx
@@ -2,6 +2,7 @@ import { memo, type Ref } from "react";
import { useStyletron } from "baseui";
import { ChatComposer, type ChatComposerClassNames } from "@sandbox-agent/react";
import { FileCode, SendHorizonal, Square, X } from "lucide-react";
+import { type WorkspaceModelGroup } from "@sandbox-agent/foundry-shared";
import { useFoundryTokens } from "../../app/theme";
import { ModelPicker } from "./model-picker";
@@ -13,6 +14,7 @@ export const PromptComposer = memo(function PromptComposer({
textareaRef,
placeholder,
attachments,
+ modelGroups,
defaultModel,
model,
isRunning,
@@ -27,6 +29,7 @@ export const PromptComposer = memo(function PromptComposer({
textareaRef: Ref;
placeholder: string;
attachments: LineAttachment[];
+ modelGroups: WorkspaceModelGroup[];
defaultModel: ModelId;
model: ModelId;
isRunning: boolean;
@@ -172,7 +175,7 @@ export const PromptComposer = memo(function PromptComposer({
renderSubmitContent={() => (isRunning ? : )}
renderFooter={() => (
- +
)}
/>
diff --git a/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx b/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx
index 529da47..3565b44 100644
--- a/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/right-sidebar.tsx
@@ -1,7 +1,21 @@
-import { memo, useCallback, useMemo, useState, type MouseEvent } from "react";
+import { memo, useCallback, useMemo, useRef, useState, type MouseEvent } from "react";
import { useStyletron } from "baseui";
-import { LabelSmall } from "baseui/typography";
+import { LabelSmall, LabelXSmall } from "baseui/typography";
-import { Archive, ArrowUpFromLine, ChevronRight, FileCode, FilePlus, FileX, FolderOpen, GitPullRequest, PanelRight } from "lucide-react";
+import {
+ Archive,
+ ArrowUpFromLine,
+ ChevronDown,
+ ChevronRight,
+ FileCode,
+ FilePlus,
+ FileX,
+ FolderOpen,
+ ExternalLink,
+ GitBranch,
+ GitPullRequest,
+ PanelRight,
+ User,
+} from "lucide-react";
import { useFoundryTokens } from "../../app/theme";
import { createErrorContext } from "@sandbox-agent/foundry-shared";
@@ -99,6 +113,8 @@ export const RightSidebar = memo(function RightSidebar({
onArchive,
onRevertFile,
onPublishPr,
+ onChangeOwner,
+ members,
onToggleSidebar,
}: {
task: Task;
@@ -107,11 +123,13 @@
onArchive: () => void;
onRevertFile: (path: string) => void;
onPublishPr: () => void;
+ onChangeOwner: (member: { id: string; name: string; email: string }) => void;
+ members: Array<{ id: string; name: string; email: string }>;
onToggleSidebar?: () => void;
}) {
const [css] = useStyletron();
const t = useFoundryTokens();
- const [rightTab, setRightTab] = useState<"changes" | "files">("changes");
+ const [rightTab, setRightTab] = useState<"overview" | "changes" | "files">("overview");
const contextMenu = useContextMenu();
const changedPaths = useMemo(() => new
Set(task.fileChanges.map((file) => file.path)), [task.fileChanges]);
const isTerminal = task.status === "archived";
@@ -125,7 +143,9 @@ export const RightSidebar = memo(function RightSidebar({
});
observer.observe(node);
}, []);
- const pullRequestUrl = task.pullRequest != null ? `https://github.com/${task.repoName}/pull/${task.pullRequest.number}` : null;
+ const [ownerDropdownOpen, setOwnerDropdownOpen] = useState(false);
+ const ownerDropdownRef = useRef(null);
+ const pullRequestUrl = task.pullRequest?.url ?? null;

const copyFilePath = useCallback(async (path: string) => {
try {
@@ -310,7 +330,7 @@ export const RightSidebar = memo(function RightSidebar({
})}
>
+
- {rightTab === "changes" ? ( + {rightTab === "overview" ? ( +
+
+ + Owner + +
+
setOwnerDropdownOpen((prev) => !prev)} + onKeyDown={(event) => { + if (event.key === "Enter" || event.key === " ") setOwnerDropdownOpen((prev) => !prev); + }} + className={css({ + display: "flex", + alignItems: "center", + gap: "10px", + paddingTop: "4px", + paddingRight: "8px", + paddingBottom: "4px", + paddingLeft: "4px", + borderRadius: "6px", + cursor: "pointer", + ":hover": { backgroundColor: t.interactiveHover }, + })} + > + {task.primaryUserLogin ? ( + <> + {task.primaryUserAvatarUrl ? ( + {task.primaryUserLogin} + ) : ( +
+ +
+ )} + + {task.primaryUserLogin} + + + ) : ( + <> +
+ +
+ + No owner assigned + + + )} + +
+ {ownerDropdownOpen ? ( + <> +
setOwnerDropdownOpen(false)} + className={css({ position: "fixed", top: 0, left: 0, right: 0, bottom: 0, zIndex: 99 })} + /> +
+ {members.map((member) => ( +
{ + onChangeOwner(member); + setOwnerDropdownOpen(false); + }} + onKeyDown={(event) => { + if (event.key === "Enter" || event.key === " ") { + onChangeOwner(member); + setOwnerDropdownOpen(false); + } + }} + className={css({ + display: "flex", + alignItems: "center", + gap: "8px", + paddingTop: "6px", + paddingRight: "12px", + paddingBottom: "6px", + paddingLeft: "12px", + cursor: "pointer", + fontSize: "12px", + color: t.textPrimary, + ":hover": { backgroundColor: t.interactiveHover }, + })} + > +
+ +
+ {member.name} +
+ ))} + {members.length === 0 ? ( +
+ No members +
+ ) : null} +
+ + ) : null} +
+
+
+ + Branch + +
+ + + {task.branch ?? "No branch"} + +
+
+
+ + Repository + + {task.repoName} +
+ {task.pullRequest ? ( +
+ + Pull Request + +
+ + + #{task.pullRequest.number} {task.pullRequest.title ?? ""} + +
+
+ ) : null} + {task.sandboxes?.find((s) => s.sandboxId === task.activeSandboxId)?.url ? ( +
+ ) : null} +
+ ) : rightTab === "changes" ? (
{task.fileChanges.length === 0 ? (
diff --git a/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx b/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx
index 7ccb18c..6ebb026 100644
--- a/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/sidebar.tsx
@@ -54,10 +54,6 @@ function repositoryIconColor(label: string): string {
return REPOSITORY_COLORS[Math.abs(hash) % REPOSITORY_COLORS.length]!;
}

-function isPullRequestSidebarItem(task: Task): boolean {
- return task.id.startsWith("pr:");
-}
-
export const Sidebar = memo(function Sidebar({
repositories,
newTaskRepos,
@@ -68,14 +64,11 @@ export const Sidebar = memo(function Sidebar({
onSelectNewTaskRepo,
onMarkUnread,
onRenameTask,
- onRenameBranch,
onReorderRepositories,
taskOrderByRepository,
onReorderTasks,
onReloadOrganization,
- onReloadPullRequests,
onReloadRepository,
- onReloadPullRequest,
onToggleSidebar,
}: {
repositories: RepositorySection[];
@@ -87,14 +80,11 @@
onSelectNewTaskRepo: (repoId: string) => void;
onMarkUnread: (id: string) => void;
onRenameTask: (id: string) => void;
- onRenameBranch: (id: string) => void;
onReorderRepositories: (fromIndex: number, toIndex: number) => void;
taskOrderByRepository: Record;
onReorderTasks: (repositoryId: string, fromIndex: number, toIndex: number) => void;
onReloadOrganization: () => void;
- onReloadPullRequests: () => void;
onReloadRepository: (repoId: string) => void;
- onReloadPullRequest: (repoId: string, prNumber: number) => void;
onToggleSidebar?: () => void;
}) {
const [css] = useStyletron();
@@ -446,16 +436,6 @@ export const Sidebar = memo(function Sidebar({
>
Reload organization
-
) : null}
{
if (node) {
@@ -667,15 +648,12 @@ export const Sidebar = memo(function Sidebar({
if (item.type === "task") {
const { repository, task, taskIndex } = item;
const isActive = task.id === activeId;
- const isPullRequestItem = isPullRequestSidebarItem(task);
const isRunning = task.sessions.some((s) => s.status === "running");
const isProvisioning =
- !isPullRequestItem &&
- ((String(task.status).startsWith("init_") && task.status !== "init_complete") ||
- task.status === "new" ||
- task.sessions.some((s) => s.status === "pending_provision" || s.status === "pending_session_create"));
+ (String(task.status).startsWith("init_") && task.status !== "init_complete") ||
+ task.sessions.some((s) => s.status === "pending_provision" || s.status === "pending_session_create");
const hasUnread = task.sessions.some((s) => s.unread);
- const isDraft = task.pullRequest == null || task.pullRequest.status === "draft";
+ const isDraft = task.pullRequest?.isDraft ?? true;
const totalAdded = task.fileChanges.reduce((sum, file) => sum + file.added, 0);
const totalRemoved = task.fileChanges.reduce((sum, file) => sum + file.removed, 0);
const hasDiffs = totalAdded > 0 || totalRemoved > 0;
@@ -686,6 +664,7 @@ export const Sidebar = memo(function Sidebar({
return (
{ @@ -720,18 +699,11 @@ export const Sidebar = memo(function Sidebar({
 onSelect(task.id)}
 onContextMenu={(event) => {
-  if (isPullRequestItem && task.pullRequest) {
-    contextMenu.open(event, [
-      { label: "Reload pull request", onClick: () => onReloadPullRequest(task.repoId, task.pullRequest!.number) },
-      { label: "Create task", onClick: () => onSelect(task.id) },
-    ]);
-    return;
-  }
-  contextMenu.open(event, [
+  const items = [
     { label: "Rename task", onClick: () => onRenameTask(task.id) },
-    { label: "Rename branch", onClick: () => onRenameBranch(task.id) },
     { label: "Mark as unread", onClick: () => onMarkUnread(task.id) },
-  ]);
+  ];
+  contextMenu.open(event, items);
 }}
 className={css({
   padding: "8px 12px",
@@ -756,11 +728,7 @@ export const Sidebar = memo(function Sidebar({
   flexShrink: 0,
 })}
 >
-{isPullRequestItem ? (
-) : (
-)
+
{task.title} - {isPullRequestItem && task.statusMessage ? ( - - {task.statusMessage} - - ) : null}
+ {task.primaryUserLogin ? ( + + {task.primaryUserLogin} + + ) : null} {task.pullRequest != null ? ( #{task.pullRequest.number} - {task.pullRequest.status === "draft" ? : null} + {task.pullRequest.isDraft ? : null} ) : ( @@ -814,6 +794,7 @@ export const Sidebar = memo(function Sidebar({ return (
{ @@ -851,6 +832,7 @@ export const Sidebar = memo(function Sidebar({ return (
{ if (node) {
diff --git a/foundry/packages/frontend/src/components/mock-layout/terminal-pane.tsx b/foundry/packages/frontend/src/components/mock-layout/terminal-pane.tsx
index 95e6876..10d74d7 100644
--- a/foundry/packages/frontend/src/components/mock-layout/terminal-pane.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/terminal-pane.tsx
@@ -305,7 +305,8 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC
     setProcessTabs([]);
   }, [taskId]);
 
-  const processes = processesState.data ?? [];
+  const processesData = processesState.data;
+  const processes = processesData ?? [];
 
   const openTerminalTab = useCallback((process: SandboxProcessRecord) => {
     setProcessTabs((current) => {
@@ -361,7 +362,7 @@ export function TerminalPane({ organizationId, taskId, isExpanded, onExpand, onC
   const activeProcessTab = activeSessionId ? (processTabsById.get(activeSessionId) ?? null) : null;
   const activeTerminalProcess = useMemo(
     () => (activeProcessTab ? (processes.find((process) => process.id === activeProcessTab.processId) ?? null) : null),
-    [activeProcessTab, processes],
+    [activeProcessTab, processesData],
   );
 
   const emptyBodyClassName = css({
diff --git a/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx b/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx
index a024871..16f87e6 100644
--- a/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/transcript-header.tsx
@@ -30,11 +30,11 @@ export const TranscriptHeader = memo(function TranscriptHeader({
   task: Task;
   hasSandbox: boolean;
   activeSession: AgentSession | null | undefined;
-  editingField: "title" | "branch" | null;
+  editingField: "title" | null;
   editValue: string;
   onEditValueChange: (value: string) => void;
-  onStartEditingField: (field: "title" | "branch", value: string) => void;
-  onCommitEditingField: (field: "title" | "branch") => void;
+  onStartEditingField: (field: "title", value: string) => void;
+  onCommitEditingField: (field: "title") => void;
   onCancelEditingField: () => void;
   onSetActiveSessionUnread: (unread: boolean) => void;
   sidebarCollapsed?: boolean;
@@ -49,10 +49,9 @@ export const TranscriptHeader = memo(function TranscriptHeader({
   const t = useFoundryTokens();
   const isDesktop = !!import.meta.env.VITE_DESKTOP;
   const needsTrafficLightInset = isDesktop && sidebarCollapsed;
-  const taskStatus = task.runtimeStatus ?? task.status;
   const headerStatus = useMemo(
-    () => deriveHeaderStatus(taskStatus, task.statusMessage ?? null, activeSession?.status ?? null, activeSession?.errorMessage ?? null, hasSandbox),
-    [taskStatus, task.statusMessage, activeSession?.status, activeSession?.errorMessage, hasSandbox],
+    () => deriveHeaderStatus(task.status, activeSession?.status ?? null, activeSession?.errorMessage ?? null, hasSandbox),
+    [task.status, activeSession?.status, activeSession?.errorMessage, hasSandbox],
   );
 
   return (
@@ -118,55 +117,20 @@ export const TranscriptHeader = memo(function TranscriptHeader({
   )}
   {task.branch ? (
-    editingField === "branch" ? (
-      onEditValueChange(event.target.value)}
-        onBlur={() => onCommitEditingField("branch")}
-        onKeyDown={(event) => {
-          if (event.key === "Enter") {
-            onCommitEditingField("branch");
-          } else if (event.key === "Escape") {
-            onCancelEditingField();
-          }
-        }}
-        className={css({
-          appearance: "none",
-          WebkitAppearance: "none",
-          margin: "0",
-          outline: "none",
-          padding: "2px 8px",
-          borderRadius: "999px",
-          border: `1px solid ${t.borderFocus}`,
-          backgroundColor: t.interactiveSubtle,
-          color: t.textPrimary,
-          fontSize: "11px",
-          whiteSpace: "nowrap",
-          fontFamily: '"IBM Plex Mono", monospace',
-          minWidth: "60px",
-        })}
-      />
-    ) : (
-      onStartEditingField("branch", task.branch ?? "")}
-        className={css({
-          padding: "2px 8px",
-          borderRadius: "999px",
-          border: `1px solid ${t.borderMedium}`,
-          backgroundColor: t.interactiveSubtle,
-          color: t.textPrimary,
-          fontSize: "11px",
-          whiteSpace: "nowrap",
-          fontFamily: '"IBM Plex Mono", monospace',
-          cursor: "pointer",
-          ":hover": { borderColor: t.borderFocus },
-        })}
-      >
-        {task.branch}
-      
-    )
+
+    {task.branch}
+
   ) : null}
diff --git a/foundry/packages/frontend/src/components/mock-layout/ui.tsx b/foundry/packages/frontend/src/components/mock-layout/ui.tsx
index d39a408..b86ca18 100644
--- a/foundry/packages/frontend/src/components/mock-layout/ui.tsx
+++ b/foundry/packages/frontend/src/components/mock-layout/ui.tsx
@@ -181,6 +181,8 @@ export const AgentIcon = memo(function AgentIcon({ agent, size = 14 }: { agent:
     return ;
   case "Cursor":
     return ;
+  default:
+    return ;
   }
 });
diff --git a/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts b/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts
index 21228fc..bc6ab87 100644
--- a/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts
+++ b/foundry/packages/frontend/src/components/mock-layout/view-model.test.ts
@@ -1,8 +1,8 @@
 import { describe, expect, it } from "vitest";
-import type { WorkbenchSession } from "@sandbox-agent/foundry-shared";
+import type { WorkspaceSession } from "@sandbox-agent/foundry-shared";
 import { buildDisplayMessages } from "./view-model";
 
-function makeSession(transcript: WorkbenchSession["transcript"]): WorkbenchSession {
+function makeSession(transcript: WorkspaceSession["transcript"]): WorkspaceSession {
   return {
     id: "session-1",
     sessionId: "session-1",
diff --git a/foundry/packages/frontend/src/components/mock-layout/view-model.ts b/foundry/packages/frontend/src/components/mock-layout/view-model.ts
index 83f5c7a..9232293 100644
--- a/foundry/packages/frontend/src/components/mock-layout/view-model.ts
+++ b/foundry/packages/frontend/src/components/mock-layout/view-model.ts
@@ -1,42 +1,28 @@
+import {
+  DEFAULT_WORKSPACE_MODEL_GROUPS as SharedModelGroups,
+  workspaceModelLabel as sharedWorkspaceModelLabel,
+  workspaceProviderAgent as sharedWorkspaceProviderAgent,
+} from "@sandbox-agent/foundry-shared";
 import type {
-  WorkbenchAgentKind as AgentKind,
-  WorkbenchSession as AgentSession,
-  WorkbenchDiffLineKind as DiffLineKind,
-  WorkbenchFileChange as FileChange,
-  WorkbenchFileTreeNode as FileTreeNode,
-  WorkbenchTask as Task,
-  WorkbenchHistoryEvent as HistoryEvent,
-  WorkbenchLineAttachment as LineAttachment,
-  WorkbenchModelGroup as ModelGroup,
-  WorkbenchModelId as ModelId,
-  WorkbenchParsedDiffLine as ParsedDiffLine,
-  WorkbenchRepositorySection as RepositorySection,
-  WorkbenchTranscriptEvent as TranscriptEvent,
+  WorkspaceAgentKind as AgentKind,
+  WorkspaceSession as AgentSession,
+  WorkspaceDiffLineKind as DiffLineKind,
+  WorkspaceFileChange as FileChange,
+  WorkspaceFileTreeNode as FileTreeNode,
+  WorkspaceTask as Task,
+  WorkspaceHistoryEvent as HistoryEvent,
+  WorkspaceLineAttachment as LineAttachment,
+  WorkspaceModelGroup as ModelGroup,
+  WorkspaceModelId as ModelId,
+  WorkspaceParsedDiffLine as ParsedDiffLine,
+  WorkspaceRepositorySection as RepositorySection,
+  WorkspaceTranscriptEvent as TranscriptEvent,
 } from "@sandbox-agent/foundry-shared";
 import { extractEventText } from "../../features/sessions/model";
 
 export type { RepositorySection };
 
-export const MODEL_GROUPS: ModelGroup[] = [
-  {
-    provider: "Claude",
-    models: [
-      { id: "claude-sonnet-4", label: "Sonnet 4" },
-      { id: "claude-opus-4", label: "Opus 4" },
-    ],
-  },
-  {
-    provider: "OpenAI",
-    models: [
-      { id: "gpt-5.3-codex", label: "GPT-5.3 Codex" },
-      { id: "gpt-5.4", label: "GPT-5.4" },
-      { id: "gpt-5.2-codex", label: "GPT-5.2 Codex" },
-      { id: "gpt-5.1-codex-max", label: "GPT-5.1 Codex Max" },
-      { id: "gpt-5.2", label: "GPT-5.2" },
-      { id: "gpt-5.1-codex-mini", label: "GPT-5.1 Codex Mini" },
-    ],
-  },
-];
+export const MODEL_GROUPS: ModelGroup[] = SharedModelGroups;
 
 export function formatRelativeAge(updatedAtMs: number, nowMs = Date.now()): string {
   const deltaSeconds = Math.max(0, Math.floor((nowMs - updatedAtMs) / 1000));
@@ -94,15 +80,11 @@ export function formatMessageDuration(durationMs: number): string {
 }
 
 export function modelLabel(id: ModelId): string {
-  const group = MODEL_GROUPS.find((candidate) => candidate.models.some((model) => model.id === id));
-  const model = group?.models.find((candidate) => candidate.id === id);
-  return model && group ? `${group.provider} ${model.label}` : id;
+  return sharedWorkspaceModelLabel(id, MODEL_GROUPS);
 }
 
 export function providerAgent(provider: string): AgentKind {
-  if (provider === "Claude") return "Claude";
-  if (provider === "OpenAI") return "Codex";
-  return "Cursor";
+  return sharedWorkspaceProviderAgent(provider);
 }
 
 const DIFF_PREFIX = "diff:";
diff --git a/foundry/packages/frontend/src/components/organization-dashboard.tsx b/foundry/packages/frontend/src/components/organization-dashboard.tsx
index 461ee90..4f54ac3 100644
--- a/foundry/packages/frontend/src/components/organization-dashboard.tsx
+++ b/foundry/packages/frontend/src/components/organization-dashboard.tsx
@@ -1,5 +1,5 @@
 import { useEffect, useMemo, useState, type ReactNode } from "react";
-import type { AgentType, RepoBranchRecord, RepoOverview, TaskWorkbenchSnapshot, WorkbenchTaskStatus } from "@sandbox-agent/foundry-shared";
+import type { RepoBranchRecord, RepoOverview, TaskWorkspaceSnapshot, WorkspaceTaskStatus } from "@sandbox-agent/foundry-shared";
 import { currentFoundryOrganization, useSubscription } from "@sandbox-agent/foundry-client";
 import { useMutation, useQuery } from "@tanstack/react-query";
 import { Link, useNavigate } from "@tanstack/react-router";
@@ -14,7 +14,6 @@ import { StyledDivider } from "baseui/divider";
 import { styled, useStyletron } from "baseui";
 import { HeadingSmall, HeadingXSmall, LabelSmall, LabelXSmall, MonoLabelSmall, ParagraphSmall } from "baseui/typography";
 import { Bot, CircleAlert, FolderGit2, GitBranch, MessageSquareText, SendHorizontal } from "lucide-react";
-import { formatDiffStat } from "../features/tasks/model";
 import { deriveHeaderStatus, describeTaskState } from "../features/tasks/status";
 import { HeaderStatusPill } from "./mock-layout/ui";
 import { buildTranscript, resolveSessionSelection } from "../features/sessions/model";
"../features/sessions/model";
@@ -95,25 +94,13 @@ const FILTER_OPTIONS: SelectItem[] = [
   { id: "all", label: "All Branches" },
 ];
 
-const AGENT_OPTIONS: SelectItem[] = [
-  { id: "codex", label: "codex" },
-  { id: "claude", label: "claude" },
-];
-
-function statusKind(status: WorkbenchTaskStatus): StatusTagKind {
+function statusKind(status: WorkspaceTaskStatus): StatusTagKind {
   if (status === "running") return "positive";
   if (status === "error") return "negative";
-  if (status === "new" || String(status).startsWith("init_")) return "warning";
+  if (String(status).startsWith("init_")) return "warning";
   return "neutral";
 }
 
-function normalizeAgent(agent: string | null): AgentType | undefined {
-  if (agent === "claude" || agent === "codex") {
-    return agent;
-  }
-  return undefined;
-}
-
 function formatTime(value: number): string {
   return new Date(value).toLocaleTimeString([], { hour: "2-digit", minute: "2-digit" });
 }
@@ -160,7 +147,7 @@ function repoSummary(overview: RepoOverview | undefined): {
     if (row.taskId) {
       mapped += 1;
     }
-    if (row.prNumber && row.prState !== "MERGED" && row.prState !== "CLOSED") {
+    if (row.pullRequest && row.pullRequest.state !== "MERGED" && row.pullRequest.state !== "CLOSED") {
       openPrs += 1;
     }
   }
@@ -174,15 +161,25 @@ }
 
 function branchKind(row: RepoBranchRecord): StatusTagKind {
-  if (row.prState === "OPEN" || row.prState === "DRAFT") {
+  if (row.pullRequest?.isDraft || row.pullRequest?.state === "OPEN") {
     return "warning";
   }
-  if (row.prState === "MERGED") {
+  if (row.pullRequest?.state === "MERGED") {
     return "positive";
   }
   return "neutral";
 }
 
+function branchPullRequestLabel(branch: RepoBranchRecord): string {
+  if (!branch.pullRequest) {
+    return "no pr";
+  }
+  if (branch.pullRequest.isDraft) {
+    return "draft";
+  }
+  return branch.pullRequest.state.toLowerCase();
+}
+
 function matchesOverviewFilter(branch: RepoBranchRecord, filter: RepoOverviewFilter): boolean {
   if (filter === "archived") {
     return branch.taskStatus === "archived";
@@ -332,23 +329,17 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
   const [createTaskOpen, setCreateTaskOpen] = useState(false);
   const [selectedOverviewBranch, setSelectedOverviewBranch] = useState(null);
   const [overviewFilter, setOverviewFilter] = useState("active");
-  const [newAgentType, setNewAgentType] = useState(() => {
-    try {
-      const raw = globalThis.localStorage?.getItem("hf.settings.agentType");
-      return raw === "claude" || raw === "codex" ? raw : "codex";
-    } catch {
-      return "codex";
-    }
-  });
   const [createError, setCreateError] = useState(null);
 
   const appState = useSubscription(subscriptionManager, "app", {});
   const activeOrg = appState.data ? currentFoundryOrganization(appState.data) : null;
   const organizationState = useSubscription(subscriptionManager, "organization", { organizationId });
-  const repos = organizationState.data?.repos ?? [];
-  const rows = organizationState.data?.taskSummaries ?? [];
-  const selectedSummary = useMemo(() => rows.find((row) => row.id === selectedTaskId) ?? rows[0] ?? null, [rows, selectedTaskId]);
+  const reposData = organizationState.data?.repos;
+  const rowsData = organizationState.data?.taskSummaries;
+  const repos = reposData ?? [];
+  const rows = rowsData ?? [];
+  const selectedSummary = useMemo(() => rows.find((row) => row.id === selectedTaskId) ?? rows[0] ?? null, [rowsData, selectedTaskId]);
 
   const taskState = useSubscription(
     subscriptionManager,
     "task",
@@ -374,6 +365,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
   });
 
   useEffect(() => {
+    const repos = reposData ?? [];
     if (repoOverviewMode && selectedRepoId) {
       setCreateRepoId(selectedRepoId);
       return;
@@ -381,17 +373,11 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
     if (!createRepoId && repos.length > 0) {
       setCreateRepoId(repos[0]!.id);
     }
-  }, [createRepoId, repoOverviewMode, repos, selectedRepoId]);
-
-  useEffect(() => {
-    try {
-      globalThis.localStorage?.setItem("hf.settings.agentType", newAgentType);
-    } catch {
-      // ignore storage failures
-    }
-  }, [newAgentType]);
+  }, [createRepoId, repoOverviewMode, reposData, selectedRepoId]);
 
   const repoGroups = useMemo(() => {
+    const repos = reposData ?? [];
+    const rows = rowsData ?? [];
     const byRepo = new Map();
     for (const row of rows) {
       const bucket = byRepo.get(row.repoId);
@@ -419,7 +405,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       }
       return a.repoLabel.localeCompare(b.repoLabel);
     });
-  }, [repos, rows]);
+  }, [reposData, rowsData]);
 
   const selectedForSession = repoOverviewMode ? null : (taskState.data ?? null);
@@ -432,6 +418,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
   }, [selectedForSession]);
 
   useEffect(() => {
+    const rows = rowsData ?? [];
     if (!repoOverviewMode && !selectedTaskId && rows.length > 0) {
       void navigate({
         to: "/organizations/$organizationId/tasks/$taskId",
@@ -443,18 +430,19 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
         replace: true,
       });
     }
-  }, [navigate, repoOverviewMode, rows, selectedTaskId, organizationId]);
+  }, [navigate, repoOverviewMode, rowsData, selectedTaskId, organizationId]);
 
   useEffect(() => {
     setActiveSessionId(null);
     setDraft("");
   }, [selectedForSession?.id]);
 
-  const sessionRows = selectedForSession?.sessionsSummary ?? [];
-  const taskRuntimeStatus = selectedForSession?.runtimeStatus ?? selectedForSession?.status ?? null;
-  const taskStatusState = describeTaskState(taskRuntimeStatus, selectedForSession?.statusMessage ?? null);
+  const sessionRowsData = selectedForSession?.sessionsSummary;
+  const sessionRows = sessionRowsData ?? [];
+  const taskStatus = selectedForSession?.status ?? null;
+  const taskStatusState = describeTaskState(taskStatus);
   const taskStateSummary = `${taskStatusState.title}. ${taskStatusState.detail}`;
-  const shouldUseTaskStateEmptyState = Boolean(selectedForSession && taskRuntimeStatus && taskRuntimeStatus !== "running" && taskRuntimeStatus !== "idle");
+  const shouldUseTaskStateEmptyState = Boolean(selectedForSession && taskStatus && taskStatus !== "running" && taskStatus !== "idle");
 
   const sessionSelection = useMemo(
     () =>
       resolveSessionSelection({
@@ -469,7 +457,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
           status: session.status,
         })),
       }),
-    [activeSessionId, selectedForSession?.activeSessionId, sessionRows],
+    [activeSessionId, selectedForSession?.activeSessionId, sessionRowsData],
   );
   const resolvedSessionId = sessionSelection.sessionId;
   const staleSessionId = sessionSelection.staleSessionId;
@@ -485,7 +473,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       }
     : null,
   );
-  const selectedSessionSummary = useMemo(() => sessionRows.find((session) => session.id === resolvedSessionId) ?? null, [resolvedSessionId, sessionRows]);
+  const selectedSessionSummary = useMemo(() => sessionRows.find((session) => session.id === resolvedSessionId) ?? null, [resolvedSessionId, sessionRowsData]);
   const isPendingProvision = selectedSessionSummary?.status === "pending_provision";
   const isPendingSessionCreate = selectedSessionSummary?.status === "pending_session_create";
   const isSessionError = selectedSessionSummary?.status === "error";
@@ -505,8 +493,6 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       repoId: task.repoId,
       title: task.title,
       status: task.status,
-      runtimeStatus: selectedForSession?.runtimeStatus ?? null,
-      statusMessage: selectedForSession?.statusMessage ?? null,
       branch: task.branch ?? null,
       activeSandboxId: selectedForSession?.activeSandboxId ?? null,
       activeSessionId: selectedForSession?.activeSessionId ?? null,
@@ -515,7 +501,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
     };
   }, [repoOverviewMode, selectedForSession, selectedSummary]);
   const devPanelSnapshot = useMemo(
-    (): TaskWorkbenchSnapshot => ({
+    (): TaskWorkspaceSnapshot => ({
       organizationId,
       repos: repos.map((repo) => ({ id: repo.id, label: repo.label })),
       repositories: [],
@@ -524,8 +510,6 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       repoId: task.repoId,
       title: task.title,
       status: task.status,
-      runtimeStatus: selectedForSession?.id === task.id ? selectedForSession.runtimeStatus : undefined,
-      statusMessage: selectedForSession?.id === task.id ? selectedForSession.statusMessage : null,
       repoName: task.repoName,
       updatedAtMs: task.updatedAtMs,
       branch: task.branch ?? null,
@@ -546,20 +530,21 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       activeSandboxId: selectedForSession?.id === task.id ? selectedForSession.activeSandboxId : null,
     })),
   }),
-    [repos, rows, selectedForSession, organizationId],
+    [reposData, rowsData, selectedForSession, organizationId],
   );
 
   const startSessionFromTask = async (): Promise<{ id: string; status: "running" | "idle" | "error" }> => {
     if (!selectedForSession || !activeSandbox?.sandboxId) {
       throw new Error("No sandbox is available for this task");
     }
+    const preferredAgent = selectedSessionSummary?.agent === "Claude" ? "claude" : selectedSessionSummary?.agent === "Codex" ? "codex" : undefined;
     return backendClient.createSandboxSession({
       organizationId,
       sandboxProviderId: activeSandbox.sandboxProviderId,
       sandboxId: activeSandbox.sandboxId,
       prompt: selectedForSession.task,
      cwd: activeSandbox.cwd ?? undefined,
-      agent: normalizeAgent(selectedForSession.agentType),
+      agent: preferredAgent,
     });
   };
@@ -616,7 +601,6 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
       organizationId,
       repoId,
       task,
-      agentType: newAgentType,
       explicitTitle: draftTitle || undefined,
       explicitBranchName: createOnBranch ? undefined : draftBranchName || undefined,
       onBranch: createOnBranch ?? undefined,
@@ -654,16 +638,15 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
     setCreateTaskOpen(true);
   };
 
-  const repoOptions = useMemo(() => repos.map((repo) => createOption({ id: repo.id, label: repo.label })), [repos]);
+  const repoOptions = useMemo(() => repos.map((repo) => createOption({ id: repo.id, label: repo.label })), [reposData]);
   const selectedRepoOption = repoOptions.find((option) => option.id === createRepoId) ?? null;
-  const selectedAgentOption = useMemo(() => createOption(AGENT_OPTIONS.find((option) => option.id === newAgentType) ?? AGENT_OPTIONS[0]!), [newAgentType]);
   const selectedFilterOption = useMemo(
     () => createOption(FILTER_OPTIONS.find((option) => option.id === overviewFilter) ?? FILTER_OPTIONS[0]!),
     [overviewFilter],
   );
   const sessionOptions = useMemo(
     () => sessionRows.map((session) => createOption({ id: session.id, label: `${session.sessionName} (${session.status})` })),
-    [sessionRows],
+    [sessionRowsData],
   );
   const selectedSessionOption = sessionOptions.find((option) => option.id === resolvedSessionId) ?? null;
@@ -1057,23 +1040,23 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
{branch.taskTitle ?? branch.taskId ?? "-"}
-{branch.ciStatus ?? "-"} / {branch.reviewStatus ?? "-"}
+{branch.ciStatus ?? "-"} / {branch.pullRequest ? (branch.pullRequest.isDraft ? "draft" : "ready") : "-"}
{formatRelativeAge(branch.updatedAt)}
@@ -1098,7 +1081,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
 ) : null}
-{branch.prState?.toLowerCase() ?? "no pr"}
+{branchPullRequestLabel(branch)}
@@ -1137,8 +1120,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
 {selectedForSession ? (
 {shouldUseTaskStateEmptyState
   ? taskStateSummary
-  : (selectedForSession?.statusMessage ??
-    (isPendingProvision ? "The task is still provisioning." : "The session is being created."))}
+  : isPendingProvision
+    ? "The task is still provisioning."
+    : "The session is being created."}
 ) : null}
@@ -1277,15 +1260,13 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
 {shouldUseTaskStateEmptyState
   ? taskStateSummary
   : isPendingProvision
-    ? (selectedForSession.statusMessage ?? "Provisioning sandbox...")
+    ? "Provisioning sandbox..."
     : isPendingSessionCreate
       ? "Creating session..."
       : isSessionError
         ? (selectedSessionSummary?.errorMessage ?? "Session failed to start.")
         : !activeSandbox?.sandboxId
-          ? selectedForSession.statusMessage
-            ? `Sandbox unavailable: ${selectedForSession.statusMessage}`
-            : "This task is still provisioning its sandbox."
+          ? "This task is still provisioning its sandbox."
          : staleSessionId
            ? `Session ${staleSessionId} is unavailable. Start a new session to continue.`
            : resolvedSessionId
@@ -1458,7 +1439,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
-
+
)} @@ -1483,7 +1464,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected gap: theme.sizing.scale300, })} > - + @@ -1504,9 +1485,8 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected })} > - - - + +
@@ -1529,7 +1509,7 @@ export function OrganizationDashboard({ organizationId, selectedTaskId, selected
- {taskRuntimeStatus === "error" ? ( + {taskStatus === "error" ? (
-
- - Agent - - setScreenshotFormat(event.target.value as "png" | "jpeg" | "webp")} + > + + + + +
+ {screenshotFormat !== "png" && ( +
+ + setScreenshotQuality(event.target.value)} + inputMode="numeric" + style={{ maxWidth: 60 }} + /> +
+ )} +
+ + setScreenshotScale(event.target.value)} + inputMode="decimal" + style={{ maxWidth: 60 }} + /> +
+ +
+ )} + {error &&
{error}
} + {screenshotError &&
{screenshotError}
} + {/* ========== Runtime Section ========== */} +
+
+ + + Desktop Runtime + + + {status?.state ?? "unknown"} + +
+
+
+
Display
+
{status?.display ?? "Not assigned"}
+
+
+
Resolution
+
{resolutionLabel}
+
+
+
Started
+
{formatStartedAt(status?.startedAt)}
+
+
+
+
+ + setWidth(event.target.value)} inputMode="numeric" /> +
+
+ + setHeight(event.target.value)} inputMode="numeric" /> +
+
+ + setDpi(event.target.value)} inputMode="numeric" /> +
+
+ + {showAdvancedStart && ( +
+
+ + +
+
+ + +
+
+ + setStreamFrameRate(event.target.value)} + inputMode="numeric" + disabled={isActive} + /> +
+
+ + setWebrtcPortRange(event.target.value)} disabled={isActive} /> +
+
+ + setDefaultRecordingFps(event.target.value)} + inputMode="numeric" + disabled={isActive} + /> +
+
+ )} +
+ {isActive ? ( + + ) : ( + + )} +
+
+ {/* ========== Missing Dependencies ========== */} + {status?.missingDependencies && status.missingDependencies.length > 0 && ( +
+
+ Missing Dependencies +
+
+ {status.missingDependencies.map((dependency) => ( + + {dependency} + + ))} +
+ {status.installCommand && ( + <> +
+ Install command +
+
{status.installCommand}
+ + )} +
+ )} + {/* ========== Live View Section ========== */} +
+
+ + + {isActive && ( + + )} +
+ {liveViewError && ( +
+ {liveViewError} +
+ )} + {!isActive &&
Start the desktop runtime to enable live view.
} + {isActive && liveViewActive && ( + <> +
+ Right click to open window + {status?.resolution && ( + + {status.resolution.width}x{status.resolution.height} + + )} +
+ + + )} + {isActive && !liveViewActive && ( + <> + {screenshotUrl ? ( +
+ Desktop screenshot +
+ ) : ( +
Click "Start Stream" for live desktop view, or use the Screenshot button above.
+ )} + + )} + {isActive && ( +
+ + {mousePos && ( + + ({mousePos.x}, {mousePos.y}) + + )} +
+ )} +
+ {isActive && ( +
+
+ + + Clipboard + +
+ + +
+
+ {clipboardError && ( +
+ {clipboardError} +
+ )} +
+
Current contents
+
+              {clipboardText ? clipboardText : (empty)}
+            
+
+
+
Write to clipboard
+