Refactor Foundry GitHub and sandbox flows

Nathan Flurry 2026-03-12 10:51:33 -07:00
parent 4bccd5fc8d
commit ec8e816d0d
112 changed files with 4026 additions and 2715 deletions

View file

@@ -102,7 +102,7 @@ const run = async (cmd: string, options?: { background?: boolean }) => {
};
// Install sandbox-agent
await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh");
await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh");
// Install agents conditionally based on available API keys
if (envs.ANTHROPIC_API_KEY) {

View file

@@ -91,7 +91,7 @@ export async function setupComputeSdkSandboxAgent(): Promise<{
};
console.log("Installing sandbox-agent...");
await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/latest/install.sh | sh");
await run("curl -fsSL https://releases.rivet.dev/sandbox-agent/0.3.x/install.sh | sh");
if (env.ANTHROPIC_API_KEY) {
console.log("Installing Claude agent...");

View file

@@ -1,5 +1,5 @@
FROM node:22-bookworm-slim
RUN apt-get update -qq && apt-get install -y -qq --no-install-recommends ca-certificates > /dev/null 2>&1 && \
rm -rf /var/lib/apt/lists/* && \
npm install -g --silent @sandbox-agent/cli@latest && \
npm install -g --silent @sandbox-agent/cli@0.3.x && \
sandbox-agent install-agent claude

View file

@@ -39,7 +39,8 @@ Use `pnpm` workspaces and Turborepo.
- Start the local production-build preview stack: `just foundry-preview`
- Start only the backend locally: `just foundry-backend-start`
- Start only the frontend locally: `pnpm --filter @sandbox-agent/foundry-frontend dev`
- Start the frontend against the mock workbench client: `FOUNDRY_FRONTEND_CLIENT_MODE=mock pnpm --filter @sandbox-agent/foundry-frontend dev`
- Start the frontend against the mock workbench client on a separate port: `FOUNDRY_FRONTEND_CLIENT_MODE=mock pnpm --filter @sandbox-agent/foundry-frontend dev -- --port 4180`
- Intentionally keep the real frontend on `4173` and the mock frontend on `4180` so both can run in parallel against the same real backend during UI testing.
- Stop the compose dev stack: `just foundry-dev-down`
- Tail compose logs: `just foundry-dev-logs`
- Stop the preview stack: `just foundry-preview-down`
@@ -66,6 +67,14 @@ Use `pnpm` workspaces and Turborepo.
- Use Bun for CLI/backend execution paths and process spawning.
- Do not add Node compatibility fallbacks for OpenTUI/runtime execution.
## Sandbox Runtime Ownership
- For Daytona sandboxes, `ENTRYPOINT`/`CMD` does not reliably hand PID 1 to `sandbox-agent server`. Start `sandbox-agent server` after sandbox creation via Daytona's native process API, then route normal runtime commands through sandbox-agent.
- For Daytona sandboxes, use sandbox-agent process APIs (`/v1/processes/run` or the equivalent SDK surface) for clone, git, and runtime task commands after the server is up. Native Daytona process execution is only for the one-time server bootstrap plus lifecycle/control-plane calls.
- Native Daytona calls are otherwise limited to sandbox lifecycle/control-plane operations such as create/get/start/stop/delete and preview endpoint lookup.
- If a sandbox fails to start, inspect the provider API first. For Daytona, check the Daytona API/build logs and preview endpoint health before assuming the bug is in task/workbench code. Apply the same rule to any non-Daytona provider by checking the underlying sandbox API directly.
- Task UI must surface startup state clearly. While a sandbox/session is still booting, show the current task phase and status message; if startup fails, show the error directly in the task UI instead of leaving the user at a generic loading/empty state.
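A minimal sketch of the command-routing rule above. Only the `/v1/processes/run` path comes from these rules; the request body shape and helper names are assumptions for illustration, not the documented sandbox-agent API.

```typescript
// Build the request for sandbox-agent's process API. The body shape here is
// an assumption; only the /v1/processes/run path comes from the rules above.
interface RunProcessRequest {
  url: string;
  init: { method: "POST"; headers: Record<string, string>; body: string };
}

function buildRunProcessRequest(previewUrl: string, command: string[]): RunProcessRequest {
  return {
    url: `${previewUrl.replace(/\/$/, "")}/v1/processes/run`,
    init: {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ command }),
    },
  };
}

// After the one-time Daytona-native bootstrap starts `sandbox-agent server`,
// clone/git/runtime task commands go through this surface, not native exec:
async function runInSandbox(previewUrl: string, command: string[]): Promise<Response> {
  const { url, init } = buildRunProcessRequest(previewUrl, command);
  return fetch(url, init);
}
```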
## Defensive Error Handling
- Write code defensively: validate assumptions at boundaries and state transitions.
@@ -85,6 +94,7 @@ For all Rivet/RivetKit implementation:
- Example: the `task` actor instance already represents `(workspaceId, repoId, taskId)`, so its SQLite tables should not need those columns for primary keys.
3. Do not use backend-global SQLite singletons; database access must go through actor `db` providers (`c.db`).
4. The default dependency source for RivetKit is the published `rivetkit` package so workspace installs and CI remain self-contained.
- Current coordinated build for this branch: `https://pkg.pr.new/rivet-dev/rivet/rivetkit@4409`
5. When working on coordinated RivetKit changes, you may temporarily relink to a local checkout instead of the published package.
- Dedicated local checkout for this workspace: `/Users/nathan/conductor/workspaces/task/rivet-checkout`
- Preferred local link target: `../rivet-checkout/rivetkit-typescript/packages/rivetkit`
@@ -142,6 +152,10 @@ For all Rivet/RivetKit implementation:
- Keep strict single-writer ownership: each table/row has exactly one actor writer.
- Parent actors (`workspace`, `project`, `task`, `history`, `sandbox-instance`) use command-only loops with no timeout.
- Periodic syncing lives in dedicated child actors with one timeout cadence each.
- Prefer event-driven actor coordination over synchronous actor-to-actor waiting. Inside an actor, enqueue downstream work and continue unless the current actor truly needs the finished child result to complete its own local mutation safely.
- When publishing to actor queues, prefer `wait: false`. Waiting on queue responses inside actors should be the exception, not the default.
- Coordinator actors must not block on child actor provisioning, sync, webhook fanout, or other long-running remote work. Commit local durable state first, then let child actors advance the flow asynchronously.
- Workflow handlers should be decomposed into narrow durable steps. Each mutation or externally meaningful transition should be its own step; do not hide multi-phase cross-actor flows inside one monolithic workflow step.
- Actor handle policy:
- Prefer explicit `get` or explicit `create` based on workflow intent; do not default to `getOrCreate`.
- Use `get`/`getForId` when the actor is expected to already exist; if missing, surface an explicit `Actor not found` error with recovery context.
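An illustrative in-memory model of this handle policy (not the real RivetKit client API): `get` fails loudly with an explicit `Actor not found` error and recovery context instead of silently creating the actor.

```typescript
// Toy registry demonstrating explicit get vs explicit create. Names and
// shapes are illustrative, not the RivetKit client surface.
class ActorRegistry<T> {
  private actors = new Map<string, T>();

  // Use when workflow intent is to make a new actor; duplicates are an error.
  create(id: string, init: T): T {
    if (this.actors.has(id)) throw new Error(`Actor already exists: ${id}`);
    this.actors.set(id, init);
    return init;
  }

  // Use when the actor is expected to already exist.
  get(id: string): T {
    const actor = this.actors.get(id);
    if (!actor) {
      throw new Error(`Actor not found: ${id} (create it via the owning workflow before reading)`);
    }
    return actor;
  }
}
```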
@@ -157,17 +171,38 @@ For all Rivet/RivetKit implementation:
- Put simple metadata in `c.state` (KV state): small scalars and identifiers like `{ taskId }`, `{ repoId }`, booleans, counters, timestamps, status strings.
- If it grows beyond trivial (arrays, maps, histories, query/filter needs, relational consistency), use SQLite + Drizzle in `c.db`.
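A sketch of this split, using the same Drizzle helpers as the schema files elsewhere in this commit; the table and column names here are illustrative.

```typescript
// Trivial metadata stays in c.state (KV state), e.g.:
//   c.state = { taskId: "t_123", repoId: "r_456", archived: false }
// Growing, queryable data gets a Drizzle table in c.db instead:
import { integer, sqliteTable, text } from "rivetkit/db/drizzle";

// Session history grows without bound and needs filtering, so it belongs in
// SQLite, not KV state. Table/column names are illustrative.
export const sessionHistory = sqliteTable("session_history", {
  entryId: text("entry_id").notNull().primaryKey(),
  sessionId: text("session_id").notNull(),
  message: text("message").notNull(),
  createdAt: integer("created_at").notNull(),
});
```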
## GitHub Ownership
- Foundry is multiplayer. Every signed-in user has their own GitHub account and their own app session state.
- Per-user GitHub identity/auth belongs in a dedicated user-scoped actor, not in organization state.
- Keep a single GitHub source-of-truth actor per organization. It is the only actor allowed to receive GitHub webhooks, call the GitHub API, persist GitHub repositories/members/pull requests, and dispatch GitHub-derived updates to the rest of the actor tree.
- Repository/task/history actors must consume GitHub-derived state from the organization GitHub actor; they must not maintain their own GitHub caches.
- Organization grouping is managed by the GitHub organization structure. Do not introduce a second internal grouping model that can diverge from GitHub.
- For workflow-backed actors, install a workflow `onError` hook and report failures into organization-scoped runtime issue state so the frontend can surface actor/workflow errors without querying the entire actor tree live.
- The main workspace top bar should make organization runtime errors obvious. If actor/workflow errors exist, show them there and include detailed issue state in settings.
## Testing Policy
- Never use vitest mocks (`vi.mock`, `vi.spyOn`, `vi.fn`). Instead, define driver interfaces for external I/O and pass test implementations via the actor runtime context.
- All external service calls (git CLI, GitHub CLI, sandbox-agent HTTP, tmux) must go through the `BackendDriver` interface on the runtime context.
- Integration tests use `setupTest()` from `rivetkit/test` and are gated behind `HF_ENABLE_ACTOR_INTEGRATION_TESTS=1`.
- The canonical "main user flow" for large Foundry changes must be exercised in the live product with `agent-browser`, and screenshots from the full flow should be returned to the user.
- Sign in.
- Create a task.
- Prompt the agent to make a change.
- Create a pull request for the change.
- Prompt another change.
- Push that change.
- Merge the PR.
- Confirm the task is finished and its status is updated correctly.
- During this flow, verify that remote GitHub state updates correctly and that Foundry receives and applies the resulting webhook-driven state updates.
- End-to-end testing must run against the dev backend started via `docker compose -f compose.dev.yaml up` (host -> container). Do not run E2E against an in-process test runtime.
- E2E tests should talk to the backend over HTTP (default `http://127.0.0.1:7741/api/rivet`) and use real GitHub repos/PRs.
- For Foundry live verification, use `rivet-dev/sandbox-agent-testing` as the default testing repo unless the task explicitly says otherwise.
- Secrets (e.g. `OPENAI_API_KEY`, `GITHUB_TOKEN`/`GH_TOKEN`) must be provided via environment variables, never hardcoded in the repo.
- `~/misc/env.txt` and `~/misc/the-foundry.env` contain the expected local OpenAI + GitHub OAuth/App config for dev.
- Do not assume `gh auth token` is sufficient for Foundry task provisioning against private repos. Sandbox/bootstrap git clone, push, and PR flows require a repo-capable `GITHUB_TOKEN`/`GH_TOKEN` in the backend container.
- If browser GitHub OAuth suddenly fails with symptoms like `GitHub OAuth is not configured` while other GitHub flows seem to work, first check whether the backend is relying on a `GITHUB_TOKEN` override instead of the OAuth/App env from `~/misc/env.txt` and `~/misc/the-foundry.env`. In local dev, clear `GITHUB_TOKEN`/`GH_TOKEN`, source those env files, and recreate the backend container; `docker restart` is not enough.
- Preferred product behavior for org workspaces is to mint a GitHub App installation token from the workspace installation and inject it into backend/sandbox git operations. Do not rely on an operator's ambient CLI auth as the long-term solution.
- Treat client E2E tests in `packages/client/test` as the primary end-to-end source of truth for product behavior.
- Keep backend tests small and targeted. Only retain backend-only tests for invariants or persistence rules that are not well-covered through client E2E.
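A sketch of the driver pattern the no-mock rule implies: external I/O sits behind an interface, and tests pass a plain implementation through the runtime context. The interface shape here is illustrative; the real `BackendDriver` lives in the backend package.

```typescript
// External I/O goes through a driver interface instead of vi.mock.
interface BackendDriver {
  runGit(args: string[]): Promise<string>;
}

// Production code depends only on the interface.
async function currentBranch(driver: BackendDriver): Promise<string> {
  return (await driver.runGit(["rev-parse", "--abbrev-ref", "HEAD"])).trim();
}

// Test implementation: a plain object, no mocking framework involved.
const fakeDriver: BackendDriver = {
  async runGit(args) {
    if (args[0] === "rev-parse") return "feature/github-refactor\n";
    throw new Error(`unexpected git call: ${args.join(" ")}`);
  },
};
```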

View file

@@ -19,9 +19,7 @@ services:
OPENAI_API_KEY: "${OPENAI_API_KEY:-}"
# sandbox-agent codex plugin currently expects CODEX_API_KEY. Map from OPENAI_API_KEY for convenience.
CODEX_API_KEY: "${CODEX_API_KEY:-${OPENAI_API_KEY:-}}"
# Support either GITHUB_TOKEN or GITHUB_PAT in local env files.
GITHUB_TOKEN: "${GITHUB_TOKEN:-${GITHUB_PAT:-}}"
GH_TOKEN: "${GH_TOKEN:-${GITHUB_TOKEN:-${GITHUB_PAT:-}}}"
GITHUB_TOKEN: "${GITHUB_TOKEN:-}"
APP_URL: "${APP_URL:-}"
BETTER_AUTH_URL: "${BETTER_AUTH_URL:-}"
BETTER_AUTH_SECRET: "${BETTER_AUTH_SECRET:-}"
@@ -74,7 +72,32 @@ services:
HOME: "/tmp"
HF_BACKEND_HTTP: "http://backend:7741"
ports:
- "4173:4173"
- "${FOUNDRY_FRONTEND_PORT:-4173}:4173"
volumes:
- "..:/app"
# Ensure logs in .foundry/ persist on the host even if we change source mounts later.
- "./.foundry:/app/foundry/.foundry"
- "../../../task/rivet-checkout:/task/rivet-checkout:ro"
# Use Linux-native workspace dependencies inside the container instead of host node_modules.
- "foundry_node_modules:/app/node_modules"
- "foundry_client_node_modules:/app/foundry/packages/client/node_modules"
- "foundry_frontend_errors_node_modules:/app/foundry/packages/frontend-errors/node_modules"
- "foundry_frontend_node_modules:/app/foundry/packages/frontend/node_modules"
- "foundry_shared_node_modules:/app/foundry/packages/shared/node_modules"
- "foundry_pnpm_store:/tmp/.local/share/pnpm/store"
frontend-mock:
build:
context: ..
dockerfile: foundry/docker/frontend.dev.Dockerfile
working_dir: /app
depends_on:
- backend
environment:
HOME: "/tmp"
FOUNDRY_FRONTEND_CLIENT_MODE: "mock"
ports:
- "${FOUNDRY_FRONTEND_MOCK_PORT:-4180}:4173"
volumes:
- "..:/app"
# Ensure logs in .foundry/ persist on the host even if we change source mounts later.

View file

@@ -39,4 +39,4 @@ ENV SANDBOX_AGENT_BIN="/root/.local/bin/sandbox-agent"
WORKDIR /app
CMD ["bash", "-lc", "git config --global --add safe.directory /app >/dev/null 2>&1 || true; pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-backend... && exec bun foundry/packages/backend/src/index.ts start --host 0.0.0.0 --port 7741"]
CMD ["bash", "-lc", "git config --global --add safe.directory /app >/dev/null 2>&1 || true; pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-backend... && pnpm --filter acp-http-client build && pnpm --filter @sandbox-agent/cli-shared build && mkdir -p /app/sdks/typescript/dist && printf 'export * from \"../src/index.ts\";\\n' > /app/sdks/typescript/dist/index.js && printf 'export * from \"../src/index.ts\";\\n' > /app/sdks/typescript/dist/index.d.ts && exec bun foundry/packages/backend/src/index.ts start --host 0.0.0.0 --port 7741"]

View file

@@ -8,4 +8,4 @@ RUN npm install -g pnpm@10.28.2
WORKDIR /app
CMD ["bash", "-lc", "pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-frontend... && cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"]
CMD ["bash", "-lc", "pnpm install --force --frozen-lockfile --filter @sandbox-agent/foundry-frontend... && SKIP_OPENAPI_GEN=1 pnpm --filter sandbox-agent build && pnpm --filter @sandbox-agent/react build && pnpm --filter @sandbox-agent/foundry-shared build && pnpm --filter @sandbox-agent/foundry-client build && pnpm --filter @sandbox-agent/foundry-frontend-errors build && cd foundry/packages/frontend && exec pnpm vite --host 0.0.0.0 --port 4173"]

View file

@@ -5,32 +5,52 @@
Keep the backend actor tree aligned with this shape unless we explicitly decide to change it:
```text
WorkspaceActor
├─ HistoryActor(workspace-scoped global feed)
├─ ProjectActor(repo)
│ ├─ ProjectBranchSyncActor
│ ├─ ProjectPrSyncActor
OrganizationActor
├─ GitHubStateActor(org-scoped GitHub source of truth)
├─ RepositoryActor(repo)
│ └─ TaskActor(task)
│ ├─ TaskSessionActor(session) × N
│ │ └─ SessionStatusSyncActor(session) × 0..1
│ └─ Task-local workbench state
└─ SandboxInstanceActor(providerId, sandboxId) × N
AppShellOrganization("app")
└─ UserGitHubDataActor(user-scoped GitHub auth/identity) × N
```
## Ownership Rules
- `WorkspaceActor` is the workspace coordinator and lookup/index owner.
- `HistoryActor` is workspace-scoped. There is one workspace-level history feed.
- `ProjectActor` is the repo coordinator and owns repo-local caches/indexes.
- `OrganizationActor` is the organization coordinator and lookup/index owner.
- `HistoryActor` is repository-scoped.
- `RepositoryActor` is the repo coordinator and owns repo-local indexes.
- `TaskActor` is one branch. Treat `1 task = 1 branch` once branch assignment is finalized.
- `TaskActor` can have many sessions.
- `TaskActor` can reference many sandbox instances historically, but should have only one active sandbox/session at a time.
- Session unread state and draft prompts are backend-owned workbench state, not frontend-local state.
- Branch rename is a real git operation, not just metadata.
- `SandboxInstanceActor` stays separate from `TaskActor`; tasks/sessions reference it by identity.
- Sync actors are polling workers only. They feed parent actors and should not become the source of truth.
- `GitHubStateActor` is the only actor allowed to receive GitHub webhooks, call the GitHub API, persist GitHub repository/member/pull-request data, and dispatch GitHub-derived updates to the rest of the actor tree.
- `UserGitHubDataActor` is user-scoped, not organization-scoped. Store per-user GitHub identity and auth there, not in organization state.
- Foundry is multiplayer. Each signed-in user has their own GitHub account, their own app session, and their own `UserGitHubDataActor`.
- Organization grouping comes from GitHub organizations. Do not invent a parallel non-GitHub organization grouping model inside Foundry state.
- Do not add repo-level GitHub caches such as `pr_cache`; repositories must read remote pull-request state from `GitHubStateActor`.
- Prefer event-driven actor coordination. If an actor is telling another actor to do work, default to enqueueing that work and continuing rather than waiting synchronously for the child actor to finish.
- Queue publishes inside actors should usually use `wait: false`. Only wait for a queue response when the current actor cannot safely commit its own local mutation without the completed child result.
- Coordinator actors must not block on downstream provisioning, sync, or other long-running child actor work.
- Workflow handlers should be decomposed into small durable steps. Each local mutation or externally meaningful transition gets its own step; avoid monolithic workflow steps that bundle an entire cross-actor flow together.
- Every actor that uses `workflow(...)` must install an `onError` hook and report normalized workflow failures into organization-scoped runtime issue state.
- Organization runtime issue state is the backend source of truth for actor/workflow error badges in the frontend top bar and settings screens.
## Maintenance
- Keep this file up to date whenever actor ownership, hierarchy, or lifecycle responsibilities change.
- If the real actor tree diverges from this document, update this document in the same change.
## Daytona Provider Rules
- Daytona sandbox lifecycle uses native Daytona control-plane operations only: create, get, start, stop, delete, and preview endpoint lookup.
- Once a Daytona sandbox exists, the backend must treat sandbox-agent as the runtime surface. Run in-sandbox commands through sandbox-agent process APIs, not Daytona native process execution.
- The Daytona snapshot image must fail fast if `sandbox-agent` or agent installation fails. Do not hide install failures with `|| true`.
- Daytona does not reliably replace PID 1 with the image `ENTRYPOINT`/`CMD`. Start `sandbox-agent server` after sandbox creation via Daytona's native process API, then use sandbox-agent for all normal runtime commands.
- If sandbox startup fails, inspect the provider API and image/build logs first. For Daytona, confirm the snapshot image builds, the preview endpoint comes up, and `/v1/health` responds before chasing task/workbench code paths.
- Task/workbench payloads must include enough startup detail for the frontend to show the current provisioning phase and any startup error message.
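A sketch of the startup check, assuming only that the preview endpoint exposes `/v1/health` as stated above; the helper name and retry policy are illustrative.

```typescript
// Poll the sandbox-agent health endpoint before debugging task/workbench
// code. Only the /v1/health path comes from the rules above.
async function waitForSandboxHealth(
  previewUrl: string,
  fetchFn: typeof fetch = fetch,
  attempts = 10,
  delayMs = 1000,
): Promise<boolean> {
  const url = `${previewUrl.replace(/\/$/, "")}/v1/health`;
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetchFn(url);
      if (res.ok) return true;
    } catch {
      // Server not up yet; keep polling.
    }
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false;
}
```

Injecting `fetchFn` keeps the helper testable without network access, matching the driver-interface testing policy.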

View file

@@ -22,7 +22,7 @@
"drizzle-orm": "^0.44.5",
"hono": "^4.11.9",
"pino": "^10.3.1",
"rivetkit": "2.1.6",
"rivetkit": "https://pkg.pr.new/rivet-dev/rivet/rivetkit@4409",
"sandbox-agent": "workspace:*",
"uuid": "^13.0.0",
"zod": "^4.1.5"

View file

@@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle";
import * as schema from "./schema.js";
import migrations from "./migrations.js";
export const projectDb = db({ schema, migrations });
export const githubStateDb = db({ schema, migrations });

View file

@@ -0,0 +1,58 @@
const journal = {
entries: [
{
idx: 0,
when: 1773273600000,
tag: "0000_github_state",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`github_meta\` (
\`id\` integer PRIMARY KEY NOT NULL,
\`connected_account\` text NOT NULL,
\`installation_status\` text NOT NULL,
\`sync_status\` text NOT NULL,
\`installation_id\` integer,
\`last_sync_label\` text NOT NULL,
\`last_sync_at\` integer,
\`updated_at\` integer NOT NULL
);
CREATE TABLE \`github_repositories\` (
\`repo_id\` text PRIMARY KEY NOT NULL,
\`full_name\` text NOT NULL,
\`clone_url\` text NOT NULL,
\`private\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
CREATE TABLE \`github_members\` (
\`member_id\` text PRIMARY KEY NOT NULL,
\`login\` text NOT NULL,
\`display_name\` text NOT NULL,
\`email\` text,
\`role\` text,
\`state\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
CREATE TABLE \`github_pull_requests\` (
\`pr_id\` text PRIMARY KEY NOT NULL,
\`repo_id\` text NOT NULL,
\`repo_full_name\` text NOT NULL,
\`number\` integer NOT NULL,
\`title\` text NOT NULL,
\`body\` text,
\`state\` text NOT NULL,
\`url\` text NOT NULL,
\`head_ref_name\` text NOT NULL,
\`base_ref_name\` text NOT NULL,
\`author_login\` text,
\`is_draft\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
} as const,
};

View file

@@ -0,0 +1,46 @@
import { integer, sqliteTable, text } from "rivetkit/db/drizzle";
export const githubMeta = sqliteTable("github_meta", {
id: integer("id").primaryKey(),
connectedAccount: text("connected_account").notNull(),
installationStatus: text("installation_status").notNull(),
syncStatus: text("sync_status").notNull(),
installationId: integer("installation_id"),
lastSyncLabel: text("last_sync_label").notNull(),
lastSyncAt: integer("last_sync_at"),
updatedAt: integer("updated_at").notNull(),
});
export const githubRepositories = sqliteTable("github_repositories", {
repoId: text("repo_id").notNull().primaryKey(),
fullName: text("full_name").notNull(),
cloneUrl: text("clone_url").notNull(),
private: integer("private").notNull(),
updatedAt: integer("updated_at").notNull(),
});
export const githubMembers = sqliteTable("github_members", {
memberId: text("member_id").notNull().primaryKey(),
login: text("login").notNull(),
displayName: text("display_name").notNull(),
email: text("email"),
role: text("role"),
state: text("state").notNull(),
updatedAt: integer("updated_at").notNull(),
});
export const githubPullRequests = sqliteTable("github_pull_requests", {
prId: text("pr_id").notNull().primaryKey(),
repoId: text("repo_id").notNull(),
repoFullName: text("repo_full_name").notNull(),
number: integer("number").notNull(),
title: text("title").notNull(),
body: text("body"),
state: text("state").notNull(),
url: text("url").notNull(),
headRefName: text("head_ref_name").notNull(),
baseRefName: text("base_ref_name").notNull(),
authorLogin: text("author_login"),
isDraft: integer("is_draft").notNull(),
updatedAt: integer("updated_at").notNull(),
});

View file

@@ -0,0 +1,649 @@
// @ts-nocheck
import { eq } from "drizzle-orm";
import { actor } from "rivetkit";
import type { FoundryGithubInstallationStatus, FoundryGithubSyncStatus } from "@sandbox-agent/foundry-shared";
import { repoIdFromRemote } from "../../services/repo.js";
import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
import { getActorRuntimeContext } from "../context.js";
import { getOrCreateOrganization, getOrCreateRepository, selfGithubState } from "../handles.js";
import { githubStateDb } from "./db/db.js";
import { githubMembers, githubMeta, githubPullRequests, githubRepositories } from "./db/schema.js";
const META_ROW_ID = 1;
interface GithubStateInput {
organizationId: string;
}
interface GithubStateMeta {
connectedAccount: string;
installationStatus: FoundryGithubInstallationStatus;
syncStatus: FoundryGithubSyncStatus;
installationId: number | null;
lastSyncLabel: string;
lastSyncAt: number | null;
}
interface SyncMemberSeed {
id: string;
login: string;
name: string;
email?: string | null;
role?: string | null;
state?: string | null;
}
interface FullSyncInput {
kind: "personal" | "organization";
githubLogin: string;
connectedAccount: string;
installationStatus: FoundryGithubInstallationStatus;
installationId: number | null;
accessToken?: string | null;
label?: string;
fallbackMembers?: SyncMemberSeed[];
}
interface PullRequestWebhookInput {
connectedAccount: string;
installationStatus: FoundryGithubInstallationStatus;
installationId: number | null;
repository: {
fullName: string;
cloneUrl: string;
private: boolean;
};
pullRequest: {
number: number;
title: string;
body: string | null;
state: string;
url: string;
headRefName: string;
baseRefName: string;
authorLogin: string | null;
isDraft: boolean;
merged?: boolean;
};
}
function normalizePullRequestStatus(input: { state: string; isDraft?: boolean; merged?: boolean }): "draft" | "ready" | "closed" | "merged" {
const rawState = input.state.trim().toUpperCase();
if (input.merged || rawState === "MERGED") {
return "merged";
}
if (rawState === "CLOSED") {
return "closed";
}
return input.isDraft ? "draft" : "ready";
}
interface FullSyncSnapshot {
repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>;
members: SyncMemberSeed[];
loadPullRequests: () => Promise<
Array<{
repoFullName: string;
cloneUrl: string;
number: number;
title: string;
body?: string | null;
state: string;
url: string;
headRefName: string;
baseRefName: string;
authorLogin?: string | null;
isDraft?: boolean;
}>
>;
}
async function readMeta(c: any): Promise<GithubStateMeta> {
const row = await c.db.select().from(githubMeta).where(eq(githubMeta.id, META_ROW_ID)).get();
return {
connectedAccount: row?.connectedAccount ?? "",
installationStatus: (row?.installationStatus ?? "install_required") as FoundryGithubInstallationStatus,
syncStatus: (row?.syncStatus ?? "pending") as FoundryGithubSyncStatus,
installationId: row?.installationId ?? null,
lastSyncLabel: row?.lastSyncLabel ?? "Waiting for first sync",
lastSyncAt: row?.lastSyncAt ?? null,
};
}
async function writeMeta(c: any, patch: Partial<GithubStateMeta>): Promise<GithubStateMeta> {
const current = await readMeta(c);
const next: GithubStateMeta = {
...current,
...patch,
};
await c.db
.insert(githubMeta)
.values({
id: META_ROW_ID,
connectedAccount: next.connectedAccount,
installationStatus: next.installationStatus,
syncStatus: next.syncStatus,
installationId: next.installationId,
lastSyncLabel: next.lastSyncLabel,
lastSyncAt: next.lastSyncAt,
updatedAt: Date.now(),
})
.onConflictDoUpdate({
target: githubMeta.id,
set: {
connectedAccount: next.connectedAccount,
installationStatus: next.installationStatus,
syncStatus: next.syncStatus,
installationId: next.installationId,
lastSyncLabel: next.lastSyncLabel,
lastSyncAt: next.lastSyncAt,
updatedAt: Date.now(),
},
})
.run();
return next;
}
async function replaceRepositories(c: any, repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>): Promise<void> {
await c.db.delete(githubRepositories).run();
const now = Date.now();
for (const repository of repositories) {
await c.db
.insert(githubRepositories)
.values({
repoId: repoIdFromRemote(repository.cloneUrl),
fullName: repository.fullName,
cloneUrl: repository.cloneUrl,
private: repository.private ? 1 : 0,
updatedAt: now,
})
.run();
}
}
async function replaceMembers(c: any, members: SyncMemberSeed[]): Promise<void> {
await c.db.delete(githubMembers).run();
const now = Date.now();
for (const member of members) {
await c.db
.insert(githubMembers)
.values({
memberId: member.id,
login: member.login,
displayName: member.name || member.login,
email: member.email ?? null,
role: member.role ?? null,
state: member.state ?? "active",
updatedAt: now,
})
.run();
}
}
async function replacePullRequests(
c: any,
pullRequests: Array<{
repoFullName: string;
cloneUrl: string;
number: number;
title: string;
body?: string | null;
state: string;
url: string;
headRefName: string;
baseRefName: string;
authorLogin?: string | null;
isDraft?: boolean;
}>,
): Promise<void> {
await c.db.delete(githubPullRequests).run();
const now = Date.now();
for (const pullRequest of pullRequests) {
const repoId = repoIdFromRemote(pullRequest.cloneUrl);
await c.db
.insert(githubPullRequests)
.values({
prId: `${repoId}#${pullRequest.number}`,
repoId,
repoFullName: pullRequest.repoFullName,
number: pullRequest.number,
title: pullRequest.title,
body: pullRequest.body ?? null,
state: pullRequest.state,
url: pullRequest.url,
headRefName: pullRequest.headRefName,
baseRefName: pullRequest.baseRefName,
authorLogin: pullRequest.authorLogin ?? null,
isDraft: pullRequest.isDraft ? 1 : 0,
updatedAt: now,
})
.run();
}
}
async function upsertPullRequest(c: any, input: PullRequestWebhookInput): Promise<void> {
const repoId = repoIdFromRemote(input.repository.cloneUrl);
const now = Date.now();
await c.db
.insert(githubRepositories)
.values({
repoId,
fullName: input.repository.fullName,
cloneUrl: input.repository.cloneUrl,
private: input.repository.private ? 1 : 0,
updatedAt: now,
})
.onConflictDoUpdate({
target: githubRepositories.repoId,
set: {
fullName: input.repository.fullName,
cloneUrl: input.repository.cloneUrl,
private: input.repository.private ? 1 : 0,
updatedAt: now,
},
})
.run();
await c.db
.insert(githubPullRequests)
.values({
prId: `${repoId}#${input.pullRequest.number}`,
repoId,
repoFullName: input.repository.fullName,
number: input.pullRequest.number,
title: input.pullRequest.title,
body: input.pullRequest.body ?? null,
state: input.pullRequest.state,
url: input.pullRequest.url,
headRefName: input.pullRequest.headRefName,
baseRefName: input.pullRequest.baseRefName,
authorLogin: input.pullRequest.authorLogin ?? null,
isDraft: input.pullRequest.isDraft ? 1 : 0,
updatedAt: now,
})
.onConflictDoUpdate({
target: githubPullRequests.prId,
set: {
title: input.pullRequest.title,
body: input.pullRequest.body ?? null,
state: input.pullRequest.state,
url: input.pullRequest.url,
headRefName: input.pullRequest.headRefName,
baseRefName: input.pullRequest.baseRefName,
authorLogin: input.pullRequest.authorLogin ?? null,
isDraft: input.pullRequest.isDraft ? 1 : 0,
updatedAt: now,
},
})
.run();
}
async function upsertPullRequestSnapshot(
c: any,
input: {
repoId: string;
repoFullName: string;
number: number;
title: string;
body?: string | null;
state: string;
url: string;
headRefName: string;
baseRefName: string;
authorLogin?: string | null;
isDraft?: boolean;
},
): Promise<void> {
const now = Date.now();
await c.db
.insert(githubPullRequests)
.values({
prId: `${input.repoId}#${input.number}`,
repoId: input.repoId,
repoFullName: input.repoFullName,
number: input.number,
title: input.title,
body: input.body ?? null,
state: input.state,
url: input.url,
headRefName: input.headRefName,
baseRefName: input.baseRefName,
authorLogin: input.authorLogin ?? null,
isDraft: input.isDraft ? 1 : 0,
updatedAt: now,
})
.onConflictDoUpdate({
target: githubPullRequests.prId,
set: {
title: input.title,
body: input.body ?? null,
state: input.state,
url: input.url,
headRefName: input.headRefName,
baseRefName: input.baseRefName,
authorLogin: input.authorLogin ?? null,
isDraft: input.isDraft ? 1 : 0,
updatedAt: now,
},
})
.run();
}
async function countRows(c: any) {
const repositories = await c.db.select().from(githubRepositories).all();
const members = await c.db.select().from(githubMembers).all();
const pullRequests = await c.db.select().from(githubPullRequests).all();
return {
repositoryCount: repositories.length,
memberCount: members.length,
pullRequestCount: pullRequests.length,
};
}
function repoBelongsToAccount(fullName: string, accountLogin: string): boolean {
const owner = fullName.split("/")[0]?.trim().toLowerCase() ?? "";
return owner.length > 0 && owner === accountLogin.trim().toLowerCase();
}
export const githubState = actor({
db: githubStateDb,
createState: (_c, input: GithubStateInput) => ({
organizationId: input.organizationId,
}),
actions: {
async getSummary(c): Promise<GithubStateMeta & { repositoryCount: number; memberCount: number; pullRequestCount: number }> {
return {
...(await readMeta(c)),
...(await countRows(c)),
};
},
async listRepositories(c): Promise<Array<{ repoId: string; fullName: string; cloneUrl: string; private: boolean }>> {
const rows = await c.db.select().from(githubRepositories).all();
return rows.map((row) => ({
repoId: row.repoId,
fullName: row.fullName,
cloneUrl: row.cloneUrl,
private: Boolean(row.private),
}));
},
async listPullRequestsForRepository(c, input: { repoId: string }) {
const rows = await c.db.select().from(githubPullRequests).where(eq(githubPullRequests.repoId, input.repoId)).all();
return rows.map((row) => ({
number: row.number,
title: row.title,
body: row.body ?? null,
state: row.state,
url: row.url,
headRefName: row.headRefName,
baseRefName: row.baseRefName,
authorLogin: row.authorLogin ?? null,
isDraft: Boolean(row.isDraft),
}));
},
async getPullRequestForBranch(
c,
input: { repoId: string; branchName: string },
): Promise<{ number: number; state: string; url: string; title: string; body: string | null; status: "draft" | "ready" | "closed" | "merged" } | null> {
const branchName = input.branchName?.trim();
if (!branchName) {
return null;
}
const rows = await c.db.select().from(githubPullRequests).where(eq(githubPullRequests.repoId, input.repoId)).all();
const match = rows.find((candidate) => candidate.headRefName === branchName) ?? null;
if (!match) {
return null;
}
return {
number: match.number,
state: match.state,
url: match.url,
title: match.title,
body: match.body ?? null,
status: normalizePullRequestStatus({
state: match.state,
isDraft: Boolean(match.isDraft),
}),
};
},
async clearState(
c,
input: { connectedAccount: string; installationStatus: FoundryGithubInstallationStatus; installationId: number | null; label: string },
): Promise<void> {
await c.db.delete(githubRepositories).run();
await c.db.delete(githubMembers).run();
await c.db.delete(githubPullRequests).run();
await writeMeta(c, {
connectedAccount: input.connectedAccount,
installationStatus: input.installationStatus,
installationId: input.installationId,
syncStatus: input.installationStatus === "connected" ? "pending" : "error",
lastSyncLabel: input.label,
lastSyncAt: null,
});
const organization = await getOrCreateOrganization(c, c.state.organizationId);
await organization.applyOrganizationRepositoryCatalog({
repositories: [],
});
},
async fullSync(c, input: FullSyncInput) {
const { appShell } = getActorRuntimeContext();
const organization = await getOrCreateOrganization(c, c.state.organizationId);
await writeMeta(c, {
connectedAccount: input.connectedAccount,
installationStatus: input.installationStatus,
installationId: input.installationId,
syncStatus: "syncing",
lastSyncLabel: input.label ?? "Syncing GitHub data...",
});
try {
const syncFromUserToken = async (): Promise<FullSyncSnapshot> => {
const rawRepositories = input.accessToken ? await appShell.github.listUserRepositories(input.accessToken) : [];
const repositories =
input.kind === "organization"
? rawRepositories.filter((repository) => repoBelongsToAccount(repository.fullName, input.githubLogin))
: rawRepositories;
const members =
input.accessToken && input.kind === "organization"
? await appShell.github.listOrganizationMembers(input.accessToken, input.githubLogin)
: (input.fallbackMembers ?? []).map((member) => ({
id: member.id,
login: member.login,
name: member.name,
email: member.email ?? null,
role: member.role ?? null,
state: member.state ?? "active",
}));
return {
repositories,
members,
loadPullRequests: async () => (input.accessToken ? await appShell.github.listPullRequestsForUserRepositories(input.accessToken, repositories) : []),
};
};
const { repositories, members, loadPullRequests } =
input.installationId != null
? await (async (): Promise<FullSyncSnapshot> => {
try {
const repositories = await appShell.github.listInstallationRepositories(input.installationId!);
const members =
input.kind === "organization"
? await appShell.github.listInstallationMembers(input.installationId!, input.githubLogin)
: (input.fallbackMembers ?? []).map((member) => ({
id: member.id,
login: member.login,
name: member.name,
email: member.email ?? null,
role: member.role ?? null,
state: member.state ?? "active",
}));
return {
repositories,
members,
loadPullRequests: async () => await appShell.github.listInstallationPullRequests(input.installationId!),
};
} catch (error) {
if (!input.accessToken) {
throw error;
}
return await syncFromUserToken();
}
})()
: await syncFromUserToken();
await replaceRepositories(c, repositories);
await organization.applyOrganizationRepositoryCatalog({
repositories,
});
await replaceMembers(c, members);
const pullRequests = await loadPullRequests();
await replacePullRequests(c, pullRequests);
const lastSyncLabel = repositories.length > 0 ? `Synced ${repositories.length} repositories` : "No repositories available";
await writeMeta(c, {
connectedAccount: input.connectedAccount,
installationStatus: input.installationStatus,
installationId: input.installationId,
syncStatus: "synced",
lastSyncLabel,
lastSyncAt: Date.now(),
});
} catch (error) {
const message = error instanceof Error ? error.message : "GitHub sync failed";
const installationStatus = error instanceof Error && /403|404|401/.test(error.message) ? "reconnect_required" : input.installationStatus;
await writeMeta(c, {
connectedAccount: input.connectedAccount,
installationStatus,
installationId: input.installationId,
syncStatus: "error",
lastSyncLabel: message,
});
throw error;
}
return await selfGithubState(c).getSummary();
},
async handlePullRequestWebhook(c, input: PullRequestWebhookInput): Promise<void> {
await upsertPullRequest(c, input);
await writeMeta(c, {
connectedAccount: input.connectedAccount,
installationStatus: input.installationStatus,
installationId: input.installationId,
syncStatus: "synced",
lastSyncLabel: `Updated PR #${input.pullRequest.number}`,
lastSyncAt: Date.now(),
});
const repository = await getOrCreateRepository(c, c.state.organizationId, repoIdFromRemote(input.repository.cloneUrl), input.repository.cloneUrl);
await repository.applyGithubPullRequestState({
branchName: input.pullRequest.headRefName,
state: input.pullRequest.state,
});
},
async createPullRequest(
c,
input: {
repoId: string;
repoPath: string;
branchName: string;
title: string;
body?: string | null;
},
): Promise<{ number: number; url: string }> {
const { driver } = getActorRuntimeContext();
const auth = await resolveWorkspaceGithubAuth(c, c.state.organizationId);
const repository = await c.db.select().from(githubRepositories).where(eq(githubRepositories.repoId, input.repoId)).get();
const baseRef = await driver.git.remoteDefaultBaseRef(input.repoPath).catch(() => "origin/main");
const baseRefName = baseRef.replace(/^origin\//, "");
const now = Date.now();
let created: { number: number; url: string };
try {
created = await driver.github.createPr(input.repoPath, input.branchName, input.title, input.body ?? undefined, {
githubToken: auth?.githubToken ?? null,
});
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
if (!/already exists/i.test(message)) {
throw error;
}
const existing = await driver.github.getPrInfo(input.repoPath, input.branchName, {
githubToken: auth?.githubToken ?? null,
});
if (!existing?.number || !existing.url) {
throw error;
}
created = {
number: existing.number,
url: existing.url,
};
if (repository) {
await upsertPullRequestSnapshot(c, {
repoId: input.repoId,
repoFullName: repository.fullName,
number: existing.number,
title: existing.title || input.title,
body: input.body ?? null,
state: existing.state,
url: existing.url,
headRefName: existing.headRefName || input.branchName,
baseRefName,
authorLogin: existing.author || null,
isDraft: existing.isDraft,
});
}
await writeMeta(c, {
syncStatus: "synced",
lastSyncLabel: `Linked existing PR #${existing.number}`,
lastSyncAt: now,
});
return created;
}
if (repository) {
await upsertPullRequestSnapshot(c, {
repoId: input.repoId,
repoFullName: repository.fullName,
number: created.number,
title: input.title,
body: input.body ?? null,
state: "OPEN",
url: created.url,
headRefName: input.branchName,
baseRefName,
authorLogin: null,
isDraft: false,
});
}
await writeMeta(c, {
syncStatus: "synced",
lastSyncLabel: `Created PR #${created.number}`,
lastSyncAt: now,
});
return created;
},
async starRepository(c, input: { repoFullName: string }): Promise<void> {
const { driver } = getActorRuntimeContext();
const auth = await resolveWorkspaceGithubAuth(c, c.state.organizationId);
await driver.github.starRepository(input.repoFullName, {
githubToken: auth?.githubToken ?? null,
});
},
},
});


@@ -1,107 +1,101 @@
import { taskKey, taskStatusSyncKey, historyKey, projectBranchSyncKey, projectKey, projectPrSyncKey, sandboxInstanceKey, workspaceKey } from "./keys.js";
import { githubStateKey, historyKey, organizationKey, repositoryKey, sandboxInstanceKey, taskKey, taskStatusSyncKey, userGithubDataKey } from "./keys.js";
import type { ProviderId } from "@sandbox-agent/foundry-shared";
export function actorClient(c: any) {
return c.client();
}
export async function getOrCreateWorkspace(c: any, workspaceId: string) {
return await actorClient(c).workspace.getOrCreate(workspaceKey(workspaceId), {
createWithInput: workspaceId,
export async function getOrCreateOrganization(c: any, organizationId: string) {
return await actorClient(c).organization.getOrCreate(organizationKey(organizationId), {
createWithInput: organizationId,
});
}
export async function getOrCreateProject(c: any, workspaceId: string, repoId: string, remoteUrl: string) {
return await actorClient(c).project.getOrCreate(projectKey(workspaceId, repoId), {
export async function getOrCreateRepository(c: any, organizationId: string, repoId: string, remoteUrl: string) {
return await actorClient(c).repository.getOrCreate(repositoryKey(organizationId, repoId), {
createWithInput: {
workspaceId,
workspaceId: organizationId,
repoId,
remoteUrl,
},
});
}
export function getProject(c: any, workspaceId: string, repoId: string) {
return actorClient(c).project.get(projectKey(workspaceId, repoId));
export function getRepository(c: any, organizationId: string, repoId: string) {
return actorClient(c).repository.get(repositoryKey(organizationId, repoId));
}
export function getTask(c: any, workspaceId: string, repoId: string, taskId: string) {
return actorClient(c).task.get(taskKey(workspaceId, repoId, taskId));
export async function getOrCreateGithubState(c: any, organizationId: string) {
return await actorClient(c).githubState.getOrCreate(githubStateKey(organizationId), {
createWithInput: {
organizationId,
},
});
}
export async function getOrCreateTask(c: any, workspaceId: string, repoId: string, taskId: string, createWithInput: Record<string, unknown>) {
return await actorClient(c).task.getOrCreate(taskKey(workspaceId, repoId, taskId), {
export function getGithubState(c: any, organizationId: string) {
return actorClient(c).githubState.get(githubStateKey(organizationId));
}
export async function getOrCreateUserGithubData(c: any, userId: string) {
return await actorClient(c).userGithub.getOrCreate(userGithubDataKey(userId), {
createWithInput: {
userId,
},
});
}
export function getUserGithubData(c: any, userId: string) {
return actorClient(c).userGithub.get(userGithubDataKey(userId));
}
export function getTask(c: any, organizationId: string, repoId: string, taskId: string) {
return actorClient(c).task.get(taskKey(organizationId, repoId, taskId));
}
export async function getOrCreateTask(c: any, organizationId: string, repoId: string, taskId: string, createWithInput: Record<string, unknown>) {
return await actorClient(c).task.getOrCreate(taskKey(organizationId, repoId, taskId), {
createWithInput,
});
}
export async function getOrCreateHistory(c: any, workspaceId: string, repoId: string) {
return await actorClient(c).history.getOrCreate(historyKey(workspaceId, repoId), {
export async function getOrCreateHistory(c: any, organizationId: string, repoId: string) {
return await actorClient(c).history.getOrCreate(historyKey(organizationId, repoId), {
createWithInput: {
workspaceId,
workspaceId: organizationId,
repoId,
},
});
}
export async function getOrCreateProjectPrSync(c: any, workspaceId: string, repoId: string, repoPath: string, intervalMs: number) {
return await actorClient(c).projectPrSync.getOrCreate(projectPrSyncKey(workspaceId, repoId), {
createWithInput: {
workspaceId,
repoId,
repoPath,
intervalMs,
},
});
}
export async function getOrCreateProjectBranchSync(c: any, workspaceId: string, repoId: string, repoPath: string, intervalMs: number) {
return await actorClient(c).projectBranchSync.getOrCreate(projectBranchSyncKey(workspaceId, repoId), {
createWithInput: {
workspaceId,
repoId,
repoPath,
intervalMs,
},
});
}
export function getSandboxInstance(c: any, workspaceId: string, providerId: ProviderId, sandboxId: string) {
return actorClient(c).sandboxInstance.get(sandboxInstanceKey(workspaceId, providerId, sandboxId));
export function getSandboxInstance(c: any, organizationId: string, providerId: ProviderId, sandboxId: string) {
return actorClient(c).sandboxInstance.get(sandboxInstanceKey(organizationId, providerId, sandboxId));
}
export async function getOrCreateSandboxInstance(
c: any,
workspaceId: string,
organizationId: string,
providerId: ProviderId,
sandboxId: string,
createWithInput: Record<string, unknown>,
) {
return await actorClient(c).sandboxInstance.getOrCreate(sandboxInstanceKey(workspaceId, providerId, sandboxId), { createWithInput });
return await actorClient(c).sandboxInstance.getOrCreate(sandboxInstanceKey(organizationId, providerId, sandboxId), { createWithInput });
}
export async function getOrCreateTaskStatusSync(
c: any,
workspaceId: string,
organizationId: string,
repoId: string,
taskId: string,
sandboxId: string,
sessionId: string,
createWithInput: Record<string, unknown>,
) {
return await actorClient(c).taskStatusSync.getOrCreate(taskStatusSyncKey(workspaceId, repoId, taskId, sandboxId, sessionId), {
return await actorClient(c).taskStatusSync.getOrCreate(taskStatusSyncKey(organizationId, repoId, taskId, sandboxId, sessionId), {
createWithInput,
});
}
export function selfProjectPrSync(c: any) {
return actorClient(c).projectPrSync.getForId(c.actorId);
}
export function selfProjectBranchSync(c: any) {
return actorClient(c).projectBranchSync.getForId(c.actorId);
}
export function selfTaskStatusSync(c: any) {
return actorClient(c).taskStatusSync.getForId(c.actorId);
}
@@ -114,12 +108,20 @@ export function selfTask(c: any) {
return actorClient(c).task.getForId(c.actorId);
}
export function selfWorkspace(c: any) {
return actorClient(c).workspace.getForId(c.actorId);
export function selfOrganization(c: any) {
return actorClient(c).organization.getForId(c.actorId);
}
export function selfProject(c: any) {
return actorClient(c).project.getForId(c.actorId);
export function selfRepository(c: any) {
return actorClient(c).repository.getForId(c.actorId);
}
export function selfGithubState(c: any) {
return actorClient(c).githubState.getForId(c.actorId);
}
export function selfUserGithubData(c: any) {
return actorClient(c).userGithub.getForId(c.actorId);
}
export function selfSandboxInstance(c: any) {


@@ -4,6 +4,7 @@ import { actor, queue } from "rivetkit";
import { Loop, workflow } from "rivetkit/workflow";
import type { HistoryEvent } from "@sandbox-agent/foundry-shared";
import { selfHistory } from "../handles.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import { historyDb } from "./db/db.js";
import { events } from "./db/schema.js";
@@ -107,5 +108,14 @@ export const history = actor({
}));
},
},
run: workflow(runHistoryWorkflow),
run: workflow(runHistoryWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "history",
organizationId: c.state.workspaceId,
scopeId: c.state.repoId,
scopeLabel: `History ${c.state.repoId}`,
});
},
}),
});


@@ -2,11 +2,11 @@ import { setup } from "rivetkit";
import { taskStatusSync } from "./task-status-sync/index.js";
import { task } from "./task/index.js";
import { history } from "./history/index.js";
import { projectBranchSync } from "./project-branch-sync/index.js";
import { projectPrSync } from "./project-pr-sync/index.js";
import { project } from "./project/index.js";
import { githubState } from "./github-state/index.js";
import { repository } from "./repository/index.js";
import { sandboxInstance } from "./sandbox-instance/index.js";
import { workspace } from "./workspace/index.js";
import { organization } from "./organization/index.js";
import { userGithub } from "./user-github-data/index.js";
export function resolveManagerPort(): number {
const raw = process.env.HF_RIVET_MANAGER_PORT ?? process.env.RIVETKIT_MANAGER_PORT;
@@ -28,15 +28,16 @@ function resolveManagerHost(): string {
export const registry = setup({
use: {
workspace,
project,
organization,
repository,
githubState,
userGithub,
task,
sandboxInstance,
history,
projectPrSync,
projectBranchSync,
taskStatusSync,
},
serveManager: true,
managerPort: resolveManagerPort(),
managerHost: resolveManagerHost(),
});
@@ -46,9 +47,9 @@ export * from "./events.js";
export * from "./task-status-sync/index.js";
export * from "./task/index.js";
export * from "./history/index.js";
export * from "./github-state/index.js";
export * from "./keys.js";
export * from "./project-branch-sync/index.js";
export * from "./project-pr-sync/index.js";
export * from "./project/index.js";
export * from "./repository/index.js";
export * from "./sandbox-instance/index.js";
export * from "./workspace/index.js";
export * from "./organization/index.js";
export * from "./user-github-data/index.js";


@@ -1,34 +1,34 @@
export type ActorKey = string[];
export function workspaceKey(workspaceId: string): ActorKey {
return ["ws", workspaceId];
export function organizationKey(organizationId: string): ActorKey {
return ["org", organizationId];
}
export function projectKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId];
export function repositoryKey(organizationId: string, repoId: string): ActorKey {
return ["org", organizationId, "repo", repoId];
}
export function taskKey(workspaceId: string, repoId: string, taskId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "task", taskId];
export function githubStateKey(organizationId: string): ActorKey {
return ["org", organizationId, "github"];
}
export function sandboxInstanceKey(workspaceId: string, providerId: string, sandboxId: string): ActorKey {
return ["ws", workspaceId, "provider", providerId, "sandbox", sandboxId];
export function userGithubDataKey(userId: string): ActorKey {
return ["user", userId, "github"];
}
export function historyKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "history"];
export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey {
return ["org", organizationId, "repo", repoId, "task", taskId];
}
export function projectPrSyncKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "pr-sync"];
export function sandboxInstanceKey(organizationId: string, providerId: string, sandboxId: string): ActorKey {
return ["org", organizationId, "provider", providerId, "sandbox", sandboxId];
}
export function projectBranchSyncKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "branch-sync"];
export function historyKey(organizationId: string, repoId: string): ActorKey {
return ["org", organizationId, "repo", repoId, "history"];
}
export function taskStatusSyncKey(workspaceId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey {
export function taskStatusSyncKey(organizationId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey {
// Include sandbox + session so multiple sandboxes/sessions can be tracked per task.
return ["ws", workspaceId, "project", repoId, "task", taskId, "status-sync", sandboxId, sessionId];
return ["org", organizationId, "repo", repoId, "task", taskId, "status-sync", sandboxId, sessionId];
}
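The renamed key helpers above all nest under an `org`-scoped prefix, so related actors share a common key hierarchy. Re-declaring two of the new helpers standalone shows the resulting keys:

```typescript
type ActorKey = string[];

// Copied from the key helpers above.
function repositoryKey(organizationId: string, repoId: string): ActorKey {
  return ["org", organizationId, "repo", repoId];
}

function taskStatusSyncKey(organizationId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey {
  return ["org", organizationId, "repo", repoId, "task", taskId, "status-sync", sandboxId, sessionId];
}

console.log(JSON.stringify(repositoryKey("acme", "repo-1")));
// ["org","acme","repo","repo-1"]
console.log(JSON.stringify(taskStatusSyncKey("acme", "repo-1", "task-1", "sb-1", "sess-1")));
// ["org","acme","repo","repo-1","task","task-1","status-sync","sb-1","sess-1"]
```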


@@ -1,5 +1,6 @@
// @ts-nocheck
import { desc, eq } from "drizzle-orm";
import { randomUUID } from "node:crypto";
import { Loop } from "rivetkit/workflow";
import type {
AddRepoInput,
@@ -31,14 +32,17 @@ import type {
WorkspaceUseInput,
} from "@sandbox-agent/foundry-shared";
import { getActorRuntimeContext } from "../context.js";
import { getTask, getOrCreateHistory, getOrCreateProject, selfWorkspace } from "../handles.js";
import { getOrCreateGithubState, getOrCreateHistory, getOrCreateRepository, getOrCreateTask, getTask, selfOrganization } from "../handles.js";
import { logActorWarning, resolveErrorMessage } from "../logging.js";
import { upsertActorRuntimeIssue } from "../runtime-issues.js";
import { normalizeRemoteUrl, repoIdFromRemote } from "../../services/repo.js";
import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
import { foundryRepoClonePath } from "../../services/foundry-paths.js";
import { taskLookup, repos, providerProfiles } from "./db/schema.js";
import { agentTypeForModel } from "../task/workbench.js";
import { expectQueueResponse } from "../../services/queue.js";
import { workspaceAppActions } from "./app-shell.js";
import { projectWorkflowQueueName } from "../repository/actions.js";
interface WorkspaceState {
workspaceId: string;
@@ -82,11 +86,31 @@ function assertWorkspace(c: { state: WorkspaceState }, workspaceId: string): voi
async function resolveRepoId(c: any, taskId: string): Promise<string> {
const row = await c.db.select({ repoId: taskLookup.repoId }).from(taskLookup).where(eq(taskLookup.taskId, taskId)).get();
if (!row) {
throw new Error(`Unknown task: ${taskId} (not in lookup)`);
if (row) {
return row.repoId;
}
return row.repoId;
const repoRows = await c.db.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all();
for (const repoRow of repoRows) {
try {
const project = await getOrCreateRepository(c, c.state.workspaceId, repoRow.repoId, repoRow.remoteUrl);
const summaries = await project.listTaskSummaries({ includeArchived: true });
if (!summaries.some((summary) => summary.taskId === taskId)) {
continue;
}
await upsertTaskLookupRow(c, taskId, repoRow.repoId);
return repoRow.repoId;
} catch (error) {
logActorWarning("workspace", "failed resolving repo from task summary fallback", {
workspaceId: c.state.workspaceId,
repoId: repoRow.repoId,
taskId,
error: resolveErrorMessage(error),
});
}
}
throw new Error(`Unknown task: ${taskId}`);
}
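The new fallback above rebuilds a missing `taskLookup` row by scanning repositories and backfilling the lookup on a hit. The general shape (cache miss, scan, backfill) as a standalone sketch with hypothetical data, not the actor code:

```typescript
// Cache-aside resolution: consult the lookup first, rebuild it on a miss.
const taskLookupCache = new Map<string, string>(); // taskId -> repoId
const tasksByRepo = new Map<string, string[]>([
  ["repo-a", ["task-1"]],
  ["repo-b", ["task-2", "task-3"]],
]);

function resolveRepoId(taskId: string): string {
  const cached = taskLookupCache.get(taskId);
  if (cached) return cached;
  for (const [repoId, taskIds] of tasksByRepo) {
    if (taskIds.includes(taskId)) {
      taskLookupCache.set(taskId, repoId); // backfill so the next call skips the scan
      return repoId;
    }
  }
  throw new Error(`Unknown task: ${taskId}`);
}

console.log(resolveRepoId("task-3")); // "repo-b" — found by scan, then cached
console.log(taskLookupCache.get("task-3")); // "repo-b"
```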
async function upsertTaskLookupRow(c: any, taskId: string, repoId: string): Promise<void> {
@@ -105,17 +129,32 @@ async function upsertTaskLookupRow(c: any, taskId: string, repoId: string): Prom
async function collectAllTaskSummaries(c: any): Promise<TaskSummary[]> {
const repoRows = await c.db.select({ repoId: repos.repoId, remoteUrl: repos.remoteUrl }).from(repos).orderBy(desc(repos.updatedAt)).all();
const taskRows = await c.db.select({ taskId: taskLookup.taskId, repoId: taskLookup.repoId }).from(taskLookup).all();
const repoById = new Map(repoRows.map((row) => [row.repoId, row]));
const all: TaskSummary[] = [];
for (const row of repoRows) {
for (const row of taskRows) {
const repo = repoById.get(row.repoId);
if (!repo) {
continue;
}
try {
const project = await getOrCreateProject(c, c.state.workspaceId, row.repoId, row.remoteUrl);
const snapshot = await project.listTaskSummaries({ includeArchived: true });
all.push(...snapshot);
const task = getTask(c, c.state.workspaceId, row.repoId, row.taskId);
const snapshot = await task.get();
all.push({
workspaceId: c.state.workspaceId,
repoId: row.repoId,
taskId: snapshot.taskId,
branchName: snapshot.branchName,
title: snapshot.title,
status: snapshot.status,
updatedAt: snapshot.updatedAt,
});
} catch (error) {
logActorWarning("workspace", "failed collecting tasks for repo", {
workspaceId: c.state.workspaceId,
repoId: row.repoId,
taskId: row.taskId,
error: resolveErrorMessage(error),
});
}
@@ -145,48 +184,46 @@ async function buildWorkbenchSnapshot(c: any): Promise<TaskWorkbenchSnapshot> {
.from(repos)
.orderBy(desc(repos.updatedAt))
.all();
const taskRows = await c.db.select({ taskId: taskLookup.taskId, repoId: taskLookup.repoId }).from(taskLookup).all();
const repoById = new Map(repoRows.map((row) => [row.repoId, row]));
const tasks: Array<any> = [];
const projects: Array<any> = [];
for (const row of repoRows) {
const projectTasks: Array<any> = [];
const projectTasksByRepoId = new Map<string, Array<any>>();
for (const row of taskRows) {
const repo = repoById.get(row.repoId);
if (!repo) {
continue;
}
try {
const project = await getOrCreateProject(c, c.state.workspaceId, row.repoId, row.remoteUrl);
const summaries = await project.listTaskSummaries({ includeArchived: true });
for (const summary of summaries) {
try {
await upsertTaskLookupRow(c, summary.taskId, row.repoId);
const task = getTask(c, c.state.workspaceId, row.repoId, summary.taskId);
const snapshot = await task.getWorkbench({});
tasks.push(snapshot);
projectTasks.push(snapshot);
} catch (error) {
logActorWarning("workspace", "failed collecting workbench task", {
workspaceId: c.state.workspaceId,
repoId: row.repoId,
taskId: summary.taskId,
error: resolveErrorMessage(error),
});
}
}
if (projectTasks.length > 0) {
projects.push({
id: row.repoId,
label: repoLabelFromRemote(row.remoteUrl),
updatedAtMs: projectTasks[0]?.updatedAtMs ?? row.updatedAt,
tasks: projectTasks.sort((left, right) => right.updatedAtMs - left.updatedAtMs),
});
}
const task = getTask(c, c.state.workspaceId, row.repoId, row.taskId);
const snapshot = await task.getWorkbenchSummary({});
tasks.push(snapshot);
const repoTasks = projectTasksByRepoId.get(row.repoId) ?? [];
repoTasks.push(snapshot);
projectTasksByRepoId.set(row.repoId, repoTasks);
} catch (error) {
logActorWarning("workspace", "failed collecting workbench repo snapshot", {
logActorWarning("workspace", "failed collecting workbench task", {
workspaceId: c.state.workspaceId,
repoId: row.repoId,
taskId: row.taskId,
error: resolveErrorMessage(error),
});
}
}
for (const row of repoRows) {
const projectTasks = (projectTasksByRepoId.get(row.repoId) ?? []).sort((left, right) => right.updatedAtMs - left.updatedAtMs);
if (projectTasks.length > 0) {
projects.push({
id: row.repoId,
label: repoLabelFromRemote(row.remoteUrl),
updatedAtMs: projectTasks[0]?.updatedAtMs ?? row.updatedAt,
tasks: projectTasks,
});
}
}
tasks.sort((left, right) => right.updatedAtMs - left.updatedAtMs);
projects.sort((left, right) => right.updatedAtMs - left.updatedAtMs);
return {
@@ -250,7 +287,7 @@ async function addRepoMutation(c: any, input: AddRepoInput): Promise<RepoRecord>
async function createTaskMutation(c: any, input: CreateTaskInput): Promise<TaskRecord> {
assertWorkspace(c, input.workspaceId);
const { providers } = getActorRuntimeContext();
const { config, providers } = getActorRuntimeContext();
const providerId = input.providerId ?? providers.defaultProviderId();
const repoId = input.repoId;
@@ -259,6 +296,11 @@ async function createTaskMutation(c: any, input: CreateTaskInput): Promise<TaskR
throw new Error(`Unknown repo: ${repoId}`);
}
const remoteUrl = repoRow.remoteUrl;
const taskId = randomUUID();
const now = Date.now();
const initialBranchName = input.onBranch?.trim() || null;
const initialTitle = initialBranchName ? null : (input.explicitTitle ?? null);
const localPath = foundryRepoClonePath(config, c.state.workspaceId, repoId);
await c.db
.insert(providerProfiles)
@@ -271,27 +313,15 @@
target: providerProfiles.providerId,
set: {
profileJson: JSON.stringify({ providerId }),
updatedAt: Date.now(),
updatedAt: now,
},
})
.run();
const project = await getOrCreateProject(c, c.state.workspaceId, repoId, remoteUrl);
await project.ensure({ remoteUrl });
const created = await project.createTask({
task: input.task,
providerId,
agentType: input.agentType ?? null,
explicitTitle: input.explicitTitle ?? null,
explicitBranchName: input.explicitBranchName ?? null,
onBranch: input.onBranch ?? null,
});
await c.db
.insert(taskLookup)
.values({
taskId: created.taskId,
taskId,
repoId,
})
.onConflictDoUpdate({
@@ -300,11 +330,70 @@
})
.run();
const task = getTask(c, c.state.workspaceId, repoId, created.taskId);
await task.provision({ providerId });
await getOrCreateTask(c, c.state.workspaceId, repoId, taskId, {
workspaceId: c.state.workspaceId,
repoId,
taskId,
repoRemote: remoteUrl,
repoLocalPath: localPath,
branchName: initialBranchName,
title: initialTitle,
task: input.task,
providerId,
agentType: input.agentType ?? null,
explicitTitle: initialBranchName ? null : (input.explicitTitle ?? null),
explicitBranchName: initialBranchName ? null : (input.explicitBranchName ?? null),
initialPrompt: null,
createdAt: now,
updatedAt: now,
});
const project = await getOrCreateRepository(c, c.state.workspaceId, repoId, remoteUrl);
await project.send(
projectWorkflowQueueName("project.command.createTask"),
{
taskId,
task: input.task,
providerId,
agentType: input.agentType ?? null,
explicitTitle: input.explicitTitle ?? null,
explicitBranchName: input.explicitBranchName ?? null,
onBranch: input.onBranch ?? null,
},
{
wait: false,
},
);
await workspaceActions.notifyWorkbenchUpdated(c);
return created;
return {
workspaceId: c.state.workspaceId,
repoId,
repoRemote: remoteUrl,
taskId,
branchName: initialBranchName,
title: initialTitle,
task: input.task,
providerId,
status: "init_enqueue_provision",
statusMessage: "provision queued",
activeSandboxId: null,
activeSessionId: null,
sandboxes: [],
agentType: input.agentType ?? null,
prSubmitted: false,
diffStat: null,
hasUnpushed: null,
conflictsWithMain: null,
parentBranch: null,
prUrl: null,
prAuthor: null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
createdAt: now,
updatedAt: now,
} satisfies TaskRecord;
}
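`createTaskMutation` now persists the task record eagerly, enqueues provisioning with `wait: false`, and returns immediately with status `init_enqueue_provision` instead of blocking on the queue. That fire-and-forget shape, sketched with a plain in-process job list (hypothetical names, not the rivetkit queue API):

```typescript
// Fire-and-forget enqueue (wait: false): the caller returns as soon as the
// task record exists; provisioning runs later from the queue.
type Job = () => Promise<void>;
const pendingJobs: Job[] = [];

function enqueue(job: Job): void {
  pendingJobs.push(job); // not awaited here
}

interface TaskRecord {
  taskId: string;
  status: string;
}

function createTask(taskId: string): TaskRecord {
  enqueue(async () => {
    // provision sandbox, clone repo, start agent...
  });
  return { taskId, status: "init_enqueue_provision" };
}

const record = createTask("task-1");
console.log(record.status); // "init_enqueue_provision"
console.log(pendingJobs.length); // 1 — provisioning is queued, not awaited
```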
async function refreshProviderProfilesMutation(c: any, command?: RefreshProviderProfilesCommand): Promise<void> {
@@ -333,6 +422,7 @@ async function refreshProviderProfilesMutation(c: any, command?: RefreshProvider
export async function runWorkspaceWorkflow(ctx: any): Promise<void> {
await ctx.loop("workspace-command-loop", async (loopCtx: any) => {
await loopCtx.removed("workspace-create-task", "step");
const msg = await loopCtx.queue.next("next-workspace-command", {
names: [...WORKSPACE_QUEUE_NAMES],
completable: true,
@@ -354,7 +444,7 @@ export async function runWorkspaceWorkflow(ctx: any): Promise<void> {
if (msg.name === "workspace.command.createTask") {
const result = await loopCtx.step({
name: "workspace-create-task",
timeout: 12 * 60_000,
timeout: 60_000,
run: async () => createTaskMutation(loopCtx, msg.body as CreateTaskInput),
});
await msg.complete(result);
@@ -374,13 +464,17 @@ export async function runWorkspaceWorkflow(ctx: any): Promise<void> {
export const workspaceActions = {
...workspaceAppActions,
async recordActorRuntimeIssue(c: any, input: any): Promise<void> {
await upsertActorRuntimeIssue(c, input);
},
async useWorkspace(c: any, input: WorkspaceUseInput): Promise<{ workspaceId: string }> {
assertWorkspace(c, input.workspaceId);
return { workspaceId: c.state.workspaceId };
},
async addRepo(c: any, input: AddRepoInput): Promise<RepoRecord> {
const self = selfWorkspace(c);
const self = selfOrganization(c);
return expectQueueResponse<RepoRecord>(
await self.send(workspaceWorkflowQueueName("workspace.command.addRepo"), input, {
wait: true,
@@ -413,19 +507,13 @@
},
async createTask(c: any, input: CreateTaskInput): Promise<TaskRecord> {
const self = selfWorkspace(c);
return expectQueueResponse<TaskRecord>(
await self.send(workspaceWorkflowQueueName("workspace.command.createTask"), input, {
wait: true,
timeout: 12 * 60_000,
}),
);
return await createTaskMutation(c, input);
},
async starSandboxAgentRepo(c: any, input: StarSandboxAgentRepoInput): Promise<StarSandboxAgentRepoResult> {
assertWorkspace(c, input.workspaceId);
const { driver } = getActorRuntimeContext();
await driver.github.starRepository(SANDBOX_AGENT_REPO);
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
await githubState.starRepository({ repoFullName: SANDBOX_AGENT_REPO });
return {
repo: SANDBOX_AGENT_REPO,
starredAt: Date.now(),
@@ -441,7 +529,7 @@
c.broadcast("workbenchUpdated", { at: Date.now() });
},
async createWorkbenchTask(c: any, input: TaskWorkbenchCreateTaskInput): Promise<{ taskId: string; tabId?: string }> {
async createWorkbenchTask(c: any, input: TaskWorkbenchCreateTaskInput): Promise<{ taskId: string }> {
const created = await workspaceActions.createTask(c, {
workspaceId: c.state.workspaceId,
repoId: input.repoId,
@@ -450,12 +538,7 @@ export const workspaceActions = {
...(input.branch ? { explicitBranchName: input.branch } : {}),
...(input.model ? { agentType: agentTypeForModel(input.model) } : {}),
});
const task = await requireWorkbenchTask(c, created.taskId);
const snapshot = await task.getWorkbench({});
return {
taskId: created.taskId,
tabId: snapshot.tabs[0]?.id,
};
return { taskId: created.taskId };
},
async markWorkbenchUnread(c: any, input: TaskWorkbenchSelectInput): Promise<void> {
@@ -532,7 +615,7 @@ export const workspaceActions = {
throw new Error(`Unknown repo: ${input.repoId}`);
}
const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
const project = await getOrCreateRepository(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
return await project.listTaskSummaries({ includeArchived: true });
}
@@ -547,7 +630,7 @@ export const workspaceActions = {
throw new Error(`Unknown repo: ${input.repoId}`);
}
const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
const project = await getOrCreateRepository(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
await project.ensure({ remoteUrl: repoRow.remoteUrl });
return await project.getRepoOverview({});
},
@@ -560,7 +643,7 @@ export const workspaceActions = {
throw new Error(`Unknown repo: ${input.repoId}`);
}
const project = await getOrCreateProject(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
const project = await getOrCreateRepository(c, c.state.workspaceId, input.repoId, repoRow.remoteUrl);
await project.ensure({ remoteUrl: repoRow.remoteUrl });
return await project.runRepoStackAction({
action: input.action,
@@ -584,7 +667,7 @@ export const workspaceActions = {
},
async refreshProviderProfiles(c: any, command?: RefreshProviderProfilesCommand): Promise<void> {
const self = selfWorkspace(c);
const self = selfOrganization(c);
await self.send(workspaceWorkflowQueueName("workspace.command.refreshProviderProfiles"), command ?? {}, {
wait: true,
timeout: 60_000,
@@ -602,11 +685,12 @@ export const workspaceActions = {
for (const row of repoRows) {
try {
const hist = await getOrCreateHistory(c, c.state.workspaceId, row.repoId);
const items = await hist.list({
branch: input.branch,
taskId: input.taskId,
limit,
});
const items =
(await hist.list({
branch: input.branch,
taskId: input.taskId,
limit,
})) ?? [];
allEvents.push(...items);
} catch (error) {
logActorWarning("workspace", "history lookup failed for repo", {
@@ -631,8 +715,24 @@ export const workspaceActions = {
throw new Error(`Unknown repo: ${repoId}`);
}
const project = await getOrCreateProject(c, c.state.workspaceId, repoId, repoRow.remoteUrl);
return await project.getTaskEnriched({ taskId: input.taskId });
const project = await getOrCreateRepository(c, c.state.workspaceId, repoId, repoRow.remoteUrl);
try {
return await project.getTaskEnriched({ taskId: input.taskId });
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
if (!message.includes("Unknown task in repo")) {
throw error;
}
logActorWarning("workspace", "repository task index missed known task; falling back to direct task actor read", {
workspaceId: c.state.workspaceId,
repoId,
taskId: input.taskId,
});
const task = getTask(c, c.state.workspaceId, repoId, input.taskId);
return await task.get();
}
},
async attachTask(c: any, input: TaskProxyActionInput): Promise<{ target: string; sessionId: string | null }> {

View file

@@ -3,6 +3,7 @@ import { desc, eq } from "drizzle-orm";
import { randomUUID } from "node:crypto";
import type {
FoundryAppSnapshot,
FoundryActorRuntimeState,
FoundryBillingPlanId,
FoundryBillingState,
FoundryOrganization,
@@ -11,24 +12,25 @@
UpdateFoundryOrganizationProfileInput,
} from "@sandbox-agent/foundry-shared";
import { getActorRuntimeContext } from "../context.js";
import { getOrCreateWorkspace } from "../handles.js";
import { getOrCreateGithubState, getOrCreateOrganization, getOrCreateUserGithubData } from "../handles.js";
import { GitHubAppError } from "../../services/app-github.js";
import { repoIdFromRemote, repoLabelFromRemote } from "../../services/repo.js";
import { listActorRuntimeIssues } from "../runtime-issues.js";
import { appSessions, invoices, organizationMembers, organizationProfile, repos, seatAssignments, stripeLookup } from "./db/schema.js";
export const APP_SHELL_WORKSPACE_ID = "app";
export const APP_SHELL_ORGANIZATION_ID = "app";
const PROFILE_ROW_ID = "profile";
const OAUTH_TTL_MS = 10 * 60_000;
function assertAppWorkspace(c: any): void {
if (c.state.workspaceId !== APP_SHELL_WORKSPACE_ID) {
throw new Error(`App shell action requires workspace ${APP_SHELL_WORKSPACE_ID}, got ${c.state.workspaceId}`);
if (c.state.workspaceId !== APP_SHELL_ORGANIZATION_ID) {
throw new Error(`App shell action requires workspace ${APP_SHELL_ORGANIZATION_ID}, got ${c.state.workspaceId}`);
}
}
function assertOrganizationWorkspace(c: any): void {
if (c.state.workspaceId === APP_SHELL_WORKSPACE_ID) {
if (c.state.workspaceId === APP_SHELL_ORGANIZATION_ID) {
throw new Error("Organization action cannot run on the reserved app workspace");
}
}
@@ -49,33 +51,10 @@ function organizationWorkspaceId(kind: FoundryOrganization["kind"], login: strin
return kind === "personal" ? personalWorkspaceId(login) : slugify(login);
}
function splitScopes(value: string): string[] {
return value
.split(",")
.map((entry) => entry.trim())
.filter((entry) => entry.length > 0);
}
function hasRepoScope(scopes: string[]): boolean {
return scopes.some((scope) => scope === "repo" || scope.startsWith("repo:"));
}
function parseEligibleOrganizationIds(value: string): string[] {
try {
const parsed = JSON.parse(value);
if (!Array.isArray(parsed)) {
return [];
}
return parsed.filter((entry): entry is string => typeof entry === "string" && entry.length > 0);
} catch {
return [];
}
}
function encodeEligibleOrganizationIds(value: string[]): string {
return JSON.stringify([...new Set(value)]);
}
function encodeOauthState(payload: { sessionId: string; nonce: string }): string {
return Buffer.from(JSON.stringify(payload), "utf8").toString("base64url");
}
@@ -117,17 +96,6 @@ function formatUnixDate(value: number): string {
return new Date(value * 1000).toISOString().slice(0, 10);
}
function legacyRepoImportStatusToGithubSyncStatus(value: string | null | undefined): FoundryOrganization["github"]["syncStatus"] {
switch (value) {
case "ready":
return "synced";
case "importing":
return "syncing";
default:
return "pending";
}
}
function stringFromMetadata(metadata: unknown, key: string): string | null {
if (!metadata || typeof metadata !== "object") {
return null;
@@ -183,14 +151,7 @@ async function ensureAppSession(c: any, requestedSessionId?: string | null): Pro
.values({
id: sessionId,
currentUserId: null,
currentUserName: null,
currentUserEmail: null,
currentUserGithubLogin: null,
currentUserRoleLabel: null,
eligibleOrganizationIdsJson: "[]",
activeOrganizationId: null,
githubAccessToken: null,
githubScope: "",
starterRepoStatus: "pending",
starterRepoStarredAt: null,
starterRepoSkippedAt: null,
@@ -223,12 +184,13 @@ async function getOrganizationState(workspace: any) {
async function buildAppSnapshot(c: any, sessionId: string): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireAppSessionRow(c, sessionId);
const eligibleOrganizationIds = parseEligibleOrganizationIds(session.eligibleOrganizationIdsJson);
const userProfile = session.currentUserId != null ? await getOrCreateUserGithubData(c, session.currentUserId).then((user) => user.getProfile()) : null;
const eligibleOrganizationIds = userProfile?.eligibleOrganizationIds ?? [];
const organizations: FoundryOrganization[] = [];
for (const organizationId of eligibleOrganizationIds) {
try {
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
const organizationState = await getOrganizationState(workspace);
organizations.push(organizationState.snapshot);
} catch (error) {
@@ -239,14 +201,14 @@ async function buildAppSnapshot(c: any, sessionId: string): Promise<FoundryAppSn
}
}
const currentUser: FoundryUser | null = session.currentUserId
const currentUser: FoundryUser | null = userProfile
? {
id: session.currentUserId,
name: session.currentUserName ?? session.currentUserGithubLogin ?? "GitHub user",
email: session.currentUserEmail ?? "",
githubLogin: session.currentUserGithubLogin ?? "",
roleLabel: session.currentUserRoleLabel ?? "GitHub user",
eligibleOrganizationIds: organizations.map((organization) => organization.id),
id: userProfile.userId,
name: userProfile.displayName,
email: userProfile.email,
githubLogin: userProfile.githubLogin,
roleLabel: "GitHub user",
eligibleOrganizationIds,
}
: null;
@@ -279,15 +241,24 @@ async function buildAppSnapshot(c: any, sessionId: string): Promise<FoundryAppSn
async function requireSignedInSession(c: any, sessionId: string) {
const session = await requireAppSessionRow(c, sessionId);
if (!session.currentUserId || !session.currentUserEmail || !session.currentUserGithubLogin) {
if (!session.currentUserId) {
throw new Error("User must be signed in");
}
return session;
const userGithub = await getOrCreateUserGithubData(c, session.currentUserId);
const profile = await userGithub.getProfile();
const auth = await userGithub.getAuth();
if (!profile || !auth) {
throw new Error(`GitHub user data is not initialized for session user ${session.currentUserId}`);
}
return {
session,
profile,
auth,
};
}
function requireEligibleOrganization(session: any, organizationId: string): void {
const eligibleOrganizationIds = parseEligibleOrganizationIds(session.eligibleOrganizationIdsJson);
if (!eligibleOrganizationIds.includes(organizationId)) {
function requireEligibleOrganization(userProfile: { eligibleOrganizationIds: string[] }, organizationId: string): void {
if (!userProfile.eligibleOrganizationIds.includes(organizationId)) {
throw new Error(`Organization ${organizationId} is not available in this app session`);
}
}
@@ -364,13 +335,27 @@ async function safeListInstallations(accessToken: string): Promise<any[]> {
}
}
async function syncGithubSessionFromToken(c: any, sessionId: string, accessToken: string): Promise<{ sessionId: string; redirectTo: string }> {
async function syncGithubSessionFromToken(
c: any,
sessionId: string,
accessToken: string,
scopes: string[] = [],
options?: { organizationLogins?: string[] | null },
): Promise<{ sessionId: string; redirectTo: string }> {
assertAppWorkspace(c);
const { appShell } = getActorRuntimeContext();
const session = await requireAppSessionRow(c, sessionId);
const token = { accessToken, scopes: splitScopes(session.githubScope) };
const resolvedScopes =
scopes.length > 0
? [...new Set(scopes.map((value) => value.trim()).filter((value) => value.length > 0))]
: await appShell.github.getTokenScopes(accessToken).catch(() => []);
const viewer = await appShell.github.getViewer(accessToken);
const organizations = await safeListOrganizations(accessToken);
const requestedOrganizationLogins = new Set(
(options?.organizationLogins ?? []).map((value) => value.trim().toLowerCase()).filter((value) => value.length > 0),
);
const organizations = (await safeListOrganizations(accessToken)).filter(
(organization) => requestedOrganizationLogins.size === 0 || requestedOrganizationLogins.has(organization.login.trim().toLowerCase()),
);
const installations = await safeListInstallations(accessToken);
const userId = `user-${slugify(viewer.login)}`;
@@ -395,20 +380,54 @@ async function syncGithubSessionFromToken(c: any, sessionId: string, accessToken
for (const account of accounts) {
const organizationId = organizationWorkspaceId(account.kind, account.githubLogin);
const installation = installations.find((candidate) => candidate.accountLogin === account.githubLogin) ?? null;
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
await workspace.syncOrganizationShellFromGithub({
userId,
userName: viewer.name || viewer.login,
userEmail: viewer.email ?? `${viewer.login}@users.noreply.github.com`,
githubUserLogin: viewer.login,
githubAccountId: account.githubAccountId,
githubLogin: account.githubLogin,
githubAccountType: account.githubAccountType,
kind: account.kind,
displayName: account.displayName,
installationId: installation?.id ?? null,
appConfigured: appShell.github.isAppConfigured(),
});
if (account.kind === "personal" || installation?.id || accessToken) {
const installationStatus =
account.kind === "personal"
? "connected"
: installation?.id
? "connected"
: appShell.github.isAppConfigured()
? "install_required"
: "reconnect_required";
const githubState = await getOrCreateGithubState(c, organizationId);
void githubState
.fullSync({
kind: account.kind,
githubLogin: account.githubLogin,
connectedAccount: account.githubLogin,
installationStatus,
installationId: account.kind === "personal" ? null : (installation?.id ?? null),
accessToken,
label: "Syncing GitHub data...",
fallbackMembers:
account.kind === "personal"
? [
{
id: userId,
login: viewer.login,
name: viewer.name || viewer.login,
email: viewer.email ?? `${viewer.login}@users.noreply.github.com`,
role: "owner",
state: "active",
},
]
: [],
})
.catch(() => {});
}
linkedOrganizationIds.push(organizationId);
}
@@ -419,16 +438,20 @@ async function syncGithubSessionFromToken(c: any, sessionId: string, accessToken
? (linkedOrganizationIds[0] ?? null)
: null;
const userGithub = await getOrCreateUserGithubData(c, userId);
await userGithub.upsert({
githubUserId: viewer.id,
githubLogin: viewer.login,
displayName: viewer.name || viewer.login,
email: viewer.email ?? `${viewer.login}@users.noreply.github.com`,
accessToken,
scopes: resolvedScopes,
eligibleOrganizationIds: linkedOrganizationIds,
});
await updateAppSession(c, session.id, {
currentUserId: userId,
currentUserName: viewer.name || viewer.login,
currentUserEmail: viewer.email ?? `${viewer.login}@users.noreply.github.com`,
currentUserGithubLogin: viewer.login,
currentUserRoleLabel: "GitHub user",
eligibleOrganizationIdsJson: encodeEligibleOrganizationIds(linkedOrganizationIds),
activeOrganizationId,
githubAccessToken: accessToken,
githubScope: token.scopes.join(","),
oauthState: null,
oauthStateExpiresAt: null,
});
@@ -488,19 +511,33 @@ async function listOrganizationRepoCatalog(c: any): Promise<string[]> {
return rows.map((row) => repoLabelFromRemote(row.remoteUrl)).sort((left, right) => left.localeCompare(right));
}
async function buildOrganizationRuntimeState(c: any): Promise<FoundryActorRuntimeState> {
assertOrganizationWorkspace(c);
const issues = await listActorRuntimeIssues(c);
return {
status: issues.length > 0 ? "error" : "healthy",
errorCount: issues.length,
lastErrorAt: issues[0]?.occurredAt ?? null,
issues,
};
}
async function buildOrganizationState(c: any) {
const row = await requireOrganizationProfileRow(c);
const repoCatalog = await listOrganizationRepoCatalog(c);
const members = await listOrganizationMembers(c);
const seatAssignmentEmails = await listOrganizationSeatAssignments(c);
const invoiceRows = await listOrganizationInvoices(c);
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
const githubSummary = await githubState.getSummary();
const runtime = await buildOrganizationRuntimeState(c);
return {
id: c.state.workspaceId,
workspaceId: c.state.workspaceId,
kind: row.kind,
githubLogin: row.githubLogin,
githubInstallationId: row.githubInstallationId ?? null,
githubInstallationId: githubSummary.installationId,
stripeCustomerId: row.stripeCustomerId ?? null,
stripeSubscriptionId: row.stripeSubscriptionId ?? null,
stripePriceId: row.stripePriceId ?? null,
@@ -518,13 +555,14 @@ async function buildOrganizationState(c: any) {
autoImportRepos: row.autoImportRepos === 1,
},
github: {
connectedAccount: row.githubConnectedAccount,
installationStatus: row.githubInstallationStatus,
syncStatus: row.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(row.repoImportStatus),
importedRepoCount: repoCatalog.length,
lastSyncLabel: row.githubLastSyncLabel,
lastSyncAt: row.githubLastSyncAt ?? null,
connectedAccount: githubSummary.connectedAccount,
installationStatus: githubSummary.installationStatus,
syncStatus: githubSummary.syncStatus,
importedRepoCount: githubSummary.repositoryCount,
lastSyncLabel: githubSummary.lastSyncLabel,
lastSyncAt: githubSummary.lastSyncAt,
},
runtime,
billing: {
planId: row.billingPlanId,
status: row.billingStatus,
@@ -579,20 +617,38 @@ export const workspaceAppActions = {
assertAppWorkspace(c);
const rows = await c.db.select().from(appSessions).orderBy(desc(appSessions.updatedAt)).all();
for (const row of rows) {
if (row.activeOrganizationId !== input.organizationId || !row.githubAccessToken) {
continue;
}
const resolveFromRows = async (candidates: typeof rows) => {
for (const row of candidates) {
if (!row.currentUserId) {
continue;
}
const scopes = splitScopes(row.githubScope);
if (input.requireRepoScope !== false && !hasRepoScope(scopes)) {
continue;
}
const userGithub = await getOrCreateUserGithubData(c, row.currentUserId);
const [profile, auth] = await Promise.all([userGithub.getProfile(), userGithub.getAuth()]);
if (!profile || !auth || !profile.eligibleOrganizationIds.includes(input.organizationId)) {
continue;
}
return {
accessToken: row.githubAccessToken,
scopes,
};
if (input.requireRepoScope !== false && !hasRepoScope(auth.scopes)) {
continue;
}
return {
accessToken: auth.accessToken,
scopes: auth.scopes,
};
}
return null;
};
const preferred = await resolveFromRows(rows.filter((row) => row.activeOrganizationId === input.organizationId));
if (preferred) {
return preferred;
}
const fallback = await resolveFromRows(rows);
if (fallback) {
return fallback;
}
return null;
@@ -622,19 +678,21 @@ export const workspaceAppActions = {
}
const token = await appShell.github.exchangeCode(input.code);
await updateAppSession(c, session.id, {
githubScope: token.scopes.join(","),
});
return await syncGithubSessionFromToken(c, session.id, token.accessToken);
return await syncGithubSessionFromToken(c, session.id, token.accessToken, token.scopes);
},
async bootstrapAppGithubSession(c: any, input: { accessToken: string; sessionId?: string | null }): Promise<{ sessionId: string; redirectTo: string }> {
async bootstrapAppGithubSession(
c: any,
input: { accessToken: string; sessionId?: string | null; organizationLogins?: string[] | null },
): Promise<{ sessionId: string; redirectTo: string }> {
assertAppWorkspace(c);
if (process.env.NODE_ENV === "production") {
throw new Error("bootstrapAppGithubSession is development-only");
}
const sessionId = await ensureAppSession(c, input.sessionId ?? null);
return await syncGithubSessionFromToken(c, sessionId, input.accessToken);
return await syncGithubSessionFromToken(c, sessionId, input.accessToken, [], {
organizationLogins: input.organizationLogins ?? null,
});
},
async signOutApp(c: any, input: { sessionId: string }): Promise<FoundryAppSnapshot> {
@@ -642,14 +700,7 @@ export const workspaceAppActions = {
const sessionId = await ensureAppSession(c, input.sessionId);
await updateAppSession(c, sessionId, {
currentUserId: null,
currentUserName: null,
currentUserEmail: null,
currentUserGithubLogin: null,
currentUserRoleLabel: null,
eligibleOrganizationIdsJson: "[]",
activeOrganizationId: null,
githubAccessToken: null,
githubScope: "",
starterRepoStatus: "pending",
starterRepoStarredAt: null,
starterRepoSkippedAt: null,
@@ -672,9 +723,9 @@ export const workspaceAppActions = {
async starAppStarterRepo(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
await workspace.starSandboxAgentRepo({
workspaceId: input.organizationId,
});
@@ -688,13 +739,13 @@ export const workspaceAppActions = {
async selectAppOrganization(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
await updateAppSession(c, input.sessionId, {
activeOrganizationId: input.organizationId,
});
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (organization.snapshot.github.syncStatus !== "synced") {
return await workspaceAppActions.triggerAppRepoImport(c, input);
@@ -707,9 +758,9 @@ export const workspaceAppActions = {
input: { sessionId: string; organizationId: string } & UpdateFoundryOrganizationProfileInput,
): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
await workspace.updateOrganizationShellProfile({
displayName: input.displayName,
slug: input.slug,
@@ -720,68 +771,47 @@ export const workspaceAppActions = {
async triggerAppRepoImport(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const { profile, auth } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
await workspace.markOrganizationSyncStarted({
label: "Importing repository catalog...",
});
try {
let repositories;
let installationStatus = organization.snapshot.github.installationStatus;
if (organization.snapshot.kind === "personal") {
repositories = await appShell.github.listUserRepositories(session.githubAccessToken);
installationStatus = "connected";
} else if (organization.githubInstallationId) {
try {
repositories = await appShell.github.listInstallationRepositories(organization.githubInstallationId);
} catch (error) {
if (!(error instanceof GitHubAppError) || (error.status !== 403 && error.status !== 404)) {
throw error;
}
repositories = (await appShell.github.listUserRepositories(session.githubAccessToken)).filter((repository) =>
repository.fullName.startsWith(`${organization.githubLogin}/`),
);
installationStatus = "reconnect_required";
}
} else {
repositories = (await appShell.github.listUserRepositories(session.githubAccessToken)).filter((repository) =>
repository.fullName.startsWith(`${organization.githubLogin}/`),
);
installationStatus = "reconnect_required";
}
await workspace.applyOrganizationSyncCompleted({
repositories,
installationStatus,
lastSyncLabel: repositories.length > 0 ? "Synced just now" : "No repositories available",
const githubState = await getOrCreateGithubState(c, input.organizationId);
void githubState
.fullSync({
kind: organization.snapshot.kind,
githubLogin: organization.githubLogin,
connectedAccount: organization.snapshot.github.connectedAccount,
installationStatus: organization.snapshot.kind === "personal" ? "connected" : organization.snapshot.github.installationStatus,
installationId: organization.snapshot.kind === "personal" ? null : organization.githubInstallationId,
accessToken: auth.accessToken,
label: "Syncing GitHub data...",
fallbackMembers:
organization.snapshot.kind === "personal"
? [
{
id: profile.userId,
login: profile.githubLogin,
name: profile.displayName,
email: profile.email,
role: "owner",
state: "active",
},
]
: [],
})
.catch((error) => {
console.error("foundry github full sync failed", error);
});
} catch (error) {
const installationStatus =
error instanceof GitHubAppError && (error.status === 403 || error.status === 404)
? "reconnect_required"
: organization.snapshot.github.installationStatus;
await workspace.markOrganizationSyncFailed({
message: error instanceof Error ? error.message : "GitHub import failed",
installationStatus,
});
}
return await buildAppSnapshot(c, input.sessionId);
},
async beginAppGithubInstall(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (organization.snapshot.kind !== "organization") {
return {
@@ -795,10 +825,10 @@ export const workspaceAppActions = {
async createAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; planId: FoundryBillingPlanId }): Promise<{ url: string }> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (input.planId === "free") {
@@ -818,7 +848,7 @@ export const workspaceAppActions = {
await appShell.stripe.createCustomer({
organizationId: input.organizationId,
displayName: organization.snapshot.settings.displayName,
email: session.currentUserEmail,
email: profile.email,
})
).id;
await workspace.applyOrganizationStripeCustomer({ customerId });
@@ -830,7 +860,7 @@ export const workspaceAppActions = {
.createCheckoutSession({
organizationId: input.organizationId,
customerId,
customerEmail: session.currentUserEmail,
customerEmail: profile.email,
planId: input.planId,
successUrl: `${appShell.appUrl}/api/rivet/app/billing/checkout/complete?organizationId=${encodeURIComponent(
input.organizationId,
@@ -844,7 +874,7 @@ export const workspaceAppActions = {
async finalizeAppCheckoutSession(c: any, input: { sessionId: string; organizationId: string; checkoutSessionId: string }): Promise<{ redirectTo: string }> {
assertAppWorkspace(c);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
const completion = await appShell.stripe.retrieveCheckoutCompletion(input.checkoutSessionId);
@@ -871,10 +901,10 @@ export const workspaceAppActions = {
async createAppBillingPortalSession(c: any, input: { sessionId: string; organizationId: string }): Promise<{ url: string }> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (!organization.stripeCustomerId) {
throw new Error("Stripe customer is not available for this organization");
@@ -888,10 +918,10 @@ export const workspaceAppActions = {
async cancelAppScheduledRenewal(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (organization.stripeSubscriptionId && appShell.stripe.isConfigured()) {
@@ -907,10 +937,10 @@ export const workspaceAppActions = {
async resumeAppSubscription(c: any, input: { sessionId: string; organizationId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.organizationId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.organizationId);
const { appShell } = getActorRuntimeContext();
const workspace = await getOrCreateWorkspace(c, input.organizationId);
const workspace = await getOrCreateOrganization(c, input.organizationId);
const organization = await getOrganizationState(workspace);
if (organization.stripeSubscriptionId && appShell.stripe.isConfigured()) {
@@ -926,11 +956,11 @@ export const workspaceAppActions = {
async recordAppSeatUsage(c: any, input: { sessionId: string; workspaceId: string }): Promise<FoundryAppSnapshot> {
assertAppWorkspace(c);
const session = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(session, input.workspaceId);
const workspace = await getOrCreateWorkspace(c, input.workspaceId);
const { profile } = await requireSignedInSession(c, input.sessionId);
requireEligibleOrganization(profile, input.workspaceId);
const workspace = await getOrCreateOrganization(c, input.workspaceId);
await workspace.recordOrganizationSeatUsage({
email: session.currentUserEmail,
email: profile.email,
});
return await buildAppSnapshot(c, input.sessionId);
},
@@ -950,7 +980,7 @@ export const workspaceAppActions = {
typeof object.subscription === "string" ? object.subscription : null,
));
if (organizationId) {
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
if (typeof object.customer === "string") {
await workspace.applyOrganizationStripeCustomer({ customerId: object.customer });
}
@@ -968,7 +998,7 @@ export const workspaceAppActions = {
const subscription = stripeWebhookSubscription(event);
const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id);
if (organizationId) {
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
const organization = await getOrganizationState(workspace);
await applySubscriptionState(workspace, subscription, appShell.stripe.planIdForPriceId(subscription.priceId ?? "") ?? organization.billingPlanId);
await upsertStripeLookupEntries(c, organizationId, subscription.customerId, subscription.id);
@@ -980,7 +1010,7 @@ export const workspaceAppActions = {
const subscription = stripeWebhookSubscription(event);
const organizationId = await findOrganizationIdForStripeEvent(c, subscription.customerId, subscription.id);
if (organizationId) {
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
await workspace.applyOrganizationFreePlan({ clearSubscription: true });
}
return { ok: true };
@@ -990,7 +1020,7 @@ export const workspaceAppActions = {
const invoice = event.data.object as Record<string, unknown>;
const organizationId = await findOrganizationIdForStripeEvent(c, typeof invoice.customer === "string" ? invoice.customer : null, null);
if (organizationId) {
const workspace = await getOrCreateWorkspace(c, organizationId);
const workspace = await getOrCreateOrganization(c, organizationId);
const rawAmount = typeof invoice.amount_paid === "number" ? invoice.amount_paid : invoice.amount_due;
const amountUsd = Math.round((typeof rawAmount === "number" ? rawAmount : 0) / 100);
await workspace.upsertOrganizationInvoice({
@@ -1020,16 +1050,43 @@ export const workspaceAppActions = {
const kind: FoundryOrganization["kind"] = accountType === "User" ? "personal" : "organization";
const organizationId = organizationWorkspaceId(kind, accountLogin);
const githubState = await getOrCreateGithubState(c, organizationId);
if (event === "installation" && (body.action === "created" || body.action === "deleted" || body.action === "suspend" || body.action === "unsuspend")) {
console.log(`[github-webhook] ${event}.${body.action} for ${accountLogin} (org=${organizationId})`);
if (body.action === "deleted") {
const workspace = await getOrCreateWorkspace(c, organizationId);
await workspace.applyGithubInstallationRemoved({});
await githubState.clearState({
connectedAccount: accountLogin,
installationStatus: "install_required",
installationId: null,
label: "GitHub App installation removed",
});
} else if (body.action === "created") {
const workspace = await getOrCreateWorkspace(c, organizationId);
await workspace.applyGithubInstallationCreated({
installationId: body.installation?.id ?? 0,
await githubState.fullSync({
kind,
githubLogin: accountLogin,
connectedAccount: accountLogin,
installationStatus: "connected",
installationId: body.installation?.id ?? null,
label: "Syncing GitHub data from installation webhook...",
fallbackMembers: [],
});
} else if (body.action === "suspend") {
await githubState.clearState({
connectedAccount: accountLogin,
installationStatus: "reconnect_required",
installationId: body.installation?.id ?? null,
label: "GitHub App installation suspended",
});
} else if (body.action === "unsuspend") {
await githubState.fullSync({
kind,
githubLogin: accountLogin,
connectedAccount: accountLogin,
installationStatus: "connected",
installationId: body.installation?.id ?? null,
label: "Resyncing GitHub data after unsuspend...",
fallbackMembers: [],
});
}
return { ok: true };
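The branches above reduce to a small action-to-transition table: `created` and `unsuspend` trigger a full resync in the connected state, while `deleted` and `suspend` clear state with different statuses. A hedged, self-contained sketch of that mapping (names are illustrative, not part of this commit):

```typescript
// Illustrative reduction of the installation-webhook handling above.
// "fullSync" / "clearState" stand in for the githubState calls.
type InstallationAction = "created" | "deleted" | "suspend" | "unsuspend";

interface InstallationTransition {
	operation: "fullSync" | "clearState";
	installationStatus: "connected" | "install_required" | "reconnect_required";
}

function installationTransition(action: InstallationAction): InstallationTransition {
	switch (action) {
		case "created":
		case "unsuspend":
			// New or restored installations resync everything.
			return { operation: "fullSync", installationStatus: "connected" };
		case "deleted":
			// Removal wipes state and asks the user to reinstall.
			return { operation: "clearState", installationStatus: "install_required" };
		case "suspend":
			// Suspension keeps the installation id but requires reconnecting.
			return { operation: "clearState", installationStatus: "reconnect_required" };
	}
}
```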
@@ -1039,13 +1096,14 @@ export const workspaceAppActions = {
console.log(
`[github-webhook] ${event}.${body.action} for ${accountLogin}: +${body.repositories_added?.length ?? 0} -${body.repositories_removed?.length ?? 0}`,
);
const workspace = await getOrCreateWorkspace(c, organizationId);
await workspace.applyGithubRepositoryChanges({
added: (body.repositories_added ?? []).map((r) => ({
fullName: r.full_name,
private: r.private,
})),
removed: (body.repositories_removed ?? []).map((r) => r.full_name),
await githubState.fullSync({
kind,
githubLogin: accountLogin,
connectedAccount: accountLogin,
installationStatus: "connected",
installationId: body.installation?.id ?? null,
label: "Resyncing GitHub data after repository access change...",
fallbackMembers: [],
});
return { ok: true };
}
@@ -1064,7 +1122,30 @@ export const workspaceAppActions = {
const repoFullName = body.repository?.full_name;
if (repoFullName) {
console.log(`[github-webhook] ${event}.${body.action ?? ""} for ${repoFullName}`);
// TODO: Dispatch to GitHubStateActor / downstream actors
}
if (event === "pull_request" && body.repository?.full_name && body.repository?.clone_url && body.pull_request) {
await githubState.handlePullRequestWebhook({
connectedAccount: accountLogin,
installationStatus: "connected",
installationId: body.installation?.id ?? null,
repository: {
fullName: body.repository.full_name,
cloneUrl: body.repository.clone_url,
private: Boolean(body.repository.private),
},
pullRequest: {
number: body.pull_request.number,
title: body.pull_request.title ?? "",
body: body.pull_request.body ?? null,
state: body.pull_request.merged ? "MERGED" : (body.pull_request.state ?? "open"),
url: body.pull_request.html_url ?? `https://github.com/${body.repository.full_name}/pull/${body.pull_request.number}`,
headRefName: body.pull_request.head?.ref ?? "",
baseRefName: body.pull_request.base?.ref ?? "",
authorLogin: body.pull_request.user?.login ?? null,
isDraft: Boolean(body.pull_request.draft),
merged: Boolean(body.pull_request.merged),
},
});
}
return { ok: true };
}
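The pull_request payload mapping above can be read as a pure normalization step: `merged` wins over the raw `state`, and the URL falls back to a constructed github.com link when `html_url` is absent. A hedged sketch of that shape as a standalone helper (the function name and types are illustrative, not from the commit):

```typescript
// Illustrative normalization of a GitHub pull_request webhook payload,
// mirroring the mapping handed to handlePullRequestWebhook above.
interface PullRequestSnapshot {
	number: number;
	state: string;
	url: string;
	merged: boolean;
}

function normalizePullRequest(
	repoFullName: string,
	pr: { number: number; state?: string; merged?: boolean; html_url?: string },
): PullRequestSnapshot {
	return {
		number: pr.number,
		// Merged PRs normalize to "MERGED" regardless of the raw state.
		state: pr.merged ? "MERGED" : (pr.state ?? "open"),
		// Fall back to a constructed link when html_url is missing.
		url: pr.html_url ?? `https://github.com/${repoFullName}/pull/${pr.number}`,
		merged: Boolean(pr.merged),
	};
}
```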
@@ -1079,14 +1160,11 @@ export const workspaceAppActions = {
userId: string;
userName: string;
userEmail: string;
githubUserLogin: string;
githubAccountId: string;
githubLogin: string;
githubAccountType: string;
kind: FoundryOrganization["kind"];
displayName: string;
installationId: number | null;
appConfigured: boolean;
},
): Promise<{ organizationId: string }> {
assertOrganizationWorkspace(c);
@@ -1098,17 +1176,6 @@ export const workspaceAppActions = {
throw new Error(`Workspace actor mismatch: actor=${c.state.workspaceId} github=${organizationId}`);
}
const installationStatus =
input.kind === "personal" ? "connected" : input.installationId ? "connected" : input.appConfigured ? "install_required" : "reconnect_required";
const syncStatus = existing?.githubSyncStatus ?? legacyRepoImportStatusToGithubSyncStatus(existing?.repoImportStatus);
const lastSyncLabel =
syncStatus === "synced"
? existing.githubLastSyncLabel
: installationStatus === "connected"
? "Waiting for first import"
: installationStatus === "install_required"
? "GitHub App installation required"
: "GitHub App configuration incomplete";
const hasStripeBillingState = Boolean(existing?.stripeCustomerId || existing?.stripeSubscriptionId || existing?.stripePriceId);
const defaultBillingPlanId = input.kind === "personal" || !hasStripeBillingState ? "free" : (existing?.billingPlanId ?? "team");
const defaultSeatsIncluded = input.kind === "personal" || !hasStripeBillingState ? 1 : (existing?.billingSeatsIncluded ?? 5);
@@ -1133,12 +1200,6 @@ export const workspaceAppActions = {
defaultModel: existing?.defaultModel ?? "claude-sonnet-4",
autoImportRepos: existing?.autoImportRepos ?? 1,
repoImportStatus: existing?.repoImportStatus ?? "not_started",
githubConnectedAccount: input.githubLogin,
githubInstallationStatus: installationStatus,
githubSyncStatus: syncStatus,
githubInstallationId: input.installationId,
githubLastSyncLabel: lastSyncLabel,
githubLastSyncAt: existing?.githubLastSyncAt ?? null,
stripeCustomerId: existing?.stripeCustomerId ?? null,
stripeSubscriptionId: existing?.stripeSubscriptionId ?? null,
stripePriceId: existing?.stripePriceId ?? null,
@@ -1159,12 +1220,6 @@ export const workspaceAppActions = {
githubLogin: input.githubLogin,
githubAccountType: input.githubAccountType,
displayName: input.displayName,
githubConnectedAccount: input.githubLogin,
githubInstallationStatus: installationStatus,
githubSyncStatus: syncStatus,
githubInstallationId: input.installationId,
githubLastSyncLabel: lastSyncLabel,
githubLastSyncAt: existing?.githubLastSyncAt ?? null,
billingPlanId: defaultBillingPlanId,
billingSeatsIncluded: defaultSeatsIncluded,
billingPaymentMethodLabel: defaultPaymentMethodLabel,
@@ -1218,29 +1273,17 @@ export const workspaceAppActions = {
.run();
},
async markOrganizationSyncStarted(c: any, input: { label: string }): Promise<void> {
assertOrganizationWorkspace(c);
await c.db
.update(organizationProfile)
.set({
githubSyncStatus: "syncing",
githubLastSyncLabel: input.label,
updatedAt: Date.now(),
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
async applyOrganizationSyncCompleted(
c: any,
input: {
repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }>;
installationStatus: FoundryOrganization["github"]["installationStatus"];
lastSyncLabel: string;
},
): Promise<void> {
async applyOrganizationRepositoryCatalog(c: any, input: { repositories: Array<{ fullName: string; cloneUrl: string; private: boolean }> }): Promise<void> {
assertOrganizationWorkspace(c);
const now = Date.now();
const nextRepoIds = new Set(input.repositories.map((repository) => repoIdFromRemote(repository.cloneUrl)));
const existing = await c.db.select({ repoId: repos.repoId }).from(repos).all();
for (const row of existing) {
if (nextRepoIds.has(row.repoId)) {
continue;
}
await c.db.delete(repos).where(eq(repos.repoId, row.repoId)).run();
}
for (const repository of input.repositories) {
const remoteUrl = repository.cloneUrl;
await c.db
@@ -1260,31 +1303,6 @@ export const workspaceAppActions = {
})
.run();
}
await c.db
.update(organizationProfile)
.set({
githubInstallationStatus: input.installationStatus,
githubSyncStatus: "synced",
githubLastSyncLabel: input.lastSyncLabel,
githubLastSyncAt: now,
updatedAt: now,
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
async markOrganizationSyncFailed(c: any, input: { message: string; installationStatus: FoundryOrganization["github"]["installationStatus"] }): Promise<void> {
assertOrganizationWorkspace(c);
await c.db
.update(organizationProfile)
.set({
githubInstallationStatus: input.installationStatus,
githubSyncStatus: "error",
githubLastSyncLabel: input.message,
updatedAt: Date.now(),
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
async applyOrganizationStripeCustomer(c: any, input: { customerId: string }): Promise<void> {
@@ -1413,76 +1431,4 @@ export const workspaceAppActions = {
.onConflictDoNothing()
.run();
},
async applyGithubInstallationCreated(c: any, input: { installationId: number }): Promise<void> {
assertOrganizationWorkspace(c);
await c.db
.update(organizationProfile)
.set({
githubInstallationId: input.installationId,
githubInstallationStatus: "connected",
updatedAt: Date.now(),
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
async applyGithubInstallationRemoved(c: any, _input: {}): Promise<void> {
assertOrganizationWorkspace(c);
await c.db
.update(organizationProfile)
.set({
githubInstallationId: null,
githubInstallationStatus: "install_required",
githubSyncStatus: "pending",
githubLastSyncLabel: "GitHub App installation removed",
updatedAt: Date.now(),
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
async applyGithubRepositoryChanges(c: any, input: { added: Array<{ fullName: string; private: boolean }>; removed: string[] }): Promise<void> {
assertOrganizationWorkspace(c);
const now = Date.now();
for (const repo of input.added) {
const remoteUrl = `https://github.com/${repo.fullName}.git`;
const repoId = repoIdFromRemote(remoteUrl);
await c.db
.insert(repos)
.values({
repoId,
remoteUrl,
createdAt: now,
updatedAt: now,
})
.onConflictDoUpdate({
target: repos.repoId,
set: {
remoteUrl,
updatedAt: now,
},
})
.run();
}
for (const fullName of input.removed) {
const remoteUrl = `https://github.com/${fullName}.git`;
const repoId = repoIdFromRemote(remoteUrl);
await c.db.delete(repos).where(eq(repos.repoId, repoId)).run();
}
const repoCount = (await c.db.select().from(repos).all()).length;
await c.db
.update(organizationProfile)
.set({
githubSyncStatus: "synced",
githubLastSyncLabel: `${repoCount} repositories synced`,
githubLastSyncAt: now,
updatedAt: now,
})
.where(eq(organizationProfile.id, PROFILE_ROW_ID))
.run();
},
};
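The reconcile step in `applyOrganizationRepositoryCatalog` above (delete rows absent from the incoming catalog, upsert everything in it) can be sketched as a pure function; this is an illustrative reduction, not code from the commit:

```typescript
// Illustrative plan computation for the catalog reconcile above:
// existing repo ids not present in the incoming catalog are deleted,
// and every incoming id is upserted.
function reconcileRepoIds(
	existing: string[],
	incoming: string[],
): { toDelete: string[]; toUpsert: string[] } {
	const next = new Set(incoming);
	return {
		toDelete: existing.filter((id) => !next.has(id)),
		toUpsert: incoming,
	};
}
```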


@@ -0,0 +1,5 @@
import { db } from "rivetkit/db/drizzle";
import * as schema from "./schema.js";
import migrations from "./migrations.js";
export const organizationDb = db({ schema, migrations });


@@ -0,0 +1,6 @@
import { defineConfig } from "rivetkit/db/drizzle";
export default defineConfig({
out: "./src/actors/organization/db/drizzle",
schema: "./src/actors/organization/db/schema.ts",
});


@@ -0,0 +1,116 @@
const journal = {
entries: [
{
idx: 0,
when: 1773356100000,
tag: "0000_organization_state",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`provider_profiles\` (
\`provider_id\` text PRIMARY KEY NOT NULL,
\`profile_json\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`repos\` (
\`repo_id\` text PRIMARY KEY NOT NULL,
\`remote_url\` text NOT NULL,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`task_lookup\` (
\`task_id\` text PRIMARY KEY NOT NULL,
\`repo_id\` text NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`organization_profile\` (
\`id\` text PRIMARY KEY NOT NULL,
\`kind\` text NOT NULL,
\`github_account_id\` text NOT NULL,
\`github_login\` text NOT NULL,
\`github_account_type\` text NOT NULL,
\`display_name\` text NOT NULL,
\`slug\` text NOT NULL,
\`primary_domain\` text NOT NULL,
\`default_model\` text NOT NULL,
\`auto_import_repos\` integer NOT NULL,
\`repo_import_status\` text NOT NULL,
\`stripe_customer_id\` text,
\`stripe_subscription_id\` text,
\`stripe_price_id\` text,
\`billing_plan_id\` text NOT NULL,
\`billing_status\` text NOT NULL,
\`billing_seats_included\` integer NOT NULL,
\`billing_trial_ends_at\` text,
\`billing_renewal_at\` text,
\`billing_payment_method_label\` text NOT NULL,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`organization_members\` (
\`id\` text PRIMARY KEY NOT NULL,
\`name\` text NOT NULL,
\`email\` text NOT NULL,
\`role\` text NOT NULL,
\`state\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`seat_assignments\` (
\`email\` text PRIMARY KEY NOT NULL,
\`created_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`organization_actor_issues\` (
\`actor_id\` text PRIMARY KEY NOT NULL,
\`actor_type\` text NOT NULL,
\`scope_id\` text,
\`scope_label\` text NOT NULL,
\`message\` text NOT NULL,
\`workflow_id\` text,
\`step_name\` text,
\`attempt\` integer,
\`will_retry\` integer DEFAULT 0 NOT NULL,
\`retry_delay_ms\` integer,
\`occurred_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`invoices\` (
\`id\` text PRIMARY KEY NOT NULL,
\`label\` text NOT NULL,
\`issued_at\` text NOT NULL,
\`amount_usd\` integer NOT NULL,
\`status\` text NOT NULL,
\`created_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`app_sessions\` (
\`id\` text PRIMARY KEY NOT NULL,
\`current_user_id\` text,
\`active_organization_id\` text,
\`starter_repo_status\` text NOT NULL,
\`starter_repo_starred_at\` integer,
\`starter_repo_skipped_at\` integer,
\`oauth_state\` text,
\`oauth_state_expires_at\` integer,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`stripe_lookup\` (
\`lookup_key\` text PRIMARY KEY NOT NULL,
\`organization_id\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
} as const,
};


@@ -31,12 +31,6 @@ export const organizationProfile = sqliteTable("organization_profile", {
defaultModel: text("default_model").notNull(),
autoImportRepos: integer("auto_import_repos").notNull(),
repoImportStatus: text("repo_import_status").notNull(),
githubConnectedAccount: text("github_connected_account").notNull(),
githubInstallationStatus: text("github_installation_status").notNull(),
githubSyncStatus: text("github_sync_status").notNull(),
githubInstallationId: integer("github_installation_id"),
githubLastSyncLabel: text("github_last_sync_label").notNull(),
githubLastSyncAt: integer("github_last_sync_at"),
stripeCustomerId: text("stripe_customer_id"),
stripeSubscriptionId: text("stripe_subscription_id"),
stripePriceId: text("stripe_price_id"),
@@ -64,6 +58,21 @@ export const seatAssignments = sqliteTable("seat_assignments", {
createdAt: integer("created_at").notNull(),
});
export const organizationActorIssues = sqliteTable("organization_actor_issues", {
actorId: text("actor_id").notNull().primaryKey(),
actorType: text("actor_type").notNull(),
scopeId: text("scope_id"),
scopeLabel: text("scope_label").notNull(),
message: text("message").notNull(),
workflowId: text("workflow_id"),
stepName: text("step_name"),
attempt: integer("attempt"),
willRetry: integer("will_retry").notNull().default(0),
retryDelayMs: integer("retry_delay_ms"),
occurredAt: integer("occurred_at").notNull(),
updatedAt: integer("updated_at").notNull(),
});
export const invoices = sqliteTable("invoices", {
id: text("id").notNull().primaryKey(),
label: text("label").notNull(),
@@ -76,14 +85,7 @@ export const invoices = sqliteTable("invoices", {
export const appSessions = sqliteTable("app_sessions", {
id: text("id").notNull().primaryKey(),
currentUserId: text("current_user_id"),
currentUserName: text("current_user_name"),
currentUserEmail: text("current_user_email"),
currentUserGithubLogin: text("current_user_github_login"),
currentUserRoleLabel: text("current_user_role_label"),
eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(),
activeOrganizationId: text("active_organization_id"),
githubAccessToken: text("github_access_token"),
githubScope: text("github_scope").notNull(),
starterRepoStatus: text("starter_repo_status").notNull(),
starterRepoStarredAt: integer("starter_repo_starred_at"),
starterRepoSkippedAt: integer("starter_repo_skipped_at"),


@@ -0,0 +1,33 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import { organizationDb } from "./db/db.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import {
WORKSPACE_QUEUE_NAMES as ORGANIZATION_QUEUE_NAMES,
runWorkspaceWorkflow as runOrganizationWorkflow,
workspaceActions as organizationActions,
} from "./actions.js";
const organizationConfig: any = {
db: organizationDb,
queues: Object.fromEntries(ORGANIZATION_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
actionTimeout: 5 * 60_000,
},
createState: (_c, workspaceId: string) => ({
workspaceId,
}),
actions: organizationActions,
run: workflow(runOrganizationWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "organization",
organizationId: c.state.workspaceId,
scopeId: c.state.workspaceId,
scopeLabel: `Organization ${c.state.workspaceId}`,
});
},
}),
};
export const organization = (actor as any)(organizationConfig);


@@ -1,176 +0,0 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import type { GitDriver } from "../../driver.js";
import { getActorRuntimeContext } from "../context.js";
import { getProject, selfProjectBranchSync } from "../handles.js";
import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js";
import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js";
import { parentLookupFromStack } from "../project/stack-model.js";
import { withRepoGitLock } from "../../services/repo-git-lock.js";
export interface ProjectBranchSyncInput {
workspaceId: string;
repoId: string;
repoPath: string;
intervalMs: number;
}
interface SetIntervalCommand {
intervalMs: number;
}
interface EnrichedBranchSnapshot {
branchName: string;
commitSha: string;
parentBranch: string | null;
trackedInStack: boolean;
diffStat: string | null;
hasUnpushed: boolean;
conflictsWithMain: boolean;
}
interface ProjectBranchSyncState extends PollingControlState {
workspaceId: string;
repoId: string;
repoPath: string;
}
const CONTROL = {
start: "project.branch_sync.control.start",
stop: "project.branch_sync.control.stop",
setInterval: "project.branch_sync.control.set_interval",
force: "project.branch_sync.control.force",
} as const;
async function enrichBranches(workspaceId: string, repoId: string, repoPath: string, git: GitDriver): Promise<EnrichedBranchSnapshot[]> {
return await withRepoGitLock(repoPath, async () => {
await git.fetch(repoPath);
const branches = await git.listRemoteBranches(repoPath);
const { driver } = getActorRuntimeContext();
const stackEntries = await driver.stack.listStack(repoPath).catch(() => []);
const parentByBranch = parentLookupFromStack(stackEntries);
const enriched: EnrichedBranchSnapshot[] = [];
const baseRef = await git.remoteDefaultBaseRef(repoPath);
const baseSha = await git.revParse(repoPath, baseRef).catch(() => "");
for (const branch of branches) {
let branchDiffStat: string | null = null;
let branchHasUnpushed = false;
let branchConflicts = false;
try {
branchDiffStat = await git.diffStatForBranch(repoPath, branch.branchName);
} catch (error) {
logActorWarning("project-branch-sync", "diffStatForBranch failed", {
workspaceId,
repoId,
branchName: branch.branchName,
error: resolveErrorMessage(error),
});
branchDiffStat = null;
}
try {
const headSha = await git.revParse(repoPath, `origin/${branch.branchName}`);
branchHasUnpushed = Boolean(baseSha && headSha && headSha !== baseSha);
} catch (error) {
logActorWarning("project-branch-sync", "revParse failed", {
workspaceId,
repoId,
branchName: branch.branchName,
error: resolveErrorMessage(error),
});
branchHasUnpushed = false;
}
try {
branchConflicts = await git.conflictsWithMain(repoPath, branch.branchName);
} catch (error) {
logActorWarning("project-branch-sync", "conflictsWithMain failed", {
workspaceId,
repoId,
branchName: branch.branchName,
error: resolveErrorMessage(error),
});
branchConflicts = false;
}
enriched.push({
branchName: branch.branchName,
commitSha: branch.commitSha,
parentBranch: parentByBranch.get(branch.branchName) ?? null,
trackedInStack: parentByBranch.has(branch.branchName),
diffStat: branchDiffStat,
hasUnpushed: branchHasUnpushed,
conflictsWithMain: branchConflicts,
});
}
return enriched;
});
}
async function pollBranches(c: { state: ProjectBranchSyncState }): Promise<void> {
const { driver } = getActorRuntimeContext();
const enrichedItems = await enrichBranches(c.state.workspaceId, c.state.repoId, c.state.repoPath, driver.git);
const parent = getProject(c, c.state.workspaceId, c.state.repoId);
await parent.applyBranchSyncResult({ items: enrichedItems, at: Date.now() });
}
export const projectBranchSync = actor({
queues: {
[CONTROL.start]: queue(),
[CONTROL.stop]: queue(),
[CONTROL.setInterval]: queue(),
[CONTROL.force]: queue(),
},
options: {
// Polling actors rely on timer-based wakeups; sleeping would pause the timer and stop polling.
noSleep: true,
},
createState: (_c, input: ProjectBranchSyncInput): ProjectBranchSyncState => ({
workspaceId: input.workspaceId,
repoId: input.repoId,
repoPath: input.repoPath,
intervalMs: input.intervalMs,
running: true,
}),
actions: {
async start(c): Promise<void> {
const self = selfProjectBranchSync(c);
await self.send(CONTROL.start, {}, { wait: true, timeout: 15_000 });
},
async stop(c): Promise<void> {
const self = selfProjectBranchSync(c);
await self.send(CONTROL.stop, {}, { wait: true, timeout: 15_000 });
},
async setIntervalMs(c, payload: SetIntervalCommand): Promise<void> {
const self = selfProjectBranchSync(c);
await self.send(CONTROL.setInterval, payload, { wait: true, timeout: 15_000 });
},
async force(c): Promise<void> {
const self = selfProjectBranchSync(c);
await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
},
},
run: workflow(async (ctx) => {
await runWorkflowPollingLoop<ProjectBranchSyncState>(ctx, {
loopName: "project-branch-sync-loop",
control: CONTROL,
onPoll: async (loopCtx) => {
try {
await pollBranches(loopCtx);
} catch (error) {
logActorWarning("project-branch-sync", "poll failed", {
error: resolveErrorMessage(error),
stack: resolveErrorStack(error),
});
}
},
});
}),
});
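The start/stop/setInterval control queues above all mutate the same `PollingControlState` shape. A minimal, self-contained sketch of those semantics (the reducer name is illustrative; the real actors drive this through `runWorkflowPollingLoop`):

```typescript
// Illustrative reducer for the polling control commands handled above:
// start/stop toggle `running`, setInterval changes the delay used for
// the next poll tick.
interface PollingControlState {
	running: boolean;
	intervalMs: number;
}

type PollingCommand =
	| { kind: "start" }
	| { kind: "stop" }
	| { kind: "setInterval"; intervalMs: number };

function applyControl(state: PollingControlState, command: PollingCommand): PollingControlState {
	switch (command.kind) {
		case "start":
			return { ...state, running: true };
		case "stop":
			return { ...state, running: false };
		case "setInterval":
			return { ...state, intervalMs: command.intervalMs };
	}
}
```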


@@ -1,96 +0,0 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import { getActorRuntimeContext } from "../context.js";
import { getProject, selfProjectPrSync } from "../handles.js";
import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js";
import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js";
import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
export interface ProjectPrSyncInput {
workspaceId: string;
repoId: string;
repoPath: string;
intervalMs: number;
}
interface SetIntervalCommand {
intervalMs: number;
}
interface ProjectPrSyncState extends PollingControlState {
workspaceId: string;
repoId: string;
repoPath: string;
}
const CONTROL = {
start: "project.pr_sync.control.start",
stop: "project.pr_sync.control.stop",
setInterval: "project.pr_sync.control.set_interval",
force: "project.pr_sync.control.force",
} as const;
async function pollPrs(c: { state: ProjectPrSyncState }): Promise<void> {
const { driver } = getActorRuntimeContext();
const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId);
const items = await driver.github.listPullRequests(c.state.repoPath, { githubToken: auth?.githubToken ?? null });
const parent = getProject(c, c.state.workspaceId, c.state.repoId);
await parent.applyPrSyncResult({ items, at: Date.now() });
}
export const projectPrSync = actor({
queues: {
[CONTROL.start]: queue(),
[CONTROL.stop]: queue(),
[CONTROL.setInterval]: queue(),
[CONTROL.force]: queue(),
},
options: {
// Polling actors rely on timer-based wakeups; sleeping would pause the timer and stop polling.
noSleep: true,
},
createState: (_c, input: ProjectPrSyncInput): ProjectPrSyncState => ({
workspaceId: input.workspaceId,
repoId: input.repoId,
repoPath: input.repoPath,
intervalMs: input.intervalMs,
running: true,
}),
actions: {
async start(c): Promise<void> {
const self = selfProjectPrSync(c);
await self.send(CONTROL.start, {}, { wait: true, timeout: 15_000 });
},
async stop(c): Promise<void> {
const self = selfProjectPrSync(c);
await self.send(CONTROL.stop, {}, { wait: true, timeout: 15_000 });
},
async setIntervalMs(c, payload: SetIntervalCommand): Promise<void> {
const self = selfProjectPrSync(c);
await self.send(CONTROL.setInterval, payload, { wait: true, timeout: 15_000 });
},
async force(c): Promise<void> {
const self = selfProjectPrSync(c);
await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
},
},
run: workflow(async (ctx) => {
await runWorkflowPollingLoop<ProjectPrSyncState>(ctx, {
loopName: "project-pr-sync-loop",
control: CONTROL,
onPoll: async (loopCtx) => {
try {
await pollPrs(loopCtx);
} catch (error) {
logActorWarning("project-pr-sync", "poll failed", {
error: resolveErrorMessage(error),
stack: resolveErrorStack(error),
});
}
},
});
}),
});


@@ -1,6 +0,0 @@
import { defineConfig } from "rivetkit/db/drizzle";
export default defineConfig({
out: "./src/actors/project/db/drizzle",
schema: "./src/actors/project/db/schema.ts",
});


@@ -1,27 +0,0 @@
CREATE TABLE `branches` (
`branch_name` text PRIMARY KEY NOT NULL,
`commit_sha` text NOT NULL,
`worktree_path` text,
`parent_branch` text,
`diff_stat` text,
`has_unpushed` integer,
`conflicts_with_main` integer,
`first_seen_at` integer,
`last_seen_at` integer,
`updated_at` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE `pr_cache` (
`branch_name` text PRIMARY KEY NOT NULL,
`pr_number` integer NOT NULL,
`state` text NOT NULL,
`title` text NOT NULL,
`pr_url` text,
`pr_author` text,
`is_draft` integer,
`ci_status` text,
`review_status` text,
`reviewer` text,
`fetched_at` integer,
`updated_at` integer NOT NULL
);


@@ -1,7 +0,0 @@
CREATE TABLE `repo_meta` (
`id` integer PRIMARY KEY NOT NULL,
`remote_url` text NOT NULL,
`updated_at` integer NOT NULL
);
--> statement-breakpoint
ALTER TABLE `branches` DROP COLUMN `worktree_path`;


@@ -1,6 +0,0 @@
CREATE TABLE `task_index` (
`task_id` text PRIMARY KEY NOT NULL,
`branch_name` text,
`created_at` integer NOT NULL,
`updated_at` integer NOT NULL
);


@@ -1 +0,0 @@
ALTER TABLE `branches` ADD `tracked_in_stack` integer;


@@ -1,192 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "03d97613-0108-4197-8660-5f2af5409fe6",
"prevId": "00000000-0000-0000-0000-000000000000",
"tables": {
"branches": {
"name": "branches",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"commit_sha": {
"name": "commit_sha",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"worktree_path": {
"name": "worktree_path",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_branch": {
"name": "parent_branch",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"diff_stat": {
"name": "diff_stat",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"has_unpushed": {
"name": "has_unpushed",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"conflicts_with_main": {
"name": "conflicts_with_main",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"first_seen_at": {
"name": "first_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"last_seen_at": {
"name": "last_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"pr_cache": {
"name": "pr_cache",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"pr_number": {
"name": "pr_number",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"state": {
"name": "state",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"pr_url": {
"name": "pr_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"pr_author": {
"name": "pr_author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"is_draft": {
"name": "is_draft",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"ci_status": {
"name": "ci_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"review_status": {
"name": "review_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"reviewer": {
"name": "reviewer",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"fetched_at": {
"name": "fetched_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@@ -1,216 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "e6d294b6-27ce-424b-a3b3-c100b42e628b",
"prevId": "03d97613-0108-4197-8660-5f2af5409fe6",
"tables": {
"branches": {
"name": "branches",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"commit_sha": {
"name": "commit_sha",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"parent_branch": {
"name": "parent_branch",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"diff_stat": {
"name": "diff_stat",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"has_unpushed": {
"name": "has_unpushed",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"conflicts_with_main": {
"name": "conflicts_with_main",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"first_seen_at": {
"name": "first_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"last_seen_at": {
"name": "last_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"pr_cache": {
"name": "pr_cache",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"pr_number": {
"name": "pr_number",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"state": {
"name": "state",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"pr_url": {
"name": "pr_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"pr_author": {
"name": "pr_author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"is_draft": {
"name": "is_draft",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"ci_status": {
"name": "ci_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"review_status": {
"name": "review_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"reviewer": {
"name": "reviewer",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"fetched_at": {
"name": "fetched_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"repo_meta": {
"name": "repo_meta",
"columns": {
"id": {
"name": "id",
"type": "integer",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"remote_url": {
"name": "remote_url",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

@ -1,254 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "ac89870f-1630-4a16-9606-7b1225f6da8a",
"prevId": "e6d294b6-27ce-424b-a3b3-c100b42e628b",
"tables": {
"branches": {
"name": "branches",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"commit_sha": {
"name": "commit_sha",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"parent_branch": {
"name": "parent_branch",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"diff_stat": {
"name": "diff_stat",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"has_unpushed": {
"name": "has_unpushed",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"conflicts_with_main": {
"name": "conflicts_with_main",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"first_seen_at": {
"name": "first_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"last_seen_at": {
"name": "last_seen_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"task_index": {
"name": "task_index",
"columns": {
"task_id": {
"name": "task_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"pr_cache": {
"name": "pr_cache",
"columns": {
"branch_name": {
"name": "branch_name",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"pr_number": {
"name": "pr_number",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"state": {
"name": "state",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"pr_url": {
"name": "pr_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"pr_author": {
"name": "pr_author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"is_draft": {
"name": "is_draft",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"ci_status": {
"name": "ci_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"review_status": {
"name": "review_status",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"reviewer": {
"name": "reviewer",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"fetched_at": {
"name": "fetched_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"repo_meta": {
"name": "repo_meta",
"columns": {
"id": {
"name": "id",
"type": "integer",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"remote_url": {
"name": "remote_url",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

@ -1,34 +0,0 @@
{
"version": "7",
"dialect": "sqlite",
"entries": [
{
"idx": 0,
"version": "6",
"when": 1770924376062,
"tag": "0000_stormy_the_hunter",
"breakpoints": true
},
{
"idx": 1,
"version": "6",
"when": 1770947252449,
"tag": "0001_wild_carlie_cooper",
"breakpoints": true
},
{
"idx": 2,
"version": "6",
"when": 1771276338465,
"tag": "0002_far_war_machine",
"breakpoints": true
},
{
"idx": 3,
"version": "6",
"when": 1771369000000,
"tag": "0003_busy_legacy",
"breakpoints": true
}
]
}

@ -1,81 +0,0 @@
// This file is generated by src/actors/_scripts/generate-actor-migrations.ts.
// Source of truth is drizzle-kit output under ./drizzle (meta/_journal.json + *.sql).
// Do not hand-edit this file.
const journal = {
entries: [
{
idx: 0,
when: 1770924376062,
tag: "0000_stormy_the_hunter",
breakpoints: true,
},
{
idx: 1,
when: 1770947252449,
tag: "0001_wild_carlie_cooper",
breakpoints: true,
},
{
idx: 2,
when: 1771276338465,
tag: "0002_far_war_machine",
breakpoints: true,
},
{
idx: 3,
when: 1771369000000,
tag: "0003_busy_legacy",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`branches\` (
\`branch_name\` text PRIMARY KEY NOT NULL,
\`commit_sha\` text NOT NULL,
\`worktree_path\` text,
\`parent_branch\` text,
\`diff_stat\` text,
\`has_unpushed\` integer,
\`conflicts_with_main\` integer,
\`first_seen_at\` integer,
\`last_seen_at\` integer,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`pr_cache\` (
\`branch_name\` text PRIMARY KEY NOT NULL,
\`pr_number\` integer NOT NULL,
\`state\` text NOT NULL,
\`title\` text NOT NULL,
\`pr_url\` text,
\`pr_author\` text,
\`is_draft\` integer,
\`ci_status\` text,
\`review_status\` text,
\`reviewer\` text,
\`fetched_at\` integer,
\`updated_at\` integer NOT NULL
);
`,
m0001: `CREATE TABLE \`repo_meta\` (
\`id\` integer PRIMARY KEY NOT NULL,
\`remote_url\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
ALTER TABLE \`branches\` DROP COLUMN \`worktree_path\`;`,
m0002: `CREATE TABLE \`task_index\` (
\`task_id\` text PRIMARY KEY NOT NULL,
\`branch_name\` text,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0003: `ALTER TABLE \`branches\` ADD \`tracked_in_stack\` integer;`,
} as const,
};

@ -1,28 +0,0 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import { projectDb } from "./db/db.js";
import { PROJECT_QUEUE_NAMES, projectActions, runProjectWorkflow } from "./actions.js";
export interface ProjectInput {
workspaceId: string;
repoId: string;
remoteUrl: string;
}
export const project = actor({
db: projectDb,
queues: Object.fromEntries(PROJECT_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
actionTimeout: 5 * 60_000,
},
createState: (_c, input: ProjectInput) => ({
workspaceId: input.workspaceId,
repoId: input.repoId,
remoteUrl: input.remoteUrl,
localPath: null as string | null,
syncActorsStarted: false,
taskIndexHydrated: false,
}),
actions: projectActions,
run: workflow(runProjectWorkflow),
});

@ -4,16 +4,18 @@ import { and, desc, eq, isNotNull, ne } from "drizzle-orm";
import { Loop } from "rivetkit/workflow";
import type { AgentType, TaskRecord, TaskSummary, ProviderId, RepoOverview, RepoStackAction, RepoStackActionResult } from "@sandbox-agent/foundry-shared";
import { getActorRuntimeContext } from "../context.js";
import { getTask, getOrCreateTask, getOrCreateHistory, getOrCreateProjectBranchSync, getOrCreateProjectPrSync, selfProject } from "../handles.js";
import { getOrCreateGithubState, getTask, getOrCreateTask, getOrCreateHistory, selfRepository } from "../handles.js";
import { isActorNotFoundError, logActorWarning, resolveErrorMessage } from "../logging.js";
import { foundryRepoClonePath } from "../../services/foundry-paths.js";
import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
import { expectQueueResponse } from "../../services/queue.js";
import { withRepoGitLock } from "../../services/repo-git-lock.js";
import { branches, taskIndex, prCache, repoMeta } from "./db/schema.js";
import { branches, taskIndex, repoMeta } from "./db/schema.js";
import { deriveFallbackTitle } from "../../services/create-flow.js";
import { normalizeBaseBranchName } from "../../integrations/git-spice/index.js";
import { parentLookupFromStack } from "./stack-model.js";
import { sortBranchesForOverview } from "./stack-model.js";
import { taskWorkflowQueueName } from "../task/workflow/index.js";
interface EnsureProjectCommand {
remoteUrl: string;
@ -24,6 +26,7 @@ interface EnsureProjectResult {
}
interface CreateTaskCommand {
taskId?: string | null;
task: string;
providerId: ProviderId;
agentType: AgentType | null;
@ -55,33 +58,9 @@ interface GetPullRequestForBranchCommand {
branchName: string;
}
interface PrSyncResult {
items: Array<{
number: number;
headRefName: string;
state: string;
title: string;
url?: string;
author?: string;
isDraft?: boolean;
ciStatus?: string | null;
reviewStatus?: string | null;
reviewer?: string | null;
}>;
at: number;
}
interface BranchSyncResult {
items: Array<{
branchName: string;
commitSha: string;
parentBranch?: string | null;
trackedInStack?: boolean;
diffStat?: string | null;
hasUnpushed?: boolean;
conflictsWithMain?: boolean;
}>;
at: number;
interface ApplyGithubPullRequestStateCommand {
branchName: string;
state: string;
}
interface RepoOverviewCommand {}
@ -98,8 +77,6 @@ const PROJECT_QUEUE_NAMES = [
"project.command.createTask",
"project.command.registerTaskBranch",
"project.command.runRepoStackAction",
"project.command.applyPrSyncResult",
"project.command.applyBranchSyncResult",
] as const;
type ProjectQueueName = (typeof PROJECT_QUEUE_NAMES)[number];
@ -119,18 +96,88 @@ async function ensureLocalClone(c: any, remoteUrl: string): Promise<string> {
return localPath;
}
async function ensureProjectSyncActors(c: any, localPath: string): Promise<void> {
if (c.state.syncActorsStarted) {
return;
async function refreshRepositoryBranches(c: any, localPath: string): Promise<void> {
const { driver } = getActorRuntimeContext();
const at = Date.now();
const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId);
const enrichedItems = await withRepoGitLock(localPath, async () => {
await driver.git.fetch(localPath, { githubToken: auth?.githubToken ?? null });
const remoteBranches = await driver.git.listRemoteBranches(localPath);
const stackEntries = await driver.stack.listStack(localPath).catch(() => []);
const parentByBranch = parentLookupFromStack(stackEntries);
const baseRef = await driver.git.remoteDefaultBaseRef(localPath);
const baseSha = await driver.git.revParse(localPath, baseRef).catch(() => "");
return await Promise.all(
remoteBranches.map(async (branch) => {
const [diffStat, headSha, conflictsWithMain] = await Promise.all([
driver.git.diffStatForBranch(localPath, branch.branchName).catch(() => null),
driver.git.revParse(localPath, `origin/${branch.branchName}`).catch(() => ""),
driver.git.conflictsWithMain(localPath, branch.branchName).catch(() => false),
]);
return {
branchName: branch.branchName,
commitSha: branch.commitSha,
parentBranch: parentByBranch.get(branch.branchName) ?? null,
trackedInStack: parentByBranch.has(branch.branchName),
diffStat,
hasUnpushed: Boolean(baseSha && headSha && headSha !== baseSha),
conflictsWithMain,
};
}),
);
});
const incoming = new Set(enrichedItems.map((item) => item.branchName));
for (const item of enrichedItems) {
const existing = await c.db
.select({
firstSeenAt: branches.firstSeenAt,
})
.from(branches)
.where(eq(branches.branchName, item.branchName))
.get();
await c.db
.insert(branches)
.values({
branchName: item.branchName,
commitSha: item.commitSha,
parentBranch: item.parentBranch,
trackedInStack: item.trackedInStack ? 1 : 0,
diffStat: item.diffStat ?? null,
hasUnpushed: item.hasUnpushed ? 1 : 0,
conflictsWithMain: item.conflictsWithMain ? 1 : 0,
firstSeenAt: existing?.firstSeenAt ?? at,
lastSeenAt: at,
updatedAt: at,
})
.onConflictDoUpdate({
target: branches.branchName,
set: {
commitSha: item.commitSha,
parentBranch: item.parentBranch,
trackedInStack: item.trackedInStack ? 1 : 0,
diffStat: item.diffStat ?? null,
hasUnpushed: item.hasUnpushed ? 1 : 0,
conflictsWithMain: item.conflictsWithMain ? 1 : 0,
firstSeenAt: existing?.firstSeenAt ?? at,
lastSeenAt: at,
updatedAt: at,
},
})
.run();
}
const prSync = await getOrCreateProjectPrSync(c, c.state.workspaceId, c.state.repoId, localPath, 30_000);
await prSync.start();
const branchSync = await getOrCreateProjectBranchSync(c, c.state.workspaceId, c.state.repoId, localPath, 5_000);
await branchSync.start();
c.state.syncActorsStarted = true;
const existingRows = await c.db.select({ branchName: branches.branchName }).from(branches).all();
for (const row of existingRows) {
if (incoming.has(row.branchName)) {
continue;
}
await c.db.delete(branches).where(eq(branches.branchName, row.branchName)).run();
}
}
async function deleteStaleTaskIndexRow(c: any, taskId: string): Promise<void> {
@ -222,7 +269,7 @@ async function ensureProjectReady(c: any): Promise<string> {
if (!c.state.localPath) {
throw new Error("project local repo is not initialized");
}
await ensureProjectSyncActors(c, c.state.localPath);
await refreshRepositoryBranches(c, c.state.localPath);
return c.state.localPath;
}
@ -231,7 +278,7 @@ async function ensureProjectReadyForRead(c: any): Promise<string> {
throw new Error("project remoteUrl is not initialized");
}
if (!c.state.localPath || !c.state.syncActorsStarted) {
if (!c.state.localPath) {
const result = await projectActions.ensure(c, { remoteUrl: c.state.remoteUrl });
const localPath = result?.localPath ?? c.state.localPath;
if (!localPath) {
@ -251,11 +298,7 @@ async function ensureTaskIndexHydratedForRead(c: any): Promise<void> {
}
async function forceProjectSync(c: any, localPath: string): Promise<void> {
const prSync = await getOrCreateProjectPrSync(c, c.state.workspaceId, c.state.repoId, localPath, 30_000);
await prSync.force();
const branchSync = await getOrCreateProjectBranchSync(c, c.state.workspaceId, c.state.repoId, localPath, 5_000);
await branchSync.force();
await refreshRepositoryBranches(c, localPath);
}
async function enrichTaskRecord(c: any, record: TaskRecord): Promise<TaskRecord> {
@ -274,20 +317,8 @@ async function enrichTaskRecord(c: any, record: TaskRecord): Promise<TaskRecord>
.get()
: null;
const pr =
branchName != null
? await c.db
.select({
prUrl: prCache.prUrl,
prAuthor: prCache.prAuthor,
ciStatus: prCache.ciStatus,
reviewStatus: prCache.reviewStatus,
reviewer: prCache.reviewer,
})
.from(prCache)
.where(eq(prCache.branchName, branchName))
.get()
: null;
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
const pr = branchName != null ? await githubState.getPullRequestForBranch({ repoId: c.state.repoId, branchName }) : null;
return {
...record,
@ -295,34 +326,14 @@ async function enrichTaskRecord(c: any, record: TaskRecord): Promise<TaskRecord>
hasUnpushed: br?.hasUnpushed != null ? String(br.hasUnpushed) : null,
conflictsWithMain: br?.conflictsWithMain != null ? String(br.conflictsWithMain) : null,
parentBranch: br?.parentBranch ?? null,
prUrl: pr?.prUrl ?? null,
prAuthor: pr?.prAuthor ?? null,
ciStatus: pr?.ciStatus ?? null,
reviewStatus: pr?.reviewStatus ?? null,
reviewer: pr?.reviewer ?? null,
prUrl: pr?.url ?? null,
prAuthor: null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
};
}
async function reinsertTaskIndexRow(c: any, taskId: string, branchName: string | null, updatedAt: number): Promise<void> {
const now = Date.now();
await c.db
.insert(taskIndex)
.values({
taskId,
branchName,
createdAt: updatedAt || now,
updatedAt: now,
})
.onConflictDoUpdate({
target: taskIndex.taskId,
set: {
branchName,
updatedAt: now,
},
})
.run();
}
async function ensureProjectMutation(c: any, cmd: EnsureProjectCommand): Promise<EnsureProjectResult> {
c.state.remoteUrl = cmd.remoteUrl;
const localPath = await ensureLocalClone(c, cmd.remoteUrl);
@ -343,7 +354,7 @@ async function ensureProjectMutation(c: any, cmd: EnsureProjectCommand): Promise
})
.run();
await ensureProjectSyncActors(c, localPath);
await refreshRepositoryBranches(c, localPath);
return { localPath };
}
@ -356,7 +367,8 @@ async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskR
const onBranch = cmd.onBranch?.trim() || null;
const initialBranchName = onBranch;
const initialTitle = onBranch ? deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? undefined) : null;
const taskId = randomUUID();
const taskId = cmd.taskId?.trim() || randomUUID();
const now = Date.now();
if (onBranch) {
await forceProjectSync(c, localPath);
@ -402,7 +414,6 @@ async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskR
}
if (!onBranch) {
const now = Date.now();
await c.db
.insert(taskIndex)
.values({
@ -415,19 +426,61 @@ async function createTaskMutation(c: any, cmd: CreateTaskCommand): Promise<TaskR
.run();
}
const created = await task.initialize({ providerId: cmd.providerId });
await task.send(
taskWorkflowQueueName("task.command.initialize"),
{ providerId: cmd.providerId },
{
wait: false,
},
);
const history = await getOrCreateHistory(c, c.state.workspaceId, c.state.repoId);
await history.append({
kind: "task.created",
taskId,
payload: {
repoId: c.state.repoId,
providerId: cmd.providerId,
},
});
void history
.append({
kind: "task.created",
taskId,
payload: {
repoId: c.state.repoId,
providerId: cmd.providerId,
},
})
.catch((error: unknown) => {
logActorWarning("project", "failed appending task.created history event", {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
taskId,
error: resolveErrorMessage(error),
});
});
return created;
return {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
repoRemote: c.state.remoteUrl,
taskId,
branchName: initialBranchName,
title: initialTitle,
task: cmd.task,
providerId: cmd.providerId,
status: "init_enqueue_provision",
statusMessage: "provision queued",
activeSandboxId: null,
activeSessionId: null,
sandboxes: [],
agentType: cmd.agentType ?? null,
prSubmitted: false,
diffStat: null,
hasUnpushed: null,
conflictsWithMain: null,
parentBranch: null,
prUrl: null,
prAuthor: null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
createdAt: now,
updatedAt: now,
} satisfies TaskRecord;
}
async function registerTaskBranchMutation(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> {
@ -661,135 +714,6 @@ async function runRepoStackActionMutation(c: any, cmd: RunRepoStackActionCommand
};
}
async function applyPrSyncResultMutation(c: any, body: PrSyncResult): Promise<void> {
await c.db.delete(prCache).run();
for (const item of body.items) {
await c.db
.insert(prCache)
.values({
branchName: item.headRefName,
prNumber: item.number,
state: item.state,
title: item.title,
prUrl: item.url ?? null,
prAuthor: item.author ?? null,
isDraft: item.isDraft ? 1 : 0,
ciStatus: item.ciStatus ?? null,
reviewStatus: item.reviewStatus ?? null,
reviewer: item.reviewer ?? null,
fetchedAt: body.at,
updatedAt: body.at,
})
.onConflictDoUpdate({
target: prCache.branchName,
set: {
prNumber: item.number,
state: item.state,
title: item.title,
prUrl: item.url ?? null,
prAuthor: item.author ?? null,
isDraft: item.isDraft ? 1 : 0,
ciStatus: item.ciStatus ?? null,
reviewStatus: item.reviewStatus ?? null,
reviewer: item.reviewer ?? null,
fetchedAt: body.at,
updatedAt: body.at,
},
})
.run();
}
for (const item of body.items) {
if (item.state !== "MERGED" && item.state !== "CLOSED") {
continue;
}
const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.branchName, item.headRefName)).get();
if (!row) {
continue;
}
try {
const h = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId);
await h.archive({ reason: `PR ${item.state.toLowerCase()}` });
} catch (error) {
if (isStaleTaskReferenceError(error)) {
await deleteStaleTaskIndexRow(c, row.taskId);
logActorWarning("project", "pruned stale task index row during PR close archive", {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
taskId: row.taskId,
branchName: item.headRefName,
prState: item.state,
});
continue;
}
logActorWarning("project", "failed to auto-archive task after PR close", {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
taskId: row.taskId,
branchName: item.headRefName,
prState: item.state,
error: resolveErrorMessage(error),
});
}
}
}
async function applyBranchSyncResultMutation(c: any, body: BranchSyncResult): Promise<void> {
const incoming = new Set(body.items.map((item) => item.branchName));
for (const item of body.items) {
const existing = await c.db
.select({
firstSeenAt: branches.firstSeenAt,
})
.from(branches)
.where(eq(branches.branchName, item.branchName))
.get();
await c.db
.insert(branches)
.values({
branchName: item.branchName,
commitSha: item.commitSha,
parentBranch: item.parentBranch ?? null,
trackedInStack: item.trackedInStack ? 1 : 0,
diffStat: item.diffStat ?? null,
hasUnpushed: item.hasUnpushed ? 1 : 0,
conflictsWithMain: item.conflictsWithMain ? 1 : 0,
firstSeenAt: existing?.firstSeenAt ?? body.at,
lastSeenAt: body.at,
updatedAt: body.at,
})
.onConflictDoUpdate({
target: branches.branchName,
set: {
commitSha: item.commitSha,
parentBranch: item.parentBranch ?? null,
trackedInStack: item.trackedInStack ? 1 : 0,
diffStat: item.diffStat ?? null,
hasUnpushed: item.hasUnpushed ? 1 : 0,
conflictsWithMain: item.conflictsWithMain ? 1 : 0,
firstSeenAt: existing?.firstSeenAt ?? body.at,
lastSeenAt: body.at,
updatedAt: body.at,
},
})
.run();
}
const existingRows = await c.db.select({ branchName: branches.branchName }).from(branches).all();
for (const row of existingRows) {
if (incoming.has(row.branchName)) {
continue;
}
await c.db.delete(branches).where(eq(branches.branchName, row.branchName)).run();
}
}
export async function runProjectWorkflow(ctx: any): Promise<void> {
await ctx.loop("project-command-loop", async (loopCtx: any) => {
const msg = await loopCtx.queue.next("next-project-command", {
@ -846,32 +770,13 @@ export async function runProjectWorkflow(ctx: any): Promise<void> {
return Loop.continue(undefined);
}
if (msg.name === "project.command.applyPrSyncResult") {
await loopCtx.step({
name: "project-apply-pr-sync-result",
timeout: 60_000,
run: async () => applyPrSyncResultMutation(loopCtx, msg.body as PrSyncResult),
});
await msg.complete({ ok: true });
return Loop.continue(undefined);
}
if (msg.name === "project.command.applyBranchSyncResult") {
await loopCtx.step({
name: "project-apply-branch-sync-result",
timeout: 60_000,
run: async () => applyBranchSyncResultMutation(loopCtx, msg.body as BranchSyncResult),
});
await msg.complete({ ok: true });
}
return Loop.continue(undefined);
});
}
export const projectActions = {
async ensure(c: any, cmd: EnsureProjectCommand): Promise<EnsureProjectResult> {
const self = selfProject(c);
const self = selfRepository(c);
return expectQueueResponse<EnsureProjectResult>(
await self.send(projectWorkflowQueueName("project.command.ensure"), cmd, {
wait: true,
@ -881,13 +786,86 @@ export const projectActions = {
},
async createTask(c: any, cmd: CreateTaskCommand): Promise<TaskRecord> {
const self = selfProject(c);
return expectQueueResponse<TaskRecord>(
await self.send(projectWorkflowQueueName("project.command.createTask"), cmd, {
wait: true,
timeout: 12 * 60_000,
}),
const self = selfRepository(c);
const taskId = cmd.taskId?.trim() || randomUUID();
const now = Date.now();
const initialBranchName = cmd.onBranch?.trim() || null;
const initialTitle = initialBranchName ? deriveFallbackTitle(cmd.task, cmd.explicitTitle ?? undefined) : null;
const localPath = c.state.localPath ?? foundryRepoClonePath(getActorRuntimeContext().config, c.state.workspaceId, c.state.repoId);
await getOrCreateTask(c, c.state.workspaceId, c.state.repoId, taskId, {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
taskId,
repoRemote: c.state.remoteUrl,
repoLocalPath: localPath,
branchName: initialBranchName,
title: initialTitle,
task: cmd.task,
providerId: cmd.providerId,
agentType: cmd.agentType,
explicitTitle: initialBranchName ? null : cmd.explicitTitle,
explicitBranchName: initialBranchName ? null : cmd.explicitBranchName,
initialPrompt: cmd.initialPrompt,
createdAt: now,
updatedAt: now,
});
await c.db
.insert(taskIndex)
.values({
taskId,
branchName: initialBranchName,
createdAt: now,
updatedAt: now,
})
.onConflictDoUpdate({
target: taskIndex.taskId,
set: {
branchName: initialBranchName,
updatedAt: now,
},
})
.run();
await self.send(
projectWorkflowQueueName("project.command.createTask"),
{
...cmd,
taskId,
},
{
wait: false,
},
);
return {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
repoRemote: c.state.remoteUrl,
taskId,
branchName: initialBranchName,
title: initialTitle,
task: cmd.task,
providerId: cmd.providerId,
status: "init_enqueue_provision",
statusMessage: "provision queued",
activeSandboxId: null,
activeSessionId: null,
sandboxes: [],
agentType: cmd.agentType ?? null,
prSubmitted: false,
diffStat: null,
hasUnpushed: null,
conflictsWithMain: null,
parentBranch: null,
prUrl: null,
prAuthor: null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
createdAt: now,
updatedAt: now,
} satisfies TaskRecord;
},
async listReservedBranches(c: any, _cmd?: ListReservedBranchesCommand): Promise<string[]> {
@ -899,7 +877,7 @@ export const projectActions = {
},
async registerTaskBranch(c: any, cmd: RegisterTaskBranchCommand): Promise<{ branchName: string; headSha: string }> {
const self = selfProject(c);
const self = selfRepository(c);
return expectQueueResponse<{ branchName: string; headSha: string }>(
await self.send(projectWorkflowQueueName("project.command.registerTaskBranch"), cmd, {
wait: true,
@ -909,7 +887,7 @@ export const projectActions = {
},
async hydrateTaskIndex(c: any, cmd?: HydrateTaskIndexCommand): Promise<void> {
const self = selfProject(c);
const self = selfRepository(c);
await self.send(projectWorkflowQueueName("project.command.hydrateTaskIndex"), cmd ?? {}, {
wait: true,
timeout: 60_000,
@ -970,17 +948,7 @@ export const projectActions = {
const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.taskId, cmd.taskId)).get();
if (!row) {
try {
const h = getTask(c, c.state.workspaceId, c.state.repoId, cmd.taskId);
const record = await h.get();
await reinsertTaskIndexRow(c, cmd.taskId, record.branchName ?? null, record.updatedAt ?? Date.now());
return await enrichTaskRecord(c, record);
} catch (error) {
if (isStaleTaskReferenceError(error)) {
throw new Error(`Unknown task in repo ${c.state.repoId}: ${cmd.taskId}`);
}
throw error;
}
throw new Error(`Unknown task in repo ${c.state.repoId}: ${cmd.taskId}`);
}
try {
@ -1067,19 +1035,18 @@ export const projectActions = {
}
}
const prRows = await c.db
.select({
branchName: prCache.branchName,
prNumber: prCache.prNumber,
prState: prCache.state,
prUrl: prCache.prUrl,
ciStatus: prCache.ciStatus,
reviewStatus: prCache.reviewStatus,
reviewer: prCache.reviewer,
})
.from(prCache)
.all();
const prByBranch = new Map(prRows.map((row) => [row.branchName, row]));
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
const pullRequests = await githubState.listPullRequestsForRepository({ repoId: c.state.repoId });
const prByBranch = new Map(
pullRequests.map((row) => [
row.headRefName,
{
prNumber: row.number,
prState: row.state,
prUrl: row.url,
},
]),
);
const combinedRows = sortBranchesForOverview(
branchRowsRaw.map((row) => ({
@ -1109,9 +1076,9 @@ export const projectActions = {
prNumber: pr?.prNumber ?? null,
prState: pr?.prState ?? null,
prUrl: pr?.prUrl ?? null,
ciStatus: pr?.ciStatus ?? null,
reviewStatus: pr?.reviewStatus ?? null,
reviewer: pr?.reviewer ?? null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
firstSeenAt: row.firstSeenAt ?? null,
lastSeenAt: row.lastSeenAt ?? null,
updatedAt: Math.max(row.updatedAt, taskMeta?.updatedAt ?? 0),
@ -1129,33 +1096,60 @@ export const projectActions = {
};
},
async getPullRequestForBranch(c: any, cmd: GetPullRequestForBranchCommand): Promise<{ number: number; status: "draft" | "ready" } | null> {
async getPullRequestForBranch(
c: any,
cmd: GetPullRequestForBranchCommand,
): Promise<{ number: number; status: "draft" | "ready" | "closed" | "merged" } | null> {
const branchName = cmd.branchName?.trim();
if (!branchName) {
return null;
}
const pr = await c.db
.select({
prNumber: prCache.prNumber,
prState: prCache.state,
})
.from(prCache)
.where(eq(prCache.branchName, branchName))
.get();
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
const pr = await githubState.getPullRequestForBranch({
repoId: c.state.repoId,
branchName,
});
if (!pr?.prNumber) {
if (!pr?.number) {
return null;
}
return {
number: pr.prNumber,
status: pr.prState === "draft" ? "draft" : "ready",
number: pr.number,
status: pr.status,
};
},
async applyGithubPullRequestState(c: any, cmd: ApplyGithubPullRequestStateCommand): Promise<void> {
const branchName = cmd.branchName?.trim();
if (!branchName) {
return;
}
const normalizedState = cmd.state.trim().toUpperCase();
if (normalizedState !== "CLOSED" && normalizedState !== "MERGED") {
return;
}
const row = await c.db.select({ taskId: taskIndex.taskId }).from(taskIndex).where(eq(taskIndex.branchName, branchName)).get();
if (!row) {
return;
}
try {
const task = getTask(c, c.state.workspaceId, c.state.repoId, row.taskId);
await task.archive({ reason: `PR ${normalizedState.toLowerCase()}` });
} catch (error) {
if (isStaleTaskReferenceError(error)) {
await deleteStaleTaskIndexRow(c, row.taskId);
return;
}
throw error;
}
},
async runRepoStackAction(c: any, cmd: RunRepoStackActionCommand): Promise<RepoStackActionResult> {
const self = selfProject(c);
const self = selfRepository(c);
return expectQueueResponse<RepoStackActionResult>(
await self.send(projectWorkflowQueueName("project.command.runRepoStackAction"), cmd, {
wait: true,
@@ -1163,20 +1157,4 @@ export const projectActions = {
}),
);
},
async applyPrSyncResult(c: any, body: PrSyncResult): Promise<void> {
const self = selfProject(c);
await self.send(projectWorkflowQueueName("project.command.applyPrSyncResult"), body, {
wait: true,
timeout: 5 * 60_000,
});
},
async applyBranchSyncResult(c: any, body: BranchSyncResult): Promise<void> {
const self = selfProject(c);
await self.send(projectWorkflowQueueName("project.command.applyBranchSyncResult"), body, {
wait: true,
timeout: 5 * 60_000,
});
},
};

@@ -2,4 +2,4 @@ import { db } from "rivetkit/db/drizzle";
import * as schema from "./schema.js";
import migrations from "./migrations.js";
export const workspaceDb = db({ schema, migrations });
export const repositoryDb = db({ schema, migrations });

@@ -0,0 +1,6 @@
import { defineConfig } from "rivetkit/db/drizzle";
export default defineConfig({
out: "./src/actors/repository/db/drizzle",
schema: "./src/actors/repository/db/schema.ts",
});

@@ -0,0 +1,42 @@
const journal = {
entries: [
{
idx: 0,
when: 1773356100001,
tag: "0000_repository_state",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`branches\` (
\`branch_name\` text PRIMARY KEY NOT NULL,
\`commit_sha\` text NOT NULL,
\`parent_branch\` text,
\`tracked_in_stack\` integer,
\`diff_stat\` text,
\`has_unpushed\` integer,
\`conflicts_with_main\` integer,
\`first_seen_at\` integer,
\`last_seen_at\` integer,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`repo_meta\` (
\`id\` integer PRIMARY KEY NOT NULL,
\`remote_url\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
--> statement-breakpoint
CREATE TABLE \`task_index\` (
\`task_id\` text PRIMARY KEY NOT NULL,
\`branch_name\` text,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
} as const,
};

@@ -21,21 +21,6 @@ export const repoMeta = sqliteTable("repo_meta", {
updatedAt: integer("updated_at").notNull(),
});
export const prCache = sqliteTable("pr_cache", {
branchName: text("branch_name").notNull().primaryKey(),
prNumber: integer("pr_number").notNull(),
state: text("state").notNull(),
title: text("title").notNull(),
prUrl: text("pr_url"),
prAuthor: text("pr_author"),
isDraft: integer("is_draft"),
ciStatus: text("ci_status"),
reviewStatus: text("review_status"),
reviewer: text("reviewer"),
fetchedAt: integer("fetched_at"),
updatedAt: integer("updated_at").notNull(),
});
export const taskIndex = sqliteTable("task_index", {
taskId: text("task_id").notNull().primaryKey(),
branchName: text("branch_name"),

@@ -0,0 +1,39 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import { repositoryDb } from "./db/db.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import { PROJECT_QUEUE_NAMES as REPOSITORY_QUEUE_NAMES, projectActions as repositoryActions, runProjectWorkflow as runRepositoryWorkflow } from "./actions.js";
export interface RepositoryInput {
workspaceId: string;
repoId: string;
remoteUrl: string;
}
const repositoryConfig: any = {
db: repositoryDb,
queues: Object.fromEntries(REPOSITORY_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
actionTimeout: 5 * 60_000,
},
createState: (_c, input: RepositoryInput) => ({
workspaceId: input.workspaceId,
repoId: input.repoId,
remoteUrl: input.remoteUrl,
localPath: null as string | null,
taskIndexHydrated: false,
}),
actions: repositoryActions,
run: workflow(runRepositoryWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "repository",
organizationId: c.state.workspaceId,
scopeId: c.state.repoId,
scopeLabel: `Repository ${c.state.repoId}`,
});
},
}),
};
export const repository = (actor as any)(repositoryConfig);

@@ -0,0 +1,160 @@
import type { WorkflowErrorEvent } from "rivetkit/workflow";
import type { FoundryActorRuntimeIssue, FoundryActorRuntimeType } from "@sandbox-agent/foundry-shared";
import { sql } from "drizzle-orm";
import { organizationActorIssues } from "./organization/db/schema.js";
import { getOrCreateOrganization } from "./handles.js";
export interface ActorRuntimeIssueRecord extends FoundryActorRuntimeIssue {}
interface NormalizedWorkflowIssue {
workflowId: string | null;
stepName: string | null;
attempt: number | null;
willRetry: boolean;
retryDelayMs: number | null;
message: string;
}
interface ReportWorkflowIssueInput {
actorType: FoundryActorRuntimeType;
scopeId?: string | null;
scopeLabel: string;
organizationId: string;
}
async function ensureOrganizationActorIssuesTable(c: any): Promise<void> {
await c.db.run(sql`
CREATE TABLE IF NOT EXISTS organization_actor_issues (
actor_id text PRIMARY KEY NOT NULL,
actor_type text NOT NULL,
scope_id text,
scope_label text NOT NULL,
message text NOT NULL,
workflow_id text,
step_name text,
attempt integer,
will_retry integer DEFAULT 0 NOT NULL,
retry_delay_ms integer,
occurred_at integer NOT NULL,
updated_at integer NOT NULL
)
`);
}
export async function upsertActorRuntimeIssue(c: any, issue: ActorRuntimeIssueRecord): Promise<void> {
await ensureOrganizationActorIssuesTable(c);
await c.db
.insert(organizationActorIssues)
.values({
actorId: issue.actorId,
actorType: issue.actorType,
scopeId: issue.scopeId,
scopeLabel: issue.scopeLabel,
message: issue.message,
workflowId: issue.workflowId,
stepName: issue.stepName,
attempt: issue.attempt,
willRetry: issue.willRetry ? 1 : 0,
retryDelayMs: issue.retryDelayMs,
occurredAt: issue.occurredAt,
updatedAt: issue.occurredAt,
})
.onConflictDoUpdate({
target: organizationActorIssues.actorId,
set: {
actorType: issue.actorType,
scopeId: issue.scopeId,
scopeLabel: issue.scopeLabel,
message: issue.message,
workflowId: issue.workflowId,
stepName: issue.stepName,
attempt: issue.attempt,
willRetry: issue.willRetry ? 1 : 0,
retryDelayMs: issue.retryDelayMs,
occurredAt: issue.occurredAt,
updatedAt: issue.occurredAt,
},
})
.run();
}
export async function listActorRuntimeIssues(c: any): Promise<ActorRuntimeIssueRecord[]> {
await ensureOrganizationActorIssuesTable(c);
const rows = await c.db.select().from(organizationActorIssues).orderBy(organizationActorIssues.occurredAt).all();
return rows
.map((row) => ({
actorId: row.actorId,
actorType: row.actorType as FoundryActorRuntimeType,
scopeId: row.scopeId ?? null,
scopeLabel: row.scopeLabel,
message: row.message,
workflowId: row.workflowId ?? null,
stepName: row.stepName ?? null,
attempt: row.attempt ?? null,
willRetry: Boolean(row.willRetry),
retryDelayMs: row.retryDelayMs ?? null,
occurredAt: row.occurredAt,
}))
.sort((left, right) => right.occurredAt - left.occurredAt);
}
function normalizeWorkflowIssue(event: WorkflowErrorEvent): NormalizedWorkflowIssue {
if ("step" in event) {
const error = event.step.error;
return {
workflowId: event.step.workflowId,
stepName: event.step.stepName,
attempt: event.step.attempt,
willRetry: event.step.willRetry,
retryDelayMs: event.step.retryDelay ?? null,
message: `${error.name}: ${error.message}`,
};
}
if ("rollback" in event) {
const error = event.rollback.error;
return {
workflowId: event.rollback.workflowId,
stepName: event.rollback.stepName,
attempt: null,
willRetry: false,
retryDelayMs: null,
message: `${error.name}: ${error.message}`,
};
}
const error = event.workflow.error;
return {
workflowId: event.workflow.workflowId,
stepName: null,
attempt: null,
willRetry: false,
retryDelayMs: null,
message: `${error.name}: ${error.message}`,
};
}
export async function reportWorkflowIssueToOrganization(c: any, event: WorkflowErrorEvent, input: ReportWorkflowIssueInput): Promise<void> {
const normalized = normalizeWorkflowIssue(event);
const issue: ActorRuntimeIssueRecord = {
actorId: c.actorId,
actorType: input.actorType,
scopeId: input.scopeId ?? null,
scopeLabel: input.scopeLabel,
message: normalized.message,
workflowId: normalized.workflowId,
stepName: normalized.stepName,
attempt: normalized.attempt,
willRetry: normalized.willRetry,
retryDelayMs: normalized.retryDelayMs,
occurredAt: Date.now(),
};
if (input.actorType === "organization" && input.organizationId === c.state.workspaceId) {
await upsertActorRuntimeIssue(c, issue);
return;
}
const organization = await getOrCreateOrganization(c, input.organizationId);
await organization.recordActorRuntimeIssue(issue);
}

@@ -8,6 +8,8 @@ import type {
ProcessInfo,
ProcessLogFollowQuery,
ProcessLogsResponse,
ProcessRunRequest,
ProcessRunResponse,
ProcessSignalQuery,
SessionEvent,
SessionRecord,
@@ -18,6 +20,7 @@ import { SandboxInstancePersistDriver } from "./persist.js";
import { getActorRuntimeContext } from "../context.js";
import { selfSandboxInstance } from "../handles.js";
import { logActorWarning, resolveErrorMessage } from "../logging.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import { expectQueueResponse } from "../../services/queue.js";
export interface SandboxInstanceInput {
@@ -454,7 +457,7 @@ async function runSandboxInstanceWorkflow(ctx: any): Promise<void> {
});
}
export const sandboxInstance = actor({
const sandboxInstanceConfig: any = {
db: sandboxInstanceDb,
queues: Object.fromEntries(SANDBOX_INSTANCE_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
@@ -477,6 +480,11 @@ export const sandboxInstance = actor({
return created;
},
async runProcess(c: any, request: ProcessRunRequest): Promise<ProcessRunResponse> {
const client = await getSandboxAgentClient(c);
return await client.runProcess(request);
},
async listProcesses(c: any): Promise<{ processes: ProcessInfo[] }> {
const client = await getSandboxAgentClient(c);
return await client.listProcesses();
@@ -632,5 +640,16 @@ export const sandboxInstance = actor({
return await derivePersistedSessionStatus(new SandboxInstancePersistDriver(c.db), command.sessionId);
},
},
run: workflow(runSandboxInstanceWorkflow),
});
run: workflow(runSandboxInstanceWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "sandbox_instance",
organizationId: c.state.workspaceId,
scopeId: c.state.sandboxId,
scopeLabel: `Sandbox ${c.state.sandboxId}`,
});
},
}),
};
export const sandboxInstance = (actor as any)(sandboxInstanceConfig);

@@ -3,6 +3,7 @@ import { workflow } from "rivetkit/workflow";
import type { ProviderId } from "@sandbox-agent/foundry-shared";
import { getTask, getSandboxInstance, selfTaskStatusSync } from "../handles.js";
import { logActorWarning, resolveErrorMessage, resolveErrorStack } from "../logging.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import { type PollingControlState, runWorkflowPollingLoop } from "../polling.js";
export interface TaskStatusSyncInput {
@@ -35,6 +36,11 @@ const CONTROL = {
force: "task.status_sync.control.force",
} as const;
function isActorNotFoundError(error: unknown): boolean {
const message = resolveErrorMessage(error).toLowerCase();
return message.includes("actor not found");
}
async function pollSessionStatus(c: { state: TaskStatusSyncState }): Promise<void> {
const sandboxInstance = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, c.state.sandboxId);
const status = await sandboxInstance.sessionStatus({ sessionId: c.state.sessionId });
@@ -47,7 +53,37 @@ async function pollSessionStatus(c: { state: TaskStatusSyncState }): Promise<voi
});
}
export const taskStatusSync = actor({
async function runTaskStatusSyncWorkflow(ctx: any): Promise<void> {
await runWorkflowPollingLoop(ctx, {
loopName: "task-status-sync-loop",
control: CONTROL,
onPoll: async (loopCtx) => {
const pollingCtx = loopCtx as any;
try {
await pollSessionStatus(pollingCtx);
} catch (error) {
if (isActorNotFoundError(error)) {
pollingCtx.state.running = false;
logActorWarning("task-status-sync", "stopping orphaned poller", {
workspaceId: pollingCtx.state.workspaceId,
repoId: pollingCtx.state.repoId,
taskId: pollingCtx.state.taskId,
sandboxId: pollingCtx.state.sandboxId,
sessionId: pollingCtx.state.sessionId,
error: resolveErrorMessage(error),
});
return;
}
logActorWarning("task-status-sync", "poll failed", {
error: resolveErrorMessage(error),
stack: resolveErrorStack(error),
});
}
},
});
}
const taskStatusSyncConfig: any = {
queues: {
[CONTROL.start]: queue(),
[CONTROL.stop]: queue(),
@@ -89,20 +125,16 @@ export const taskStatusSync = actor({
await self.send(CONTROL.force, {}, { wait: true, timeout: 5 * 60_000 });
},
},
run: workflow(async (ctx) => {
await runWorkflowPollingLoop<TaskStatusSyncState>(ctx, {
loopName: "task-status-sync-loop",
control: CONTROL,
onPoll: async (loopCtx) => {
try {
await pollSessionStatus(loopCtx);
} catch (error) {
logActorWarning("task-status-sync", "poll failed", {
error: resolveErrorMessage(error),
stack: resolveErrorStack(error),
});
}
},
});
run: workflow(runTaskStatusSyncWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "task_status_sync",
organizationId: c.state.workspaceId,
scopeId: c.state.sessionId,
scopeLabel: `Task status sync ${c.state.taskId}`,
});
},
}),
});
};
export const taskStatusSync = (actor as any)(taskStatusSyncConfig);

@@ -12,13 +12,15 @@ import type {
ProviderId,
} from "@sandbox-agent/foundry-shared";
import { expectQueueResponse } from "../../services/queue.js";
import { selfTask } from "../handles.js";
import { reportWorkflowIssueToOrganization } from "../runtime-issues.js";
import { getOrCreateGithubState, selfTask } from "../handles.js";
import { taskDb } from "./db/db.js";
import { getCurrentRecord } from "./workflow/common.js";
import {
changeWorkbenchModel,
closeWorkbenchSession,
createWorkbenchSession,
getWorkbenchTaskSummary,
getWorkbenchTask,
markWorkbenchUnread,
publishWorkbenchPr,
@@ -48,6 +50,8 @@ export interface TaskInput {
explicitTitle: string | null;
explicitBranchName: string | null;
initialPrompt: string | null;
createdAt?: number | null;
updatedAt?: number | null;
}
interface InitializeCommand {
@@ -107,7 +111,7 @@ interface TaskWorkbenchSessionCommand {
sessionId: string;
}
export const task = actor({
const taskConfig: any = {
db: taskDb,
queues: Object.fromEntries(TASK_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
@@ -127,17 +131,18 @@ export const task = actor({
explicitTitle: input.explicitTitle,
explicitBranchName: input.explicitBranchName,
initialPrompt: input.initialPrompt,
createdAt: input.createdAt ?? Date.now(),
updatedAt: input.updatedAt ?? Date.now(),
initialized: false,
previousStatus: null as string | null,
}),
actions: {
async initialize(c, cmd: InitializeCommand): Promise<TaskRecord> {
const self = selfTask(c);
const result = await self.send(taskWorkflowQueueName("task.command.initialize"), cmd ?? {}, {
wait: true,
timeout: 60_000,
await self.send(taskWorkflowQueueName("task.command.initialize"), cmd ?? {}, {
wait: false,
});
return expectQueueResponse<TaskRecord>(result);
return await getCurrentRecord({ db: c.db, state: c.state });
},
async provision(c, cmd: InitializeCommand): Promise<{ ok: true }> {
@@ -223,13 +228,31 @@ export const task = actor({
},
async get(c): Promise<TaskRecord> {
return await getCurrentRecord({ db: c.db, state: c.state });
const record = await getCurrentRecord({ db: c.db, state: c.state });
if (!record.branchName) {
return record;
}
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
const pr = await githubState.getPullRequestForBranch({
repoId: c.state.repoId,
branchName: record.branchName,
});
return {
...record,
prUrl: pr?.url ?? null,
};
},
async getWorkbench(c) {
return await getWorkbenchTask(c);
},
async getWorkbenchSummary(c) {
return await getWorkbenchTaskSummary(c);
},
async markWorkbenchUnread(c): Promise<void> {
const self = selfTask(c);
await self.send(
@@ -383,7 +406,18 @@ export const task = actor({
});
},
},
run: workflow(runTaskWorkflow),
});
run: workflow(runTaskWorkflow, {
onError: async (c: any, event) => {
await reportWorkflowIssueToOrganization(c, event, {
actorType: "task",
organizationId: c.state.workspaceId,
scopeId: c.state.taskId,
scopeLabel: `Task ${c.state.taskId}`,
});
},
}),
};
export const task = (actor as any)(taskConfig);
export { TASK_QUEUE_NAMES };

@@ -2,10 +2,11 @@
import { basename } from "node:path";
import { asc, eq } from "drizzle-orm";
import { getActorRuntimeContext } from "../context.js";
import { getOrCreateTaskStatusSync, getOrCreateProject, getOrCreateWorkspace, getSandboxInstance, selfTask } from "../handles.js";
import { resolveWorkspaceGithubAuth } from "../../services/github-auth.js";
import { getOrCreateGithubState, getOrCreateTaskStatusSync, getOrCreateRepository, getOrCreateOrganization, getSandboxInstance, selfTask } from "../handles.js";
import { logActorWarning, resolveErrorMessage } from "../logging.js";
import { task as taskTable, taskRuntime, taskWorkbenchSessions } from "./db/schema.js";
import { getCurrentRecord } from "./workflow/common.js";
import { pushActiveBranchActivity } from "./workflow/push.js";
const STATUS_SYNC_INTERVAL_MS = 1_000;
@@ -39,6 +40,71 @@ function agentKindForModel(model: string) {
return "Claude";
}
function taskLifecycleState(status: string) {
if (status === "error") {
return "error";
}
if (status === "archived") {
return "archived";
}
if (status === "killed") {
return "killed";
}
if (status === "running" || status === "idle" || status === "init_complete") {
return "ready";
}
return "starting";
}
function taskLifecycleLabel(status: string) {
switch (status) {
case "init_bootstrap_db":
return "Bootstrapping task state";
case "init_enqueue_provision":
return "Queueing sandbox provision";
case "init_ensure_name":
return "Preparing task name";
case "init_assert_name":
return "Confirming task name";
case "init_create_sandbox":
return "Creating sandbox";
case "init_ensure_agent":
return "Waiting for sandbox agent";
case "init_start_sandbox_instance":
return "Starting sandbox runtime";
case "init_create_session":
return "Creating first session";
case "init_write_db":
return "Saving task state";
case "init_start_status_sync":
return "Starting task status sync";
case "init_complete":
return "Task initialized";
case "running":
return "Agent running";
case "idle":
return "Task idle";
case "archive_stop_status_sync":
return "Stopping task status sync";
case "archive_release_sandbox":
return "Releasing sandbox";
case "archive_finalize":
return "Finalizing archive";
case "archived":
return "Task archived";
case "kill_destroy_sandbox":
return "Destroying sandbox";
case "kill_finalize":
return "Finalizing task shutdown";
case "killed":
return "Task killed";
case "error":
return "Task error";
default:
return status.replaceAll("_", " ");
}
}
export function agentTypeForModel(model: string) {
if (model === "gpt-4o" || model === "o3") {
return "codex";
@@ -185,14 +251,13 @@ async function updateSessionMeta(c: any, sessionId: string, values: Record<strin
}
async function notifyWorkbenchUpdated(c: any): Promise<void> {
const workspace = await getOrCreateWorkspace(c, c.state.workspaceId);
if (typeof c?.client !== "function") {
return;
}
const workspace = await getOrCreateOrganization(c, c.state.workspaceId);
await workspace.notifyWorkbenchUpdated({});
}
function shellFragment(parts: string[]): string {
return parts.join(" && ");
}
async function executeInSandbox(
c: any,
params: {
@@ -202,14 +267,18 @@
label: string;
},
): Promise<{ exitCode: number; result: string }> {
const { providers } = getActorRuntimeContext();
const provider = providers.get(c.state.providerId);
return await provider.executeCommand({
workspaceId: c.state.workspaceId,
sandboxId: params.sandboxId,
command: `bash -lc ${JSON.stringify(shellFragment([`cd ${JSON.stringify(params.cwd)}`, params.command]))}`,
label: params.label,
const sandbox = getSandboxInstance(c, c.state.workspaceId, c.state.providerId, params.sandboxId);
const result = await sandbox.runProcess({
command: "bash",
args: ["-lc", params.command],
cwd: params.cwd,
timeoutMs: 120_000,
maxOutputBytes: 1024 * 1024 * 4,
});
return {
exitCode: typeof result.exitCode === "number" ? result.exitCode : result.timedOut ? 124 : 1,
result: [result.stdout ?? "", result.stderr ?? ""].filter(Boolean).join(""),
};
}
function parseGitStatus(output: string): Array<{ path: string; type: "M" | "A" | "D" }> {
@@ -409,7 +478,7 @@ async function readPullRequestSummary(c: any, branchName: string | null) {
}
try {
const project = await getOrCreateProject(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote);
const project = await getOrCreateRepository(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote);
return await project.getPullRequestForBranch({ branchName });
} catch {
return null;
@@ -428,6 +497,71 @@ export async function ensureWorkbenchSeeded(c: any): Promise<any> {
return record;
}
async function buildWorkbenchTabsSummary(c: any, record: any): Promise<any[]> {
const sessions = await listSessionMetaRows(c);
return sessions.map((meta) => {
const status =
record.activeSessionId !== meta.sessionId
? "idle"
: record.status === "error"
? "error"
: record.status === "running"
? "running"
: "idle";
return {
id: meta.id,
sessionId: meta.sessionId,
sessionName: meta.sessionName,
agent: agentKindForModel(meta.model),
model: meta.model,
status,
thinkingSinceMs: status === "running" ? (meta.thinkingSinceMs ?? null) : null,
unread: Boolean(meta.unread),
created: Boolean(meta.created),
draft: {
text: meta.draftText ?? "",
attachments: Array.isArray(meta.draftAttachments) ? meta.draftAttachments : [],
updatedAtMs: meta.draftUpdatedAtMs ?? null,
},
transcript: [],
};
});
}
async function buildWorkbenchTaskPayload(
c: any,
record: any,
tabs: any[],
gitState: { fileChanges: any[]; diffs: Record<string, string>; fileTree: any[] },
): Promise<any> {
return {
id: c.state.taskId,
repoId: c.state.repoId,
title: record.title ?? "New Task",
status: record.status === "archived" ? "archived" : record.status === "running" ? "running" : record.status === "idle" ? "idle" : "new",
lifecycle: {
code: record.status,
state: taskLifecycleState(record.status),
label: taskLifecycleLabel(record.status),
message: record.statusMessage ?? null,
},
repoName: repoLabelFromRemote(c.state.repoRemote),
updatedAtMs: record.updatedAt,
branch: record.branchName,
pullRequest: await readPullRequestSummary(c, record.branchName),
tabs,
fileChanges: gitState.fileChanges,
diffs: gitState.diffs,
fileTree: gitState.fileTree,
minutesUsed: 0,
};
}
export async function getWorkbenchTaskSummary(c: any): Promise<any> {
const record = await ensureWorkbenchSeeded(c);
const tabs = await buildWorkbenchTabsSummary(c, record);
return await buildWorkbenchTaskPayload(c, record, tabs, {
fileChanges: [],
diffs: {},
fileTree: [],
});
}
export async function getWorkbenchTask(c: any): Promise<any> {
const record = await ensureWorkbenchSeeded(c);
const gitState = await collectWorkbenchGitState(c, record);
@@ -462,21 +596,7 @@ export async function getWorkbenchTask(c: any): Promise<any> {
});
}
return {
id: c.state.taskId,
repoId: c.state.repoId,
title: record.title ?? "New Task",
status: record.status === "archived" ? "archived" : record.status === "running" ? "running" : record.status === "idle" ? "idle" : "new",
repoName: repoLabelFromRemote(c.state.repoRemote),
updatedAtMs: record.updatedAt,
branch: record.branchName,
pullRequest: await readPullRequestSummary(c, record.branchName),
tabs,
fileChanges: gitState.fileChanges,
diffs: gitState.diffs,
fileTree: gitState.fileTree,
minutesUsed: 0,
};
return await buildWorkbenchTaskPayload(c, record, tabs, gitState);
}
export async function renameWorkbenchTask(c: any, value: string): Promise<void> {
@@ -540,7 +660,7 @@ export async function renameWorkbenchBranch(c: any, value: string): Promise<void
.run();
c.state.branchName = nextBranch;
const project = await getOrCreateProject(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote);
const project = await getOrCreateRepository(c, c.state.workspaceId, c.state.repoId, c.state.repoRemote);
await project.registerTaskBranch({
taskId: c.state.taskId,
branchName: nextBranch,
@@ -680,7 +800,16 @@ export async function sendWorkbenchMessage(c: any, sessionId: string, text: stri
});
await sync.setIntervalMs({ intervalMs: STATUS_SYNC_INTERVAL_MS });
await sync.start();
await sync.force();
void sync.force().catch((error: unknown) => {
logActorWarning("task.workbench", "session status sync force failed", {
workspaceId: c.state.workspaceId,
repoId: c.state.repoId,
taskId: c.state.taskId,
sandboxId: record.activeSandboxId,
sessionId,
error: resolveErrorMessage(error),
});
});
await notifyWorkbenchUpdated(c);
}
@@ -803,10 +932,17 @@ export async function publishWorkbenchPr(c: any): Promise<void> {
if (!record.branchName) {
throw new Error("cannot publish PR without a branch");
}
const { driver } = getActorRuntimeContext();
const auth = await resolveWorkspaceGithubAuth(c, c.state.workspaceId);
const created = await driver.github.createPr(c.state.repoLocalPath, record.branchName, record.title ?? c.state.task, undefined, {
githubToken: auth?.githubToken ?? null,
await pushActiveBranchActivity(c, {
reason: "publish_pr",
historyKind: "task.push.pr_publish",
commitMessage: record.title ?? c.state.task,
});
const githubState = await getOrCreateGithubState(c, c.state.workspaceId);
await githubState.createPullRequest({
repoId: c.state.repoId,
repoPath: c.state.repoLocalPath,
branchName: record.branchName,
title: record.title ?? c.state.task,
});
await c.db
.update(taskTable)

@@ -1,7 +1,7 @@
// @ts-nocheck
import { eq } from "drizzle-orm";
import type { TaskRecord, TaskStatus } from "@sandbox-agent/foundry-shared";
import { getOrCreateWorkspace } from "../../handles.js";
import { getOrCreateOrganization } from "../../handles.js";
import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js";
import { historyKey } from "../../keys.js";
@@ -83,8 +83,10 @@ export async function setTaskState(ctx: any, status: TaskStatus, statusMessage?:
.run();
}
const workspace = await getOrCreateWorkspace(ctx, ctx.state.workspaceId);
await workspace.notifyWorkbenchUpdated({});
if (typeof ctx?.client === "function") {
const workspace = await getOrCreateOrganization(ctx, ctx.state.workspaceId);
await workspace.notifyWorkbenchUpdated({});
}
}
export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
@@ -110,7 +112,34 @@ export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
.get();
if (!row) {
throw new Error(`Task not found: ${ctx.state.taskId}`);
return {
workspaceId: ctx.state.workspaceId,
repoId: ctx.state.repoId,
repoRemote: ctx.state.repoRemote,
taskId: ctx.state.taskId,
branchName: ctx.state.branchName ?? null,
title: ctx.state.title ?? null,
task: ctx.state.task,
providerId: ctx.state.providerId,
status: "init_enqueue_provision",
statusMessage: "provision queued",
activeSandboxId: null,
activeSessionId: null,
sandboxes: [],
agentType: ctx.state.agentType ?? null,
prSubmitted: false,
diffStat: null,
hasUnpushed: null,
conflictsWithMain: null,
parentBranch: null,
prUrl: null,
prAuthor: null,
ciStatus: null,
reviewStatus: null,
reviewer: null,
createdAt: ctx.state.createdAt ?? Date.now(),
updatedAt: ctx.state.updatedAt ?? ctx.state.createdAt ?? Date.now(),
} satisfies TaskRecord;
}
const sandboxes = await db
@@ -165,17 +194,19 @@ export async function getCurrentRecord(ctx: any): Promise<TaskRecord> {
}
export async function appendHistory(ctx: any, kind: string, payload: Record<string, unknown>): Promise<void> {
const client = ctx.client();
const history = await client.history.getOrCreate(historyKey(ctx.state.workspaceId, ctx.state.repoId), {
createWithInput: { workspaceId: ctx.state.workspaceId, repoId: ctx.state.repoId },
});
await history.append({
kind,
taskId: ctx.state.taskId,
branchName: ctx.state.branchName,
payload,
});
if (typeof ctx?.client === "function") {
const client = ctx.client();
const history = await client.history.getOrCreate(historyKey(ctx.state.workspaceId, ctx.state.repoId), {
createWithInput: { workspaceId: ctx.state.workspaceId, repoId: ctx.state.repoId },
});
await history.append({
kind,
taskId: ctx.state.taskId,
branchName: ctx.state.branchName,
payload,
});
const workspace = await getOrCreateWorkspace(ctx, ctx.state.workspaceId);
await workspace.notifyWorkbenchUpdated({});
const workspace = await getOrCreateOrganization(ctx, ctx.state.workspaceId);
await workspace.notifyWorkbenchUpdated({});
}
}

@@ -8,6 +8,7 @@ import {
initCompleteActivity,
initCreateSandboxActivity,
initCreateSessionActivity,
initEnqueueProvisionActivity,
initEnsureAgentActivity,
initEnsureNameActivity,
initExposeSandboxActivity,
@@ -56,7 +57,7 @@ const commandHandlers: Record<TaskQueueName, WorkflowHandler> = {
const body = msg.body;
await loopCtx.step("init-bootstrap-db", async () => initBootstrapDbActivity(loopCtx, body));
await loopCtx.removed("init-enqueue-provision", "step");
await loopCtx.step("init-enqueue-provision", async () => initEnqueueProvisionActivity(loopCtx, body));
await loopCtx.removed("init-dispatch-provision-v2", "step");
const currentRecord = await loopCtx.step("init-read-current-record", async () => getCurrentRecord(loopCtx));

@@ -3,7 +3,14 @@ import { desc, eq } from "drizzle-orm";
import { resolveCreateFlowDecision } from "../../../services/create-flow.js";
import { resolveWorkspaceGithubAuth } from "../../../services/github-auth.js";
import { getActorRuntimeContext } from "../../context.js";
import { getOrCreateTaskStatusSync, getOrCreateHistory, getOrCreateProject, getOrCreateSandboxInstance, getSandboxInstance, selfTask } from "../../handles.js";
import {
getOrCreateTaskStatusSync,
getOrCreateHistory,
getOrCreateRepository,
getOrCreateSandboxInstance,
getSandboxInstance,
selfTask,
} from "../../handles.js";
import { logActorWarning, resolveErrorMessage } from "../../logging.js";
import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js";
import { TASK_ROW_ID, appendHistory, buildAgentPrompt, collectErrorMessages, resolveErrorDetail, setTaskState } from "./common.js";
@@ -166,7 +173,7 @@ export async function initEnsureNameActivity(loopCtx: any): Promise<void> {
(branch: any) => branch.branchName,
);
const project = await getOrCreateProject(loopCtx, loopCtx.state.workspaceId, loopCtx.state.repoId, loopCtx.state.repoRemote);
const project = await getOrCreateRepository(loopCtx, loopCtx.state.workspaceId, loopCtx.state.repoId, loopCtx.state.repoRemote);
const reservedBranches = await project.listReservedBranches({});
const resolved = resolveCreateFlowDecision({
@@ -516,7 +523,16 @@ export async function initStartStatusSyncActivity(loopCtx: any, body: any, sandb
});
await sync.start();
await sync.force();
void sync.force().catch((error: unknown) => {
logActorWarning("task.init", "initial status sync force failed", {
workspaceId: loopCtx.state.workspaceId,
repoId: loopCtx.state.repoId,
taskId: loopCtx.state.taskId,
sandboxId: sandbox.sandboxId,
sessionId,
error: resolveErrorMessage(error),
});
});
}
export async function initCompleteActivity(loopCtx: any, body: any, sandbox: any, session: any): Promise<void> {
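The `void sync.force().catch(...)` change above makes the initial status sync non-blocking while still observing failures, so a rejected promise can never surface as an unhandled rejection. A minimal self-contained sketch of the pattern, with `flakyForce` as a hypothetical stand-in for `sync.force()`:

```typescript
// Stand-in for sync.force(): an async operation that may reject.
async function flakyForce(): Promise<void> {
  throw new Error("status sync unavailable");
}

let warned: string | null = null;

// Fire-and-forget: do not await, but always attach a catch handler so the
// rejection is logged instead of crashing the process.
void flakyForce().catch((error: unknown) => {
  warned = error instanceof Error ? error.message : String(error);
});
```

The workflow continues immediately; the handler records the failure asynchronously.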

View file

@ -1,18 +1,26 @@
// @ts-nocheck
import { eq } from "drizzle-orm";
import { getActorRuntimeContext } from "../../context.js";
import { resolveWorkspaceGithubAuth } from "../../../services/github-auth.js";
import { taskRuntime, taskSandboxes } from "../db/schema.js";
import { TASK_ROW_ID, appendHistory, getCurrentRecord } from "./common.js";
export interface PushActiveBranchOptions {
reason?: string | null;
historyKind?: string;
commitMessage?: string | null;
}
function wrapBashScript(script: string): string {
const encoded = Buffer.from(script, "utf8").toString("base64");
return `bash -lc "$(printf %s ${JSON.stringify(encoded)} | base64 -d)"`;
}
export async function pushActiveBranchActivity(loopCtx: any, options: PushActiveBranchOptions = {}): Promise<void> {
const record = await getCurrentRecord(loopCtx);
const activeSandboxId = record.activeSandboxId;
const branchName = loopCtx.state.branchName ?? record.branchName;
const commitMessage = (options.commitMessage?.trim() || loopCtx.state.title?.trim() || branchName || "Foundry update").slice(0, 240);
if (!activeSandboxId) {
throw new Error("cannot push: no active sandbox");
@ -30,6 +38,13 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive
const { providers } = getActorRuntimeContext();
const provider = providers.get(providerId);
const auth = await resolveWorkspaceGithubAuth(loopCtx, loopCtx.state.workspaceId);
const commandEnv =
auth?.githubToken && auth.githubToken.trim().length > 0
? {
GITHUB_TOKEN: auth.githubToken,
}
: undefined;
const now = Date.now();
await loopCtx.db
@ -47,15 +62,29 @@ export async function pushActiveBranchActivity(loopCtx: any, options: PushActive
const script = [
"set -euo pipefail",
`cd ${JSON.stringify(cwd)}`,
"export GIT_TERMINAL_PROMPT=0",
"git rev-parse --verify HEAD >/dev/null",
"git config credential.helper '!f() { echo username=x-access-token; echo password=${GH_TOKEN:-$GITHUB_TOKEN}; }; f'",
'git config user.email "foundry@local" >/dev/null 2>&1 || true',
'git config user.name "Foundry" >/dev/null 2>&1 || true',
'git config credential.helper ""',
"if ! git config --local --get http.https://github.com/.extraheader >/dev/null 2>&1; then",
' TOKEN="${GITHUB_TOKEN:-}"',
' if [ -z "$TOKEN" ]; then echo "missing github token for push" >&2; exit 1; fi',
" AUTH_HEADER=\"$(printf 'x-access-token:%s' \"$TOKEN\" | base64 | tr -d '\\n')\"",
' git config http.https://github.com/.extraheader "AUTHORIZATION: basic $AUTH_HEADER"',
"fi",
"git add -A",
"if ! git diff --cached --quiet --ignore-submodules --; then",
` git commit -m ${JSON.stringify(commitMessage)}`,
"fi",
`git push -u origin ${JSON.stringify(branchName)}`,
].join("; ");
].join("\n");
const result = await provider.executeCommand({
workspaceId: loopCtx.state.workspaceId,
sandboxId: activeSandboxId,
command: ["bash", "-lc", JSON.stringify(script)].join(" "),
command: wrapBashScript(script),
...(commandEnv ? { env: commandEnv } : {}),
label: `git push ${branchName}`,
});
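The `wrapBashScript` helper above base64-encodes the script body so the multi-line git sequence crosses the outer shell boundary with no quoting hazards; the inner bash decodes and executes it verbatim. A sketch of the round trip (`extractScript` is illustrative only, not part of the commit):

```typescript
// Same shape as the helper in the diff: encode the script, have bash decode it.
function wrapBashScript(script: string): string {
  const encoded = Buffer.from(script, "utf8").toString("base64");
  return `bash -lc "$(printf %s ${JSON.stringify(encoded)} | base64 -d)"`;
}

// Illustrative inverse: pull the base64 payload back out and decode it.
// Base64 never contains a double quote, so a simple match is safe here.
function extractScript(wrapped: string): string {
  const match = wrapped.match(/printf %s "([^"]+)"/);
  if (!match) throw new Error("no payload found");
  return Buffer.from(match[1], "base64").toString("utf8");
}
```

Because the payload is opaque base64, quotes, `$`, and newlines in the original script need no per-character escaping.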

View file

@ -1,6 +1,7 @@
// @ts-nocheck
import { eq } from "drizzle-orm";
import { getActorRuntimeContext } from "../../context.js";
import { getOrCreateGithubState } from "../../handles.js";
import { logActorWarning, resolveErrorMessage } from "../../logging.js";
import { resolveWorkspaceGithubAuth } from "../../../services/github-auth.js";
import { task as taskTable, taskRuntime, taskSandboxes } from "../db/schema.js";
@ -101,8 +102,12 @@ export async function idleSubmitPrActivity(loopCtx: any): Promise<void> {
historyKind: "task.push.auto",
});
const pr = await driver.github.createPr(loopCtx.state.repoLocalPath, loopCtx.state.branchName, loopCtx.state.title, undefined, {
githubToken: auth?.githubToken ?? null,
const githubState = await getOrCreateGithubState(loopCtx, loopCtx.state.workspaceId);
const pr = await githubState.createPullRequest({
repoId: loopCtx.state.repoId,
repoPath: loopCtx.state.repoLocalPath,
branchName: loopCtx.state.branchName,
title: loopCtx.state.title,
});
await db.update(taskTable).set({ prSubmitted: 1, updatedAt: Date.now() }).where(eq(taskTable.id, TASK_ROW_ID)).run();

View file

@ -0,0 +1,5 @@
import { db } from "rivetkit/db/drizzle";
import * as schema from "./schema.js";
import migrations from "./migrations.js";
export const userGithubDataDb = db({ schema, migrations });

View file

@ -0,0 +1,28 @@
const journal = {
entries: [
{
idx: 0,
when: 1773355200000,
tag: "0000_user_github_data",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`user_github_data\` (
\`id\` integer PRIMARY KEY NOT NULL,
\`github_user_id\` text NOT NULL,
\`github_login\` text NOT NULL,
\`display_name\` text NOT NULL,
\`email\` text NOT NULL,
\`access_token\` text NOT NULL,
\`scopes_json\` text NOT NULL,
\`eligible_organization_ids_json\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
} as const,
};

View file

@ -0,0 +1,13 @@
import { integer, sqliteTable, text } from "rivetkit/db/drizzle";
export const userGithubData = sqliteTable("user_github_data", {
id: integer("id").primaryKey(),
githubUserId: text("github_user_id").notNull(),
githubLogin: text("github_login").notNull(),
displayName: text("display_name").notNull(),
email: text("email").notNull(),
accessToken: text("access_token").notNull(),
scopesJson: text("scopes_json").notNull(),
eligibleOrganizationIdsJson: text("eligible_organization_ids_json").notNull(),
updatedAt: integer("updated_at").notNull(),
});

View file

@ -0,0 +1,114 @@
// @ts-nocheck
import { eq } from "drizzle-orm";
import { actor } from "rivetkit";
import { userGithubDataDb } from "./db/db.js";
import { userGithubData } from "./db/schema.js";
const PROFILE_ROW_ID = 1;
interface UserGithubDataInput {
userId: string;
}
function parseEligibleOrganizationIds(value: string): string[] {
try {
const parsed = JSON.parse(value);
if (!Array.isArray(parsed)) {
return [];
}
return parsed.filter((entry): entry is string => typeof entry === "string" && entry.length > 0);
} catch {
return [];
}
}
function encodeEligibleOrganizationIds(value: string[]): string {
return JSON.stringify([...new Set(value)]);
}
async function readProfileRow(c: any) {
return await c.db.select().from(userGithubData).where(eq(userGithubData.id, PROFILE_ROW_ID)).get();
}
export const userGithub = actor({
db: userGithubDataDb,
createState: (_c, input: UserGithubDataInput) => ({
userId: input.userId,
}),
actions: {
async upsert(
c,
input: {
githubUserId: string;
githubLogin: string;
displayName: string;
email: string;
accessToken: string;
scopes: string[];
eligibleOrganizationIds: string[];
},
): Promise<void> {
const now = Date.now();
await c.db
.insert(userGithubData)
.values({
id: PROFILE_ROW_ID,
githubUserId: input.githubUserId,
githubLogin: input.githubLogin,
displayName: input.displayName,
email: input.email,
accessToken: input.accessToken,
scopesJson: JSON.stringify(input.scopes),
eligibleOrganizationIdsJson: encodeEligibleOrganizationIds(input.eligibleOrganizationIds),
updatedAt: now,
})
.onConflictDoUpdate({
target: userGithubData.id,
set: {
githubUserId: input.githubUserId,
githubLogin: input.githubLogin,
displayName: input.displayName,
email: input.email,
accessToken: input.accessToken,
scopesJson: JSON.stringify(input.scopes),
eligibleOrganizationIdsJson: encodeEligibleOrganizationIds(input.eligibleOrganizationIds),
updatedAt: now,
},
})
.run();
},
async getProfile(c): Promise<{
userId: string;
githubUserId: string;
githubLogin: string;
displayName: string;
email: string;
eligibleOrganizationIds: string[];
} | null> {
const row = await readProfileRow(c);
if (!row) {
return null;
}
return {
userId: c.state.userId,
githubUserId: row.githubUserId,
githubLogin: row.githubLogin,
displayName: row.displayName,
email: row.email,
eligibleOrganizationIds: parseEligibleOrganizationIds(row.eligibleOrganizationIdsJson),
};
},
async getAuth(c): Promise<{ accessToken: string; scopes: string[] } | null> {
const row = await readProfileRow(c);
if (!row) {
return null;
}
return {
accessToken: row.accessToken,
scopes: JSON.parse(row.scopesJson) as string[],
};
},
},
});
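The actor above stores eligible organization ids as a JSON column and parses them back defensively: encoding deduplicates, and parsing tolerates malformed or non-array values by returning an empty list. A simplified round-trip sketch of those two helpers:

```typescript
// Deduplicate before persisting.
function encodeEligibleOrganizationIds(value: string[]): string {
  return JSON.stringify([...new Set(value)]);
}

// Never throw on bad stored data: anything that is not an array of
// non-empty strings collapses to [].
function parseEligibleOrganizationIds(value: string): string[] {
  try {
    const parsed = JSON.parse(value);
    if (!Array.isArray(parsed)) return [];
    return parsed.filter((entry): entry is string => typeof entry === "string" && entry.length > 0);
  } catch {
    return [];
  }
}
```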

View file

@ -1,6 +0,0 @@
import { defineConfig } from "rivetkit/db/drizzle";
export default defineConfig({
out: "./src/actors/workspace/db/drizzle",
schema: "./src/actors/workspace/db/schema.ts",
});

View file

@ -1,187 +0,0 @@
// This file is generated by src/actors/_scripts/generate-actor-migrations.ts.
// Source of truth is drizzle-kit output under ./drizzle (meta/_journal.json + *.sql).
// Do not hand-edit this file.
const journal = {
entries: [
{
idx: 0,
when: 1770924376525,
tag: "0000_rare_iron_man",
breakpoints: true,
},
{
idx: 1,
when: 1770947252912,
tag: "0001_sleepy_lady_deathstrike",
breakpoints: true,
},
{
idx: 2,
when: 1772668800000,
tag: "0002_tiny_silver_surfer",
breakpoints: true,
},
{
idx: 3,
when: 1773100800000,
tag: "0003_app_shell_organization_profile",
breakpoints: true,
},
{
idx: 4,
when: 1773100800001,
tag: "0004_app_shell_organization_members",
breakpoints: true,
},
{
idx: 5,
when: 1773100800002,
tag: "0005_app_shell_seat_assignments",
breakpoints: true,
},
{
idx: 6,
when: 1773100800003,
tag: "0006_app_shell_invoices",
breakpoints: true,
},
{
idx: 7,
when: 1773100800004,
tag: "0007_app_shell_sessions",
breakpoints: true,
},
{
idx: 8,
when: 1773100800005,
tag: "0008_app_shell_stripe_lookup",
breakpoints: true,
},
{
idx: 9,
when: 1773100800006,
tag: "0009_github_sync_status",
breakpoints: true,
},
{
idx: 10,
when: 1772928000000,
tag: "0010_app_session_starter_repo",
breakpoints: true,
},
],
} as const;
export default {
journal,
migrations: {
m0000: `CREATE TABLE \`provider_profiles\` (
\`provider_id\` text PRIMARY KEY NOT NULL,
\`profile_json\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0001: `CREATE TABLE \`repos\` (
\`repo_id\` text PRIMARY KEY NOT NULL,
\`remote_url\` text NOT NULL,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0002: `CREATE TABLE \`task_lookup\` (
\`task_id\` text PRIMARY KEY NOT NULL,
\`repo_id\` text NOT NULL
);
`,
m0003: `CREATE TABLE \`organization_profile\` (
\`id\` text PRIMARY KEY NOT NULL,
\`kind\` text NOT NULL,
\`github_account_id\` text NOT NULL,
\`github_login\` text NOT NULL,
\`github_account_type\` text NOT NULL,
\`display_name\` text NOT NULL,
\`slug\` text NOT NULL,
\`primary_domain\` text NOT NULL,
\`default_model\` text NOT NULL,
\`auto_import_repos\` integer NOT NULL,
\`repo_import_status\` text NOT NULL,
\`github_connected_account\` text NOT NULL,
\`github_installation_status\` text NOT NULL,
\`github_installation_id\` integer,
\`github_last_sync_label\` text NOT NULL,
\`stripe_customer_id\` text,
\`stripe_subscription_id\` text,
\`stripe_price_id\` text,
\`billing_plan_id\` text NOT NULL,
\`billing_status\` text NOT NULL,
\`billing_seats_included\` integer NOT NULL,
\`billing_trial_ends_at\` text,
\`billing_renewal_at\` text,
\`billing_payment_method_label\` text NOT NULL,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0004: `CREATE TABLE \`organization_members\` (
\`id\` text PRIMARY KEY NOT NULL,
\`name\` text NOT NULL,
\`email\` text NOT NULL,
\`role\` text NOT NULL,
\`state\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0005: `CREATE TABLE \`seat_assignments\` (
\`email\` text PRIMARY KEY NOT NULL,
\`created_at\` integer NOT NULL
);
`,
m0006: `CREATE TABLE \`invoices\` (
\`id\` text PRIMARY KEY NOT NULL,
\`label\` text NOT NULL,
\`issued_at\` text NOT NULL,
\`amount_usd\` integer NOT NULL,
\`status\` text NOT NULL,
\`created_at\` integer NOT NULL
);
`,
m0007: `CREATE TABLE \`app_sessions\` (
\`id\` text PRIMARY KEY NOT NULL,
\`current_user_id\` text,
\`current_user_name\` text,
\`current_user_email\` text,
\`current_user_github_login\` text,
\`current_user_role_label\` text,
\`eligible_organization_ids_json\` text NOT NULL,
\`active_organization_id\` text,
\`github_access_token\` text,
\`github_scope\` text NOT NULL,
\`starter_repo_status\` text NOT NULL,
\`starter_repo_starred_at\` integer,
\`starter_repo_skipped_at\` integer,
\`oauth_state\` text,
\`oauth_state_expires_at\` integer,
\`created_at\` integer NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0008: `CREATE TABLE \`stripe_lookup\` (
\`lookup_key\` text PRIMARY KEY NOT NULL,
\`organization_id\` text NOT NULL,
\`updated_at\` integer NOT NULL
);
`,
m0009: `ALTER TABLE \`organization_profile\` ADD COLUMN \`github_sync_status\` text NOT NULL DEFAULT 'pending';
ALTER TABLE \`organization_profile\` ADD COLUMN \`github_last_sync_at\` integer;
UPDATE \`organization_profile\`
SET \`github_sync_status\` = CASE
WHEN \`repo_import_status\` = 'ready' THEN 'synced'
WHEN \`repo_import_status\` = 'importing' THEN 'syncing'
ELSE 'pending'
END;
`,
m0010: `-- no-op: starter_repo_* columns are already present in m0007 app_sessions
`,
} as const,
};

View file

@ -1,17 +0,0 @@
import { actor, queue } from "rivetkit";
import { workflow } from "rivetkit/workflow";
import { workspaceDb } from "./db/db.js";
import { runWorkspaceWorkflow, WORKSPACE_QUEUE_NAMES, workspaceActions } from "./actions.js";
export const workspace = actor({
db: workspaceDb,
queues: Object.fromEntries(WORKSPACE_QUEUE_NAMES.map((name) => [name, queue()])),
options: {
actionTimeout: 5 * 60_000,
},
createState: (_c, workspaceId: string) => ({
workspaceId,
}),
actions: workspaceActions,
run: workflow(runWorkspaceWorkflow),
});

View file

@ -9,11 +9,19 @@ import type {
ProcessInfo,
ProcessLogFollowQuery,
ProcessLogsResponse,
ProcessRunRequest,
ProcessRunResponse,
ProcessSignalQuery,
SessionEvent,
SessionRecord,
} from "sandbox-agent";
import type { DaytonaClientOptions, DaytonaCreateSandboxOptions, DaytonaPreviewEndpoint, DaytonaSandbox } from "./integrations/daytona/client.js";
import type {
DaytonaClientOptions,
DaytonaCreateSandboxOptions,
DaytonaExecuteCommandResult,
DaytonaPreviewEndpoint,
DaytonaSandbox,
} from "./integrations/daytona/client.js";
import {
validateRemote,
ensureCloned,
@ -35,7 +43,7 @@ import {
gitSpiceSyncRepo,
gitSpiceTrackBranch,
} from "./integrations/git-spice/index.js";
import { listPullRequests, createPr, starRepository } from "./integrations/github/index.js";
import { listPullRequests, getPrInfo, createPr, starRepository } from "./integrations/github/index.js";
import { SandboxAgentClient } from "./integrations/sandbox-agent/client.js";
import { DaytonaClient } from "./integrations/daytona/client.js";
@ -69,6 +77,7 @@ export interface StackDriver {
export interface GithubDriver {
listPullRequests(repoPath: string, options?: { githubToken?: string | null }): Promise<PullRequestSnapshot[]>;
getPrInfo(repoPath: string, branchName: string, options?: { githubToken?: string | null }): Promise<PullRequestSnapshot | null>;
createPr(
repoPath: string,
headBranch: string,
@ -85,6 +94,7 @@ export interface SandboxAgentClientLike {
listSessions(request?: ListPageRequest): Promise<ListPage<SessionRecord>>;
listEvents(request: ListEventsRequest): Promise<ListPage<SessionEvent>>;
createProcess(request: ProcessCreateRequest): Promise<ProcessInfo>;
runProcess(request: ProcessRunRequest): Promise<ProcessRunResponse>;
listProcesses(): Promise<{ processes: ProcessInfo[] }>;
getProcessLogs(processId: string, query?: ProcessLogFollowQuery): Promise<ProcessLogsResponse>;
stopProcess(processId: string, query?: ProcessSignalQuery): Promise<ProcessInfo>;
@ -105,8 +115,8 @@ export interface DaytonaClientLike {
startSandbox(sandboxId: string, timeoutSeconds?: number): Promise<void>;
stopSandbox(sandboxId: string, timeoutSeconds?: number): Promise<void>;
deleteSandbox(sandboxId: string): Promise<void>;
executeCommand(sandboxId: string, command: string): Promise<{ exitCode: number; result: string }>;
getPreviewEndpoint(sandboxId: string, port: number): Promise<DaytonaPreviewEndpoint>;
executeCommand(sandboxId: string, command: string, env?: Record<string, string>, timeoutSeconds?: number): Promise<DaytonaExecuteCommandResult>;
}
export interface DaytonaDriver {
@ -154,6 +164,7 @@ export function createDefaultDriver(): BackendDriver {
},
github: {
listPullRequests,
getPrInfo,
createPr,
starRepository,
},

View file

@ -2,7 +2,7 @@ import { Hono } from "hono";
import { cors } from "hono/cors";
import { initActorRuntimeContext } from "./actors/context.js";
import { registry, resolveManagerPort } from "./actors/index.js";
import { workspaceKey } from "./actors/keys.js";
import { organizationKey } from "./actors/keys.js";
import { loadConfig } from "./config/backend.js";
import { createBackends, createNotificationService } from "./notifications/index.js";
import { createDefaultDriver } from "./driver.js";
@ -10,7 +10,7 @@ import { createProviderRegistry } from "./providers/index.js";
import { createClient } from "rivetkit/client";
import type { FoundryBillingPlanId } from "@sandbox-agent/foundry-shared";
import { createDefaultAppShellServices } from "./services/app-shell-runtime.js";
import { APP_SHELL_WORKSPACE_ID } from "./actors/workspace/app-shell.js";
import { APP_SHELL_ORGANIZATION_ID } from "./actors/organization/app-shell.js";
export interface BackendStartOptions {
host?: string;
@ -40,9 +40,13 @@ async function withRetries<T>(run: () => Promise<T>, attempts = 20, delayMs = 25
}
export async function startBackend(options: BackendStartOptions = {}): Promise<void> {
process.on("unhandledRejection", (reason) => {
console.error("foundry backend unhandled rejection", reason);
});
// sandbox-agent's agent plugins vary in which env var they read for OpenAI/Codex auth.
// Normalize to keep local dev + docker-compose simple.
if (!process.env.CODEX_API_KEY && process.env.OPENAI_API_KEY) {
// Prefer a real OpenAI API key over stale exported Codex auth tokens when both exist.
if (process.env.OPENAI_API_KEY) {
process.env.CODEX_API_KEY = process.env.OPENAI_API_KEY;
}
@ -137,8 +141,8 @@ export async function startBackend(options: BackendStartOptions = {}): Promise<v
const appWorkspace = async () =>
await withRetries(
async () =>
await actorClient.workspace.getOrCreate(workspaceKey(APP_SHELL_WORKSPACE_ID), {
createWithInput: APP_SHELL_WORKSPACE_ID,
await actorClient.organization.getOrCreate(organizationKey(APP_SHELL_ORGANIZATION_ID), {
createWithInput: APP_SHELL_ORGANIZATION_ID,
}),
);
@ -175,6 +179,31 @@ export async function startBackend(options: BackendStartOptions = {}): Promise<v
return Response.redirect(result.redirectTo, 302);
});
app.post("/api/rivet/app/auth/github/bootstrap", async (c) => {
const body = await c.req.json();
const accessToken = typeof body?.accessToken === "string" ? body.accessToken.trim() : "";
const organizationLogins = Array.isArray(body?.organizationLogins)
? body.organizationLogins
.filter((value): value is string => typeof value === "string")
.map((value) => value.trim())
.filter((value) => value.length > 0)
: null;
if (!accessToken) {
return c.text("Missing accessToken", 400);
}
const sessionId = await resolveSessionId(c);
const result = await appWorkspaceAction(
async (workspace) =>
await workspace.bootstrapAppGithubSession({
accessToken,
sessionId,
organizationLogins,
}),
);
c.header("x-foundry-session", result.sessionId);
return c.json(result);
});
app.post("/api/rivet/app/sign-out", async (c) => {
const sessionId = await resolveSessionId(c);
return c.json(await appWorkspaceAction(async (workspace) => await workspace.signOutApp({ sessionId })));
@ -334,6 +363,16 @@ export async function startBackend(options: BackendStartOptions = {}): Promise<v
app.post("/api/rivet/app/webhooks/stripe", handleStripeWebhook);
app.post("/api/rivet/app/stripe/webhook", handleStripeWebhook);
app.post("/api/rivet/app/webhooks/github", async (c) => {
const payload = await c.req.text();
await (await appWorkspace()).handleAppGithubWebhook({
payload,
signatureHeader: c.req.header("x-hub-signature-256") ?? null,
eventHeader: c.req.header("x-github-event") ?? null,
});
return c.json({ ok: true });
});
app.all("/api/rivet", forward);
app.all("/api/rivet/*", forward);
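The bootstrap endpoint above trims and filters `organizationLogins` before handing them to the actor. The same normalization as a standalone sketch (the function name is hypothetical; the endpoint inlines this logic):

```typescript
// Accept only an array; keep string entries, trim them, and drop empties.
// Any other shape (missing field, object, string) normalizes to null.
function normalizeOrganizationLogins(value: unknown): string[] | null {
  if (!Array.isArray(value)) return null;
  return value
    .filter((entry): entry is string => typeof entry === "string")
    .map((entry) => entry.trim())
    .filter((entry) => entry.length > 0);
}
```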

View file

@ -19,6 +19,11 @@ export interface DaytonaPreviewEndpoint {
token?: string;
}
export interface DaytonaExecuteCommandResult {
exitCode?: number;
result?: string;
}
export interface DaytonaClientOptions {
apiUrl?: string;
apiKey?: string;
@ -88,15 +93,6 @@ export class DaytonaClient {
await this.daytona.delete(sandbox);
}
async executeCommand(sandboxId: string, command: string): Promise<{ exitCode: number; result: string }> {
const sandbox = await this.daytona.get(sandboxId);
const response = await sandbox.process.executeCommand(command);
return {
exitCode: response.exitCode,
result: response.result,
};
}
async getPreviewEndpoint(sandboxId: string, port: number): Promise<DaytonaPreviewEndpoint> {
const sandbox = await this.daytona.get(sandboxId);
// Use signed preview URLs for server-to-sandbox communication.
@ -110,4 +106,13 @@ export class DaytonaClient {
token: preview.token,
};
}
async executeCommand(sandboxId: string, command: string, env?: Record<string, string>, timeoutSeconds?: number): Promise<DaytonaExecuteCommandResult> {
const sandbox = await this.daytona.get(sandboxId);
const response = await sandbox.process.executeCommand(command, undefined, env, timeoutSeconds);
return {
exitCode: response.exitCode,
result: response.result,
};
}
}
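The widened `DaytonaExecuteCommandResult` makes `exitCode` optional, and call sites in this commit treat a missing exit code as failure via `(result.exitCode ?? 1) !== 0`. A minimal sketch of that convention:

```typescript
// Mirrors the interface added in this commit.
interface DaytonaExecuteCommandResult {
  exitCode?: number;
  result?: string;
}

// A command only counts as successful when an exit code is present AND zero;
// an absent exit code is treated as failure rather than success.
function commandSucceeded(result: DaytonaExecuteCommandResult): boolean {
  return (result.exitCode ?? 1) === 0;
}
```

This keeps error handling conservative when the underlying API omits the exit code.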

View file

@ -15,7 +15,7 @@ interface GitAuthOptions {
}
function resolveGithubToken(options?: GitAuthOptions): string | null {
const token = options?.githubToken ?? process.env.GH_TOKEN ?? process.env.GITHUB_TOKEN ?? process.env.HF_GITHUB_TOKEN ?? process.env.HF_GH_TOKEN ?? null;
const token = options?.githubToken ?? process.env.GITHUB_TOKEN ?? null;
if (!token) return null;
const trimmed = token.trim();
return trimmed.length > 0 ? trimmed : null;
@ -35,8 +35,7 @@ function ensureAskpassScript(): string {
const content = [
"#!/bin/sh",
'prompt="$1"',
// Prefer GH_TOKEN/GITHUB_TOKEN but support HF_* aliases too.
'token="${GH_TOKEN:-${GITHUB_TOKEN:-${HF_GITHUB_TOKEN:-${HF_GH_TOKEN:-}}}}"',
'token="${GITHUB_TOKEN:-}"',
'case "$prompt" in',
' *Username*) echo "x-access-token" ;;',
' *Password*) echo "$token" ;;',
@ -58,9 +57,7 @@ function gitEnv(options?: GitAuthOptions): Record<string, string> {
const token = resolveGithubToken(options);
if (token) {
env.GIT_ASKPASS = ensureAskpassScript();
// Some tooling expects these vars; keep them aligned.
env.GITHUB_TOKEN = token;
env.GH_TOKEN = token;
}
return env;
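The `GIT_ASKPASS` helper written by `ensureAskpassScript` above answers git's interactive prompts from `GITHUB_TOKEN`: git calls the script once for the username prompt and once for the password prompt. A minimal sketch of that prompt handling (the token value is hypothetical):

```shell
# Same shape as the askpass script in the diff, wrapped in a function for illustration.
askpass() {
  prompt="$1"
  token="${GITHUB_TOKEN:-}"
  case "$prompt" in
    *Username*) echo "x-access-token" ;;
    *Password*) echo "$token" ;;
    *) echo "" ;;
  esac
}

GITHUB_TOKEN=tok123  # hypothetical token for illustration

askpass "Username for 'https://github.com':"   # prints x-access-token
askpass "Password for 'https://github.com':"   # prints tok123
```

Using `x-access-token` as the username matches GitHub's convention for token-based HTTPS auth.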

View file

@ -11,7 +11,6 @@ function ghEnv(options?: GithubAuthOptions): Record<string, string> {
const env: Record<string, string> = { ...(process.env as Record<string, string>) };
const token = options?.githubToken?.trim();
if (token) {
env.GH_TOKEN = token;
env.GITHUB_TOKEN = token;
}
return env;

View file

@ -7,6 +7,8 @@ import type {
ProcessInfo,
ProcessLogFollowQuery,
ProcessLogsResponse,
ProcessRunRequest,
ProcessRunResponse,
ProcessSignalQuery,
SessionEvent,
SessionPersistDriver,
@ -216,6 +218,11 @@ export class SandboxAgentClient {
return await sdk.createProcess(request);
}
async runProcess(request: ProcessRunRequest): Promise<ProcessRunResponse> {
const sdk = await this.sdk();
return await sdk.runProcess(request);
}
async listProcesses(): Promise<{ processes: ProcessInfo[] }> {
const sdk = await this.sdk();
return await sdk.listProcesses();

View file

@ -1,3 +1,4 @@
import { setTimeout as delay } from "node:timers/promises";
import type {
AgentEndpoint,
AttachTarget,
@ -30,6 +31,10 @@ export interface DaytonaProviderConfig {
autoStopInterval?: number;
}
function shellQuote(value: string): string {
return `'${value.replace(/'/g, `'\\''`)}'`;
}
export class DaytonaProvider implements SandboxProvider {
constructor(
private readonly config: DaytonaProviderConfig,
@ -47,7 +52,6 @@ export class DaytonaProvider implements SandboxProvider {
"CODEX_API_KEY",
"OPENCODE_API_KEY",
"CEREBRAS_API_KEY",
"GH_TOKEN",
"GITHUB_TOKEN",
] as const;
@ -145,37 +149,124 @@ export class DaytonaProvider implements SandboxProvider {
return envVars;
}
private buildShellExports(extra: Record<string, string> = {}): string[] {
const merged = {
...this.buildEnvVars(),
...extra,
};
return Object.entries(merged).map(([key, value]) => {
const encoded = Buffer.from(value, "utf8").toString("base64");
return `export ${key}="$(printf %s ${JSON.stringify(encoded)} | base64 -d)"`;
});
private wrapBashScript(script: string): string {
// Lines are joined with "; ", so callers must keep `if ...; then ...; fi` constructs on a
// single line and avoid `#` comments; multi-line control flow would break after compaction.
const compact = script
.split("\n")
.map((line) => line.trim())
.filter((line) => line.length > 0)
.join("; ");
return `bash -lc ${shellQuote(compact)}`;
}
private buildSnapshotImage() {
// Use Daytona image build + snapshot caching so base tooling (git + sandbox-agent)
// is prepared once and reused for subsequent sandboxes.
return Image.base(this.config.image).runCommands(
"apt-get update && apt-get install -y curl ca-certificates git openssh-client nodejs npm",
`curl -fsSL https://releases.rivet.dev/sandbox-agent/${DaytonaProvider.SANDBOX_AGENT_VERSION}/install.sh | sh`,
`bash -lc 'export PATH="$HOME/.local/bin:$PATH"; sandbox-agent install-agent codex || true; sandbox-agent install-agent claude || true'`,
);
// Daytona keeps its own wrapper as PID 1, so sandbox-agent must be started
// after sandbox creation via the native process API rather than image entrypoint/CMD.
return Image.base(this.config.image)
.runCommands(
"apt-get update && apt-get install -y curl ca-certificates git openssh-client",
"curl -fsSL https://deb.nodesource.com/setup_20.x | bash -",
"apt-get install -y nodejs",
`curl -fsSL https://releases.rivet.dev/sandbox-agent/${DaytonaProvider.SANDBOX_AGENT_VERSION}/install.sh | sh`,
`bash -lc 'export PATH="$HOME/.local/bin:$PATH"; sandbox-agent install-agent codex; sandbox-agent install-agent claude'`,
)
.env({
SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS: this.getAcpRequestTimeoutMs().toString(),
});
}
private async runCheckedCommand(sandboxId: string, command: string, label: string): Promise<void> {
private async startSandboxAgent(sandboxId: string): Promise<void> {
const client = this.requireClient();
const startScript = [
"set -euo pipefail",
'export PATH="$HOME/.local/bin:$PATH"',
`if ps -ef | grep -F "sandbox-agent server --no-token --host 0.0.0.0 --port ${DaytonaProvider.SANDBOX_AGENT_PORT}" | grep -v grep >/dev/null 2>&1; then exit 0; fi`,
'rm -f "$HOME/.codex/auth.json" "$HOME/.config/codex/auth.json" /tmp/sandbox-agent.log',
`nohup sandbox-agent server --no-token --host 0.0.0.0 --port ${DaytonaProvider.SANDBOX_AGENT_PORT} >/tmp/sandbox-agent.log 2>&1 &`,
].join("\n");
const result = await this.withTimeout("start sandbox-agent", () =>
client.executeCommand(sandboxId, this.wrapBashScript(startScript), undefined, Math.ceil(this.getRequestTimeoutMs() / 1000)),
);
const result = await this.withTimeout(`execute command (${label})`, () => client.executeCommand(sandboxId, command));
if (result.exitCode !== 0) {
throw new Error(`daytona ${label} failed (${result.exitCode}): ${result.result}`);
if ((result.exitCode ?? 1) !== 0) {
throw new Error(`daytona start sandbox-agent failed (${result.exitCode ?? "unknown"}): ${result.result ?? ""}`);
}
}
private async getSandboxAgentEndpoint(sandboxId: string, label: string): Promise<AgentEndpoint> {
const client = this.requireClient();
const preview = await this.withTimeout(`get preview endpoint (${label})`, () => client.getPreviewEndpoint(sandboxId, DaytonaProvider.SANDBOX_AGENT_PORT));
return preview.token ? { endpoint: preview.url, token: preview.token } : { endpoint: preview.url };
}
private async waitForSandboxAgentHealth(sandboxId: string, label: string): Promise<AgentEndpoint> {
const deadline = Date.now() + this.getRequestTimeoutMs();
let lastDetail = "sandbox-agent health unavailable";
while (Date.now() < deadline) {
try {
const endpoint = await this.getSandboxAgentEndpoint(sandboxId, label);
const response = await fetch(`${endpoint.endpoint.replace(/\/$/, "")}/v1/health`, {
headers: {
...(endpoint.token ? { Authorization: `Bearer ${endpoint.token}` } : {}),
},
});
if (response.ok) {
return endpoint;
}
lastDetail = `${response.status} ${response.statusText}`;
} catch (error) {
lastDetail = error instanceof Error ? error.message : String(error);
}
await delay(1_000);
}
throw new Error(`daytona sandbox-agent ${label} failed health check: ${lastDetail}`);
}
private async runViaSandboxAgent(
endpoint: AgentEndpoint,
command: string,
env: Record<string, string> | undefined,
label: string,
): Promise<ExecuteSandboxCommandResult> {
const response = await this.withTimeout(`execute via sandbox-agent (${label})`, async () => {
return await fetch(`${endpoint.endpoint.replace(/\/$/, "")}/v1/processes/run`, {
method: "POST",
headers: {
"Content-Type": "application/json",
...(endpoint.token ? { Authorization: `Bearer ${endpoint.token}` } : {}),
},
body: JSON.stringify({
command: "bash",
args: ["-lc", command],
...(env && Object.keys(env).length > 0 ? { env } : {}),
timeoutMs: this.getRequestTimeoutMs(),
maxOutputBytes: 1024 * 1024 * 4,
}),
});
});
if (!response.ok) {
const detail = await response.text().catch(() => "");
throw new Error(`daytona sandbox-agent ${label} failed (${response.status}): ${detail || response.statusText}`);
}
const body = (await response.json()) as {
exitCode?: number | null;
stdout?: string;
stderr?: string;
timedOut?: boolean;
};
return {
exitCode: typeof body.exitCode === "number" ? body.exitCode : body.timedOut ? 124 : 1,
result: [body.stdout ?? "", body.stderr ?? ""].filter(Boolean).join(""),
};
}
id() {
return "daytona" as const;
}
@ -224,55 +315,43 @@ export class DaytonaProvider implements SandboxProvider {
});
const repoDir = `/home/daytona/foundry/${req.workspaceId}/${req.repoId}/${req.taskId}/repo`;
// Prepare a working directory for the agent. This must succeed for the task to work.
const installStartedAt = Date.now();
await this.runCheckedCommand(
sandbox.id,
[
"bash",
"-lc",
`'set -euo pipefail; export DEBIAN_FRONTEND=noninteractive; if command -v git >/dev/null 2>&1 && command -v npx >/dev/null 2>&1; then exit 0; fi; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y git openssh-client ca-certificates nodejs npm >/tmp/apt-install.log 2>&1'`,
].join(" "),
"install git + node toolchain",
);
emitDebug("daytona.createSandbox.install_toolchain.done", {
const agent = await this.ensureSandboxAgent({
workspaceId: req.workspaceId,
sandboxId: sandbox.id,
durationMs: Date.now() - installStartedAt,
});
const cloneStartedAt = Date.now();
await this.runCheckedCommand(
sandbox.id,
[
"bash",
"-lc",
`${JSON.stringify(
[
"set -euo pipefail",
"export GIT_TERMINAL_PROMPT=0",
"export GIT_ASKPASS=/bin/echo",
`TOKEN=${JSON.stringify(req.githubToken ?? "")}`,
'if [ -z "$TOKEN" ]; then TOKEN="${GH_TOKEN:-${GITHUB_TOKEN:-}}"; fi',
"GIT_AUTH_ARGS=()",
`if [ -n "$TOKEN" ] && [[ "${req.repoRemote}" == https://github.com/* ]]; then AUTH_HEADER="$(printf 'x-access-token:%s' "$TOKEN" | base64 | tr -d '\\n')"; GIT_AUTH_ARGS=(-c "http.https://github.com/.extraheader=AUTHORIZATION: basic $AUTH_HEADER"); fi`,
`rm -rf "${repoDir}"`,
`mkdir -p "${repoDir}"`,
`rmdir "${repoDir}"`,
// Foundry test repos can be private, so clone/fetch must use the sandbox's GitHub token when available.
`git "\${GIT_AUTH_ARGS[@]}" clone "${req.repoRemote}" "${repoDir}"`,
`cd "${repoDir}"`,
`if [ -n "$TOKEN" ] && [[ "${req.repoRemote}" == https://github.com/* ]]; then git config --local credential.helper ""; git config --local http.https://github.com/.extraheader "AUTHORIZATION: basic $AUTH_HEADER"; fi`,
`git "\${GIT_AUTH_ARGS[@]}" fetch origin --prune`,
// The task branch may not exist remotely yet (agent push creates it). Base off current branch (default branch).
`if git show-ref --verify --quiet "refs/remotes/origin/${req.branchName}"; then git checkout -B "${req.branchName}" "origin/${req.branchName}"; else git checkout -B "${req.branchName}" "$(git branch --show-current 2>/dev/null || echo main)"; fi`,
`git config user.email "foundry@local" >/dev/null 2>&1 || true`,
`git config user.name "Foundry" >/dev/null 2>&1 || true`,
].join("; "),
)}`,
].join(" "),
"clone repo",
);
const commandEnv: Record<string, string> = {};
if (req.githubToken && req.githubToken.trim().length > 0) {
commandEnv.GITHUB_TOKEN = req.githubToken;
}
const cloneScript = [
"set -euo pipefail",
"export GIT_TERMINAL_PROMPT=0",
`REMOTE=${shellQuote(req.repoRemote)}`,
`BRANCH_NAME=${shellQuote(req.branchName)}`,
'TOKEN="${GITHUB_TOKEN:-}"',
'AUTH_REMOTE="$REMOTE"',
'AUTH_HEADER=""',
'if [ -n "$TOKEN" ] && [[ "$REMOTE" == https://github.com/* ]]; then AUTH_REMOTE="https://x-access-token:${TOKEN}@${REMOTE#https://}"; AUTH_HEADER="$(printf \'x-access-token:%s\' \"$TOKEN\" | base64 | tr -d \'\\n\')"; fi',
`rm -rf "${repoDir}"`,
`mkdir -p "${repoDir}"`,
`rmdir "${repoDir}"`,
// Foundry test repos can be private, so clone/fetch must use the sandbox's GitHub token when available.
`git clone "$AUTH_REMOTE" "${repoDir}"`,
`cd "${repoDir}"`,
'git remote set-url origin "$REMOTE"',
'if [ -n "$AUTH_HEADER" ]; then git config --local credential.helper ""; git config --local http.https://github.com/.extraheader "AUTHORIZATION: basic $AUTH_HEADER"; fi',
`git fetch origin --prune`,
// The task branch may not exist remotely yet (agent push creates it). Base off current branch (default branch).
'if git show-ref --verify --quiet "refs/remotes/origin/$BRANCH_NAME"; then git checkout -B "$BRANCH_NAME" "origin/$BRANCH_NAME"; else git checkout -B "$BRANCH_NAME" "$(git branch --show-current 2>/dev/null || echo main)"; fi',
`git config user.email "foundry@local" >/dev/null 2>&1 || true`,
`git config user.name "Foundry" >/dev/null 2>&1 || true`,
].join("\n");
const cloneResult = await this.runViaSandboxAgent(agent, this.wrapBashScript(cloneScript), commandEnv, "clone repo");
if (cloneResult.exitCode !== 0) {
throw new Error(`daytona clone repo failed (${cloneResult.exitCode}): ${cloneResult.result}`);
}
emitDebug("daytona.createSandbox.clone_repo.done", {
sandboxId: sandbox.id,
durationMs: Date.now() - cloneStartedAt,
@ -352,92 +431,9 @@ export class DaytonaProvider implements SandboxProvider {
}
async ensureSandboxAgent(req: EnsureAgentRequest): Promise<AgentEndpoint> {
const client = this.requireClient();
const acpRequestTimeoutMs = this.getAcpRequestTimeoutMs();
const sandboxAgentExports = this.buildShellExports({
SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS: acpRequestTimeoutMs.toString(),
});
await this.ensureStarted(req.sandboxId);
await this.runCheckedCommand(
req.sandboxId,
[
"bash",
"-lc",
`'set -euo pipefail; if command -v curl >/dev/null 2>&1; then exit 0; fi; export DEBIAN_FRONTEND=noninteractive; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y curl ca-certificates >/tmp/apt-install.log 2>&1'`,
].join(" "),
"install curl",
);
await this.runCheckedCommand(
req.sandboxId,
[
"bash",
"-lc",
`'set -euo pipefail; if command -v npx >/dev/null 2>&1; then exit 0; fi; export DEBIAN_FRONTEND=noninteractive; apt-get update -y >/tmp/apt-update.log 2>&1; apt-get install -y nodejs npm >/tmp/apt-install.log 2>&1'`,
].join(" "),
"install node toolchain",
);
await this.runCheckedCommand(
req.sandboxId,
[
"bash",
"-lc",
`'set -euo pipefail; export PATH="$HOME/.local/bin:$PATH"; if sandbox-agent --version 2>/dev/null | grep -q "${DaytonaProvider.SANDBOX_AGENT_VERSION}"; then exit 0; fi; curl -fsSL https://releases.rivet.dev/sandbox-agent/${DaytonaProvider.SANDBOX_AGENT_VERSION}/install.sh | sh'`,
].join(" "),
"install sandbox-agent",
);
for (const agentId of DaytonaProvider.AGENT_IDS) {
try {
await this.runCheckedCommand(
req.sandboxId,
["bash", "-lc", `'export PATH="$HOME/.local/bin:$PATH"; sandbox-agent install-agent ${agentId}'`].join(" "),
`install agent ${agentId}`,
);
} catch {
// Some sandbox-agent builds may not ship every agent plugin; treat this as best-effort.
}
}
await this.runCheckedCommand(
req.sandboxId,
[
"bash",
"-lc",
JSON.stringify(
[
"set -euo pipefail",
'export PATH="$HOME/.local/bin:$PATH"',
...sandboxAgentExports,
"command -v sandbox-agent >/dev/null 2>&1",
"if pgrep -x sandbox-agent >/dev/null; then exit 0; fi",
'rm -f "$HOME/.codex/auth.json" "$HOME/.config/codex/auth.json"',
`nohup sandbox-agent server --no-token --host 0.0.0.0 --port ${DaytonaProvider.SANDBOX_AGENT_PORT} >/tmp/sandbox-agent.log 2>&1 &`,
].join("; "),
),
].join(" "),
"start sandbox-agent",
);
await this.runCheckedCommand(
req.sandboxId,
[
"bash",
"-lc",
`'for i in $(seq 1 45); do curl -fsS "http://127.0.0.1:${DaytonaProvider.SANDBOX_AGENT_PORT}/v1/health" >/dev/null && exit 0; sleep 1; done; echo "sandbox-agent failed to become healthy" >&2; tail -n 80 /tmp/sandbox-agent.log >&2; exit 1'`,
].join(" "),
"wait for sandbox-agent health",
);
await this.startSandboxAgent(req.sandboxId);
return await this.waitForSandboxAgentHealth(req.sandboxId, "ensure sandbox-agent");
}
async health(req: SandboxHealthRequest): Promise<SandboxHealth> {
@ -478,8 +474,10 @@ export class DaytonaProvider implements SandboxProvider {
}
async executeCommand(req: ExecuteSandboxCommandRequest): Promise<ExecuteSandboxCommandResult> {
const client = this.requireClient();
const endpoint = await this.ensureSandboxAgent({
workspaceId: req.workspaceId,
sandboxId: req.sandboxId,
});
return await this.runViaSandboxAgent(endpoint, req.command, req.env, req.label ?? "command");
}
}
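The clone script above authenticates HTTPS git by injecting an `http.<url>.extraheader` value built in shell from the token. A minimal standalone sketch of that header construction (the function name here is illustrative, not part of the provider):

```typescript
import { Buffer } from "node:buffer";

// Builds the same "AUTHORIZATION: basic <base64>" value the clone script
// assembles with printf + base64: GitHub accepts installation and user tokens
// over HTTPS git as the literal username "x-access-token".
function buildGitAuthHeader(token: string): string {
  const basic = Buffer.from(`x-access-token:${token}`, "utf8").toString("base64");
  return `AUTHORIZATION: basic ${basic}`;
}
```

Passing the credential as an extra header (rather than embedding it in the remote URL) keeps the token out of `git remote -v` output and reflog-visible URLs.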

View file

@ -108,7 +108,6 @@ export class LocalProvider implements SandboxProvider {
...(process.env.CLAUDE_API_KEY ? { CLAUDE_API_KEY: process.env.CLAUDE_API_KEY } : {}),
...(process.env.OPENAI_API_KEY ? { OPENAI_API_KEY: process.env.OPENAI_API_KEY } : {}),
...(process.env.CODEX_API_KEY ? { CODEX_API_KEY: process.env.CODEX_API_KEY } : {}),
...(process.env.GH_TOKEN ? { GH_TOKEN: process.env.GH_TOKEN } : {}),
...(process.env.GITHUB_TOKEN ? { GITHUB_TOKEN: process.env.GITHUB_TOKEN } : {}),
},
},
@ -217,7 +216,10 @@ export class LocalProvider implements SandboxProvider {
try {
const { stdout, stderr } = await execFileAsync("bash", ["-lc", req.command], {
cwd,
env: process.env as Record<string, string>,
env: {
...(process.env as Record<string, string>),
...(req.env ?? {}),
},
maxBuffer: 1024 * 1024 * 16,
});
return {

View file

@ -51,6 +51,7 @@ export interface ExecuteSandboxCommandRequest {
workspaceId: string;
sandboxId: string;
command: string;
env?: Record<string, string>;
label?: string;
}

View file

@ -15,6 +15,16 @@ export interface GitHubOAuthSession {
scopes: string[];
}
function parseScopesHeader(value: string | null): string[] {
if (!value) {
return [];
}
return value
.split(",")
.map((entry) => entry.trim())
.filter((entry) => entry.length > 0);
}
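`parseScopesHeader` above feeds `getTokenScopes`, which reads GitHub's comma-separated `X-OAuth-Scopes` response header. A self-contained sketch of the same normalization (standalone name, not the module's export):

```typescript
// Same normalization as parseScopesHeader: split on commas, trim each entry,
// and drop empties so trailing separators don't produce phantom scopes.
function parseScopes(value: string | null): string[] {
  if (!value) {
    return [];
  }
  return value
    .split(",")
    .map((entry) => entry.trim())
    .filter((entry) => entry.length > 0);
}

parseScopes("repo, read:org, "); // → ["repo", "read:org"]
parseScopes(null); // → []
```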
export interface GitHubViewerIdentity {
id: string;
login: string;
@ -39,6 +49,29 @@ export interface GitHubRepositoryRecord {
private: boolean;
}
export interface GitHubMemberRecord {
id: string;
login: string;
name: string;
email: string | null;
role: string | null;
state: "active" | "invited";
}
export interface GitHubPullRequestRecord {
repoFullName: string;
cloneUrl: string;
number: number;
title: string;
body: string | null;
state: string;
url: string;
headRefName: string;
baseRefName: string;
authorLogin: string | null;
isDraft: boolean;
}
interface GitHubTokenResponse {
access_token?: string;
scope?: string;
@ -57,7 +90,17 @@ export interface GitHubWebhookEvent {
repositories_added?: Array<{ id: number; full_name: string; private: boolean }>;
repositories_removed?: Array<{ id: number; full_name: string }>;
repository?: { id: number; full_name: string; clone_url?: string; private?: boolean; owner?: { login?: string } };
pull_request?: {
number: number;
title?: string;
body?: string | null;
state?: string;
html_url?: string;
draft?: boolean;
user?: { login?: string } | null;
head?: { ref?: string };
base?: { ref?: string };
};
sender?: { login?: string; id?: number };
[key: string]: unknown;
}
@ -237,6 +280,25 @@ export class GitHubAppClient {
};
}
async getTokenScopes(accessToken: string): Promise<string[]> {
const response = await fetch(`${this.apiBaseUrl}/user`, {
headers: {
Accept: "application/vnd.github+json",
Authorization: `Bearer ${accessToken}`,
"X-GitHub-Api-Version": "2022-11-28",
},
});
const payload = await parseJsonPayload<{ message?: string } | Record<string, unknown>>(response, "GitHub scope request failed for /user");
if (!response.ok) {
const message =
typeof payload === "object" && payload && "message" in payload && typeof payload.message === "string" ? payload.message : "GitHub request failed";
throw new GitHubAppError(message, response.status);
}
return parseScopesHeader(response.headers.get("x-oauth-scopes"));
}
async listOrganizations(accessToken: string): Promise<GitHubOrgIdentity[]> {
const organizations = await this.paginate<{ id: number; login: string; description?: string | null }>("/user/orgs?per_page=100", accessToken);
return organizations.map((organization) => ({
@ -305,6 +367,56 @@ export class GitHubAppClient {
}));
}
async listInstallationMembers(installationId: number, organizationLogin: string): Promise<GitHubMemberRecord[]> {
const accessToken = await this.createInstallationAccessToken(installationId);
const members = await this.paginate<{
id: number;
login: string;
type?: string;
}>(`/orgs/${organizationLogin}/members?per_page=100`, accessToken);
return members.map((member) => ({
id: String(member.id),
login: member.login,
name: member.login,
email: null,
role: member.type === "User" ? "member" : null,
state: "active",
}));
}
async listOrganizationMembers(accessToken: string, organizationLogin: string): Promise<GitHubMemberRecord[]> {
const members = await this.paginate<{
id: number;
login: string;
type?: string;
}>(`/orgs/${organizationLogin}/members?per_page=100`, accessToken);
return members.map((member) => ({
id: String(member.id),
login: member.login,
name: member.login,
email: null,
role: member.type === "User" ? "member" : null,
state: "active",
}));
}
async listInstallationPullRequests(installationId: number): Promise<GitHubPullRequestRecord[]> {
const accessToken = await this.createInstallationAccessToken(installationId);
const repositories = await this.listInstallationRepositories(installationId);
return await this.listPullRequestsForRepositories(repositories, accessToken);
}
async listUserPullRequests(accessToken: string): Promise<GitHubPullRequestRecord[]> {
const repositories = await this.listUserRepositories(accessToken);
return await this.listPullRequestsForRepositories(repositories, accessToken);
}
async listPullRequestsForUserRepositories(accessToken: string, repositories: GitHubRepositoryRecord[]): Promise<GitHubPullRequestRecord[]> {
return await this.listPullRequestsForRepositories(repositories, accessToken);
}
async buildInstallationUrl(organizationLogin: string, state: string): Promise<string> {
if (!this.isAppConfigured()) {
throw new GitHubAppError("GitHub App is not configured", 500);
@ -333,7 +445,7 @@ export class GitHubAppClient {
},
});
const payload = await parseJsonPayload<{ token?: string; message?: string }>(response, "Unable to mint GitHub installation token");
if (!response.ok || !payload.token) {
throw new GitHubAppError(payload.message ?? "Unable to mint GitHub installation token", response.status);
}
@ -371,7 +483,7 @@ export class GitHubAppClient {
},
});
const payload = await parseJsonPayload<T | { message?: string }>(response, `GitHub app request failed for ${path}`);
if (!response.ok) {
throw new GitHubAppError(
typeof payload === "object" && payload && "message" in payload ? (payload.message ?? "GitHub request failed") : "GitHub request failed",
@ -403,7 +515,7 @@ export class GitHubAppClient {
},
});
const payload = await parseJsonPayload<T | { message?: string }>(response, `GitHub request failed for ${path}`);
if (!response.ok) {
throw new GitHubAppError(
typeof payload === "object" && payload && "message" in payload ? (payload.message ?? "GitHub request failed") : "GitHub request failed",
@ -426,6 +538,64 @@ export class GitHubAppClient {
return items;
}
private async listPullRequestsForRepositories(repositories: GitHubRepositoryRecord[], accessToken: string): Promise<GitHubPullRequestRecord[]> {
const pullRequests: GitHubPullRequestRecord[] = [];
for (const repository of repositories) {
const [owner, name] = repository.fullName.split("/", 2);
if (!owner || !name) {
continue;
}
let items: Array<{
number: number;
title: string;
body?: string | null;
state: string;
html_url: string;
draft?: boolean;
user?: { login?: string } | null;
head?: { ref?: string } | null;
base?: { ref?: string } | null;
}>;
try {
items = await this.paginate<{
number: number;
title: string;
body?: string | null;
state: string;
html_url: string;
draft?: boolean;
user?: { login?: string } | null;
head?: { ref?: string } | null;
base?: { ref?: string } | null;
}>(`/repos/${owner}/${name}/pulls?state=all&per_page=100`, accessToken);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`[foundry][github] skipping PR sync for ${repository.fullName}: ${message}`);
continue;
}
for (const item of items) {
pullRequests.push({
repoFullName: repository.fullName,
cloneUrl: repository.cloneUrl,
number: item.number,
title: item.title,
body: item.body ?? null,
state: item.state,
url: item.html_url,
headRefName: item.head?.ref ?? "",
baseRefName: item.base?.ref ?? "",
authorLogin: item.user?.login ?? null,
isDraft: Boolean(item.draft),
});
}
}
return pullRequests;
}
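The repository loop above derives `owner` and `name` with `split("/", 2)`. Note that JavaScript's `limit` argument truncates the result rather than folding the remainder into the last part, which is fine for well-formed `owner/name` pairs but silently drops anything after a second slash:

```typescript
// String.prototype.split's limit caps the number of returned parts; extra
// separators are dropped, not joined into the final element.
const [owner, name] = "acme/widgets".split("/", 2);
// owner === "acme", name === "widgets"

// A fullName with an unexpected extra slash would lose its tail:
"a/b/c".split("/", 2); // → ["a", "b"]
```

This is why the loop skips entries where either part comes back empty instead of assuming the split always succeeds.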
private async requestPage<T>(url: string, accessToken: string): Promise<GitHubPageResponse<T>> {
const response = await fetch(url, {
headers: {
@ -435,7 +605,7 @@ export class GitHubAppClient {
},
});
const payload = await parseJsonPayload<T[] | { repositories?: T[]; message?: string }>(response, `GitHub page request failed for ${url}`);
if (!response.ok) {
throw new GitHubAppError(
typeof payload === "object" && payload && "message" in payload ? (payload.message ?? "GitHub request failed") : "GitHub request failed",
@ -459,7 +629,7 @@ export class GitHubAppClient {
},
});
const payload = await parseJsonPayload<T[] | { installations?: T[]; message?: string }>(response, `GitHub app page request failed for ${url}`);
if (!response.ok) {
throw new GitHubAppError(
typeof payload === "object" && payload && "message" in payload ? (payload.message ?? "GitHub request failed") : "GitHub request failed",
@ -491,6 +661,17 @@ function parseNextLink(linkHeader: string | null): string | null {
return null;
}
async function parseJsonPayload<T>(response: Response, context: string): Promise<T> {
const text = await response.text();
try {
return JSON.parse(text) as T;
} catch {
const excerpt = text.slice(0, 200).replace(/\s+/g, " ").trim();
const suffix = excerpt ? `: ${excerpt}` : "";
throw new GitHubAppError(`${context}${suffix}`, response.status || 502);
}
}
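`parseJsonPayload` reads the body as text before parsing, so a non-JSON error response (an HTML gateway page, an empty body) surfaces as a `GitHubAppError` carrying a readable excerpt instead of a bare `SyntaxError`. The core pattern, sketched synchronously on a string with an illustrative name and a generic `Error`:

```typescript
// Parse a body that *should* be JSON; on failure, raise a contextual error
// carrying a whitespace-collapsed excerpt of whatever actually came back.
function parseJsonText<T>(text: string, context: string): T {
  try {
    return JSON.parse(text) as T;
  } catch {
    const excerpt = text.slice(0, 200).replace(/\s+/g, " ").trim();
    throw new Error(excerpt ? `${context}: ${excerpt}` : context);
  }
}
```

Capping the excerpt at 200 characters keeps a multi-kilobyte proxy error page from flooding logs while still identifying the failure.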
function base64UrlEncode(value: string | Buffer): string {
const source = typeof value === "string" ? Buffer.from(value, "utf8") : value;
return source.toString("base64").replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/g, "");

View file

@ -1,8 +1,10 @@
import {
GitHubAppClient,
type GitHubInstallationRecord,
type GitHubMemberRecord,
type GitHubOAuthSession,
type GitHubOrgIdentity,
type GitHubPullRequestRecord,
type GitHubRepositoryRecord,
type GitHubViewerIdentity,
type GitHubWebhookEvent,
@ -23,11 +25,15 @@ export type AppShellGithubClient = Pick<
| "isWebhookConfigured"
| "buildAuthorizeUrl"
| "exchangeCode"
| "getTokenScopes"
| "getViewer"
| "listOrganizations"
| "listInstallations"
| "listUserRepositories"
| "listUserPullRequests"
| "listInstallationRepositories"
| "listInstallationMembers"
| "listInstallationPullRequests"
| "buildInstallationUrl"
| "verifyWebhookEvent"
>;
@ -67,8 +73,10 @@ export function createDefaultAppShellServices(options: CreateAppShellServicesOpt
export type {
GitHubInstallationRecord,
GitHubMemberRecord,
GitHubOAuthSession,
GitHubOrgIdentity,
GitHubPullRequestRecord,
GitHubRepositoryRecord,
GitHubViewerIdentity,
GitHubWebhookEvent,

View file

@ -1,5 +1,5 @@
import { getOrCreateOrganization } from "../actors/handles.js";
import { APP_SHELL_ORGANIZATION_ID } from "../actors/organization/app-shell.js";
export interface ResolvedGithubAuth {
githubToken: string;
@ -7,12 +7,12 @@ }
}
export async function resolveWorkspaceGithubAuth(c: any, workspaceId: string): Promise<ResolvedGithubAuth | null> {
if (!workspaceId || workspaceId === APP_SHELL_ORGANIZATION_ID) {
return null;
}
try {
const appWorkspace = await getOrCreateOrganization(c, APP_SHELL_ORGANIZATION_ID);
const resolved = await appWorkspace.resolveAppGithubToken({
organizationId: workspaceId,
requireRepoScope: true,

View file

@ -1,11 +1,24 @@
import { afterEach, describe, expect, it } from "vitest";
import type { DaytonaClientLike, DaytonaDriver } from "../src/driver.js";
import type { DaytonaCreateSandboxOptions } from "../src/integrations/daytona/client.js";
import { DaytonaProvider } from "../src/providers/daytona/index.js";
interface RecordedFetchCall {
url: string;
method: string;
headers: Record<string, string>;
bodyText?: string;
}
class RecordingDaytonaClient implements DaytonaClientLike {
createSandboxCalls: DaytonaCreateSandboxOptions[] = [];
executedCommands: string[] = [];
getPreviewEndpointCalls: Array<{ sandboxId: string; port: number }> = [];
executeCommandCalls: Array<{
sandboxId: string;
command: string;
env?: Record<string, string>;
timeoutSeconds?: number;
}> = [];
async createSandbox(options: DaytonaCreateSandboxOptions) {
this.createSandboxCalls.push(options);
@ -32,17 +45,21 @@ class RecordingDaytonaClient implements DaytonaClientLike {
async deleteSandbox(_sandboxId: string) {}
async getPreviewEndpoint(sandboxId: string, port: number) {
this.getPreviewEndpointCalls.push({ sandboxId, port });
return {
url: `https://preview.example/sandbox/${sandboxId}/port/${port}`,
token: "preview-token",
};
}
async executeCommand(sandboxId: string, command: string, env?: Record<string, string>, timeoutSeconds?: number) {
this.executeCommandCalls.push({ sandboxId, command, env, timeoutSeconds });
return {
exitCode: 0,
result: "",
};
}
}
function createProviderWithClient(client: DaytonaClientLike): DaytonaProvider {
@ -59,79 +76,159 @@ function createProviderWithClient(client: DaytonaClientLike): DaytonaProvider {
);
}
function withFetchStub(implementation: (call: RecordedFetchCall) => Response | Promise<Response>): () => void {
const previous = globalThis.fetch;
globalThis.fetch = (async (input, init) => {
const headers = new Headers(init?.headers);
const headerRecord: Record<string, string> = {};
headers.forEach((value, key) => {
headerRecord[key] = value;
});
const bodyText = typeof init?.body === "string" ? init.body : init?.body instanceof Uint8Array ? Buffer.from(init.body).toString("utf8") : undefined;
return await implementation({
url: typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url,
method: init?.method ?? "GET",
headers: headerRecord,
bodyText,
});
}) as typeof fetch;
return () => {
globalThis.fetch = previous;
};
}
afterEach(() => {
delete process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS;
delete process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS;
});
describe("daytona provider snapshot image behavior", () => {
it("creates sandboxes using a snapshot-capable image recipe and clones via sandbox-agent process api", async () => {
const client = new RecordingDaytonaClient();
const provider = createProviderWithClient(client);
const fetchCalls: RecordedFetchCall[] = [];
const restoreFetch = withFetchStub(async (call) => {
fetchCalls.push(call);
if (call.url.endsWith("/v1/health")) {
return new Response(JSON.stringify({ ok: true }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
}
if (call.url.endsWith("/v1/processes/run")) {
return new Response(JSON.stringify({ exitCode: 0, stdout: "", stderr: "" }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
}
throw new Error(`unexpected fetch: ${call.method} ${call.url}`);
});
try {
const handle = await provider.createSandbox({
workspaceId: "default",
repoId: "repo-1",
repoRemote: "https://github.com/acme/repo.git",
branchName: "feature/test",
taskId: "task-1",
githubToken: "github-token",
});
expect(client.createSandboxCalls).toHaveLength(1);
const createCall = client.createSandboxCalls[0];
if (!createCall) {
throw new Error("expected create sandbox call");
}
expect(typeof createCall.image).not.toBe("string");
if (typeof createCall.image === "string") {
throw new Error("expected daytona image recipe object");
}
const dockerfile = createCall.image.dockerfile;
expect(dockerfile).toContain("apt-get install -y curl ca-certificates git openssh-client");
expect(dockerfile).toContain("deb.nodesource.com/setup_20.x");
expect(dockerfile).toContain("apt-get install -y nodejs");
expect(dockerfile).toContain("sandbox-agent/0.3.0/install.sh");
expect(dockerfile).toContain("sandbox-agent install-agent codex; sandbox-agent install-agent claude");
expect(dockerfile).not.toContain("|| true");
expect(dockerfile).not.toContain("ENTRYPOINT [");
expect(client.getPreviewEndpointCalls).toEqual([{ sandboxId: "sandbox-1", port: 2468 }]);
expect(client.executeCommandCalls).toHaveLength(1);
expect(client.executeCommandCalls[0]?.sandboxId).toBe("sandbox-1");
expect(client.executeCommandCalls[0]?.command).toContain("nohup sandbox-agent server --no-token --host 0.0.0.0 --port 2468");
expect(fetchCalls.map((call) => `${call.method} ${call.url}`)).toEqual([
"GET https://preview.example/sandbox/sandbox-1/port/2468/v1/health",
"POST https://preview.example/sandbox/sandbox-1/port/2468/v1/processes/run",
]);
const runCall = fetchCalls[1];
if (!runCall?.bodyText) {
throw new Error("expected process run request body");
}
const runBody = JSON.parse(runCall.bodyText) as {
command: string;
args: string[];
env?: Record<string, string>;
};
expect(runBody.command).toBe("bash");
expect(runBody.args).toHaveLength(2);
expect(runBody.args[0]).toBe("-lc");
expect(runBody.env).toEqual({
GITHUB_TOKEN: "github-token",
});
expect(runBody.args[1]).toContain("GIT_TERMINAL_PROMPT=0");
expect(runBody.args[1]).toContain('AUTH_REMOTE="$REMOTE"');
expect(runBody.args[1]).toContain('git clone "$AUTH_REMOTE"');
expect(runBody.args[1]).toContain('AUTH_HEADER="$(printf');
expect(handle.metadata.snapshot).toBe("snapshot-foundry");
expect(handle.metadata.image).toBe("ubuntu:24.04");
expect(handle.metadata.cwd).toBe("/home/daytona/foundry/default/repo-1/task-1/repo");
} finally {
restoreFetch();
}
});
it("ensures sandbox-agent by checking health through the preview endpoint", async () => {
process.env.HF_SANDBOX_AGENT_ACP_REQUEST_TIMEOUT_MS = "240000";
const client = new RecordingDaytonaClient();
const provider = createProviderWithClient(client);
const fetchCalls: RecordedFetchCall[] = [];
const restoreFetch = withFetchStub(async (call) => {
fetchCalls.push(call);
return new Response(JSON.stringify({ ok: true }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
});
try {
const endpoint = await provider.ensureSandboxAgent({
workspaceId: "default",
sandboxId: "sandbox-1",
});
expect(endpoint).toEqual({
endpoint: "https://preview.example/sandbox/sandbox-1/port/2468",
token: "preview-token",
});
expect(client.executeCommandCalls).toHaveLength(1);
expect(client.executeCommandCalls[0]?.command).toContain("nohup sandbox-agent server --no-token --host 0.0.0.0 --port 2468");
expect(client.getPreviewEndpointCalls).toEqual([{ sandboxId: "sandbox-1", port: 2468 }]);
expect(fetchCalls.map((call) => `${call.method} ${call.url}`)).toEqual(["GET https://preview.example/sandbox/sandbox-1/port/2468/v1/health"]);
} finally {
restoreFetch();
}
});
it("fails with explicit timeout when daytona createSandbox hangs", async () => {
const previous = process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS;
process.env.HF_DAYTONA_REQUEST_TIMEOUT_MS = "120";
const hangingClient: DaytonaClientLike = {
@ -140,13 +237,20 @@ describe("daytona provider snapshot image behavior", () => {
startSandbox: async () => {},
stopSandbox: async () => {},
deleteSandbox: async () => {},
getPreviewEndpoint: async (sandboxId, port) => ({
url: `https://preview.example/sandbox/${sandboxId}/port/${port}`,
token: "preview-token",
}),
executeCommand: async () => ({
exitCode: 0,
result: "",
}),
};
const restoreFetch = withFetchStub(async () => {
throw new Error("unexpected fetch");
});
try {
const provider = createProviderWithClient(hangingClient);
await expect(
@ -159,26 +263,64 @@ describe("daytona provider snapshot image behavior", () => {
}),
).rejects.toThrow("daytona create sandbox timed out after 120ms");
} finally {
restoreFetch();
}
});
it("executes backend-managed sandbox commands through sandbox-agent process api", async () => {
const client = new RecordingDaytonaClient();
const provider = createProviderWithClient(client);
const fetchCalls: RecordedFetchCall[] = [];
const restoreFetch = withFetchStub(async (call) => {
fetchCalls.push(call);
if (call.url.endsWith("/v1/health")) {
return new Response(JSON.stringify({ ok: true }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
}
if (call.url.endsWith("/v1/processes/run")) {
return new Response(JSON.stringify({ exitCode: 0, stdout: "backend-push\n", stderr: "" }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
}
throw new Error(`unexpected fetch: ${call.method} ${call.url}`);
});
try {
const result = await provider.executeCommand({
workspaceId: "default",
sandboxId: "sandbox-1",
command: "echo backend-push",
env: { GITHUB_TOKEN: "user-token" },
label: "manual push",
});
expect(result.exitCode).toBe(0);
expect(result.result).toBe("backend-push\n");
expect(fetchCalls.map((call) => `${call.method} ${call.url}`)).toEqual([
"GET https://preview.example/sandbox/sandbox-1/port/2468/v1/health",
"POST https://preview.example/sandbox/sandbox-1/port/2468/v1/processes/run",
]);
const runCall = fetchCalls[1];
if (!runCall?.bodyText) {
throw new Error("expected process run body");
}
const runBody = JSON.parse(runCall.bodyText) as {
command: string;
args: string[];
env?: Record<string, string>;
};
expect(runBody.command).toBe("bash");
expect(runBody.args).toEqual(["-lc", "echo backend-push"]);
expect(runBody.env).toEqual({ GITHUB_TOKEN: "user-token" });
} finally {
restoreFetch();
}
});
});

View file

@ -54,6 +54,7 @@ export function createTestStackDriver(overrides?: Partial<StackDriver>): StackDr
export function createTestGithubDriver(overrides?: Partial<GithubDriver>): GithubDriver {
return {
listPullRequests: async () => [],
getPrInfo: async () => null,
createPr: async (_repoPath, _headBranch, _title) => ({
number: 1,
url: `https://github.com/test/repo/pull/1`,
@ -101,6 +102,15 @@ export function createTestSandboxAgentClient(overrides?: Partial<SandboxAgentCli
nextCursor: undefined,
}),
createProcess: async () => defaultProcess,
runProcess: async () => ({
durationMs: 1,
exitCode: 0,
stderr: "",
stderrTruncated: false,
stdout: "",
stdoutTruncated: false,
timedOut: false,
}),
listProcesses: async () => ({ processes: [defaultProcess] }),
getProcessLogs: async () => defaultLogs,
stopProcess: async () => ({ ...defaultProcess, status: "exited", exitCode: 0, exitedAtMs: Date.now() }),
@ -127,11 +137,14 @@ export function createTestDaytonaClient(overrides?: Partial<DaytonaClientLike>):
startSandbox: async () => {},
stopSandbox: async () => {},
deleteSandbox: async () => {},
getPreviewEndpoint: async (sandboxId, port) => ({
url: `https://preview.example/sandbox/${sandboxId}/port/${port}`,
token: "preview-token",
}),
executeCommand: async () => ({
exitCode: 0,
result: "",
}),
...overrides,
};
}

View file

@ -1,31 +1,34 @@
import { describe, expect, it } from "vitest";
import {
githubStateKey,
historyKey,
organizationKey,
repositoryKey,
sandboxInstanceKey,
taskKey,
taskStatusSyncKey,
historyKey,
projectBranchSyncKey,
projectKey,
projectPrSyncKey,
sandboxInstanceKey,
workspaceKey,
userGithubDataKey,
} from "../src/actors/keys.js";
describe("actor keys", () => {
it("prefixes every key with workspace namespace", () => {
it("prefixes every key with organization namespace", () => {
const keys = [
workspaceKey("default"),
projectKey("default", "repo"),
organizationKey("default"),
repositoryKey("default", "repo"),
githubStateKey("default"),
taskKey("default", "repo", "task"),
sandboxInstanceKey("default", "daytona", "sbx"),
historyKey("default", "repo"),
projectPrSyncKey("default", "repo"),
projectBranchSyncKey("default", "repo"),
taskStatusSyncKey("default", "repo", "task", "sandbox-1", "session-1"),
];
for (const key of keys) {
expect(key[0]).toBe("ws");
expect(key[0]).toBe("org");
expect(key[1]).toBe("default");
}
});
it("uses a separate namespace for user-scoped GitHub auth", () => {
expect(userGithubDataKey("user-123")).toEqual(["user", "user-123", "github"]);
});
});

View file

@@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { normalizeParentBranch, parentLookupFromStack, sortBranchesForOverview } from "../src/actors/project/stack-model.js";
import { normalizeParentBranch, parentLookupFromStack, sortBranchesForOverview } from "../src/actors/repository/stack-model.js";
describe("stack-model", () => {
it("normalizes self-parent references to null", () => {

View file

@@ -6,7 +6,7 @@ import { execFileSync } from "node:child_process";
import { setTimeout as delay } from "node:timers/promises";
import { describe, expect, it } from "vitest";
import { setupTest } from "rivetkit/test";
import { workspaceKey } from "../src/actors/keys.js";
import { organizationKey } from "../src/actors/keys.js";
import { registry } from "../src/actors/index.js";
import { createTestDriver } from "./helpers/test-driver.js";
import { createTestRuntimeContext } from "./helpers/test-context.js";
@@ -41,10 +41,10 @@ describe("workspace isolation", () => {
createTestRuntimeContext(testDriver);
const { client } = await setupTest(t, registry);
const wsA = await client.workspace.getOrCreate(workspaceKey("alpha"), {
const wsA = await client.organization.getOrCreate(organizationKey("alpha"), {
createWithInput: "alpha",
});
const wsB = await client.workspace.getOrCreate(workspaceKey("beta"), {
const wsB = await client.organization.getOrCreate(organizationKey("beta"), {
createWithInput: "beta",
});

View file

@@ -1,7 +1,7 @@
// @ts-nocheck
import { describe, expect, it } from "vitest";
import { setupTest } from "rivetkit/test";
import { workspaceKey } from "../src/actors/keys.js";
import { organizationKey } from "../src/actors/keys.js";
import { registry } from "../src/actors/index.js";
import { createTestDriver } from "./helpers/test-driver.js";
import { createTestRuntimeContext } from "./helpers/test-context.js";
@@ -26,7 +26,7 @@ describe("workspace star sandbox agent repo", () => {
createTestRuntimeContext(testDriver);
const { client } = await setupTest(t, registry);
const ws = await client.workspace.getOrCreate(workspaceKey("alpha"), {
const ws = await client.organization.getOrCreate(organizationKey("alpha"), {
createWithInput: "alpha",
});

View file

@@ -9,13 +9,14 @@
"build": "tsup src/index.ts --format esm --dts",
"typecheck": "tsc --noEmit",
"test": "vitest run",
"test:e2e:full": "HF_ENABLE_DAEMON_FULL_E2E=1 vitest run test/e2e/full-integration-e2e.test.ts",
"test:e2e:workbench": "HF_ENABLE_DAEMON_WORKBENCH_E2E=1 vitest run test/e2e/workbench-e2e.test.ts",
"test:e2e:workbench-load": "HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E=1 vitest run test/e2e/workbench-load-e2e.test.ts"
"test:e2e:github-pr": "vitest run --config vitest.e2e.config.ts test/e2e/github-pr-e2e.test.ts",
"test:e2e:full": "vitest run --config vitest.e2e.config.ts test/e2e/full-integration-e2e.test.ts",
"test:e2e:workbench": "vitest run --config vitest.e2e.config.ts test/e2e/workbench-e2e.test.ts",
"test:e2e:workbench-load": "vitest run --config vitest.e2e.config.ts test/e2e/workbench-load-e2e.test.ts"
},
"dependencies": {
"@sandbox-agent/foundry-shared": "workspace:*",
"rivetkit": "2.1.6",
"rivetkit": "https://pkg.pr.new/rivet-dev/rivet/rivetkit@4409",
"sandbox-agent": "workspace:*"
},
"devDependencies": {

View file

@@ -18,6 +18,7 @@ import type {
TaskWorkbenchSetSessionUnreadInput,
TaskWorkbenchSendMessageInput,
TaskWorkbenchSnapshot,
WorkbenchTask,
TaskWorkbenchTabInput,
TaskWorkbenchUpdateDraftInput,
HistoryEvent,
@@ -34,7 +35,7 @@
} from "@sandbox-agent/foundry-shared";
import type { ProcessCreateRequest, ProcessInfo, ProcessLogFollowQuery, ProcessLogsResponse, ProcessSignalQuery } from "sandbox-agent";
import { createMockBackendClient } from "./mock/backend-client.js";
import { sandboxInstanceKey, workspaceKey } from "./keys.js";
import { sandboxInstanceKey, organizationKey, taskKey } from "./keys.js";
export type TaskAction = "push" | "sync" | "merge" | "archive" | "kill";
@@ -103,6 +104,10 @@ interface WorkspaceHandle {
revertWorkbenchFile(input: TaskWorkbenchDiffInput): Promise<void>;
}
interface TaskHandle {
getWorkbench(): Promise<WorkbenchTask>;
}
interface SandboxInstanceHandle {
createSession(input: {
prompt: string;
@@ -124,9 +129,12 @@
}
interface RivetClient {
workspace: {
organization: {
getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): WorkspaceHandle;
};
task: {
get(key?: string | string[]): TaskHandle;
};
sandboxInstance: {
getOrCreate(key?: string | string[], opts?: { createWithInput?: unknown }): SandboxInstanceHandle;
};
@@ -238,6 +246,7 @@
): Promise<{ providerId: ProviderId; sandboxId: string; state: string; at: number }>;
getSandboxAgentConnection(workspaceId: string, providerId: ProviderId, sandboxId: string): Promise<{ endpoint: string; token?: string }>;
getWorkbench(workspaceId: string): Promise<TaskWorkbenchSnapshot>;
getWorkbenchTask(workspaceId: string, taskId: string): Promise<WorkbenchTask>;
subscribeWorkbench(workspaceId: string, listener: () => void): () => void;
createWorkbenchTask(workspaceId: string, input: TaskWorkbenchCreateTaskInput): Promise<TaskWorkbenchCreateTaskResponse>;
markWorkbenchUnread(workspaceId: string, input: TaskWorkbenchSelectInput): Promise<void>;
@@ -482,7 +491,8 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
const shouldUseCandidate = metadata.clientEndpoint ? await probeMetadataEndpoint(candidateEndpoint, metadata.clientNamespace, 1_500) : true;
const resolvedEndpoint = shouldUseCandidate ? candidateEndpoint : options.endpoint;
return createClient({
const buildClient = createClient as any;
return buildClient({
endpoint: resolvedEndpoint,
namespace: metadata.clientNamespace,
token: metadata.clientToken,
@ -495,7 +505,7 @@ export function createBackendClient(options: BackendClientOptions): BackendClien
};
const workspace = async (workspaceId: string): Promise<WorkspaceHandle> =>
(await getClient()).workspace.getOrCreate(workspaceKey(workspaceId), {
(await getClient()).organization.getOrCreate(organizationKey(workspaceId), {
createWithInput: workspaceId,
});
@@ -504,6 +514,13 @@
return (client as any).sandboxInstance.get(sandboxInstanceKey(workspaceId, providerId, sandboxId));
};
const taskById = async (workspaceId: string, taskId: string): Promise<TaskHandle> => {
const ws = await workspace(workspaceId);
const detail = await ws.getTask({ workspaceId, taskId });
const client = await getClient();
return client.task.get(taskKey(workspaceId, detail.repoId, taskId));
};
function isActorNotFoundError(error: unknown): boolean {
const message = error instanceof Error ? error.message : String(error);
return message.includes("Actor not found");
@@ -576,8 +593,14 @@
entry.listeners.add(listener);
if (!entry.disposeConnPromise) {
entry.disposeConnPromise = (async () => {
const ensureConnection = (currentEntry: NonNullable<typeof entry>) => {
if (currentEntry.disposeConnPromise) {
return;
}
let reconnecting = false;
let disposeConnPromise: Promise<(() => Promise<void>) | null> | null = null;
disposeConnPromise = (async () => {
const handle = await workspace(workspaceId);
const conn = (handle as any).connect();
const unsubscribeEvent = conn.on("workbenchUpdated", () => {
@@ -589,14 +612,39 @@
currentListener();
}
});
const unsubscribeError = conn.onError(() => {});
const unsubscribeError = conn.onError(() => {
if (reconnecting) {
return;
}
reconnecting = true;
const current = workbenchSubscriptions.get(workspaceId);
if (!current || current.disposeConnPromise !== disposeConnPromise) {
return;
}
current.disposeConnPromise = null;
void disposeConnPromise?.then(async (disposeConn) => {
await disposeConn?.();
});
if (current.listeners.size > 0) {
ensureConnection(current);
for (const currentListener of [...current.listeners]) {
currentListener();
}
}
});
return async () => {
unsubscribeEvent();
unsubscribeError();
await conn.dispose();
};
})().catch(() => null);
}
currentEntry.disposeConnPromise = disposeConnPromise;
};
ensureConnection(entry);
return () => {
const current = workbenchSubscriptions.get(workspaceId);
@@ -984,6 +1032,10 @@
return (await workspace(workspaceId)).getWorkbench({ workspaceId });
},
async getWorkbenchTask(workspaceId: string, taskId: string): Promise<WorkbenchTask> {
return (await taskById(workspaceId, taskId)).getWorkbench();
},
subscribeWorkbench(workspaceId: string, listener: () => void): () => void {
return subscribeWorkbench(workspaceId, listener);
},
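The `ensureConnection` change above reconnects the workbench subscription after a transport error instead of silently going stale. A minimal standalone sketch of that pattern (hypothetical, simplified types; not the real RivetKit connection API):

```typescript
// Minimal standalone sketch of the reconnect pattern: on a connection error,
// drop the stale handle, rebuild it once, and re-notify listeners so they
// refetch state that may have changed while disconnected.
type Listener = () => void;

interface Entry {
  listeners: Set<Listener>;
  connected: boolean;
}

// `connect` receives an error callback; invoking it simulates a dropped connection.
function ensureConnection(entry: Entry, connect: (onError: () => void) => void): void {
  if (entry.connected) return; // a live connection already exists
  entry.connected = true;
  let reconnecting = false; // guard against re-entrant error callbacks
  connect(() => {
    if (reconnecting) return;
    reconnecting = true;
    entry.connected = false;
    if (entry.listeners.size > 0) {
      ensureConnection(entry, connect); // rebuild the connection
      for (const l of [...entry.listeners]) l(); // let listeners resync
    }
  });
}
```

The `reconnecting` flag mirrors the guard in the diff: it ensures a flood of error callbacks from the same dead connection triggers at most one rebuild.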

View file

@@ -1,34 +1,30 @@
export type ActorKey = string[];
export function workspaceKey(workspaceId: string): ActorKey {
return ["ws", workspaceId];
export function organizationKey(organizationId: string): ActorKey {
return ["org", organizationId];
}
export function projectKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId];
export function repositoryKey(organizationId: string, repoId: string): ActorKey {
return ["org", organizationId, "repo", repoId];
}
export function taskKey(workspaceId: string, repoId: string, taskId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "task", taskId];
export function githubStateKey(organizationId: string): ActorKey {
return ["org", organizationId, "github"];
}
export function sandboxInstanceKey(workspaceId: string, providerId: string, sandboxId: string): ActorKey {
return ["ws", workspaceId, "provider", providerId, "sandbox", sandboxId];
export function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey {
return ["org", organizationId, "repo", repoId, "task", taskId];
}
export function historyKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "history"];
export function sandboxInstanceKey(organizationId: string, providerId: string, sandboxId: string): ActorKey {
return ["org", organizationId, "provider", providerId, "sandbox", sandboxId];
}
export function projectPrSyncKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "pr-sync"];
export function historyKey(organizationId: string, repoId: string): ActorKey {
return ["org", organizationId, "repo", repoId, "history"];
}
export function projectBranchSyncKey(workspaceId: string, repoId: string): ActorKey {
return ["ws", workspaceId, "project", repoId, "branch-sync"];
}
export function taskStatusSyncKey(workspaceId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey {
export function taskStatusSyncKey(organizationId: string, repoId: string, taskId: string, sandboxId: string, sessionId: string): ActorKey {
// Include sandbox + session so multiple sandboxes/sessions can be tracked per task.
return ["ws", workspaceId, "project", repoId, "task", taskId, "status-sync", sandboxId, sessionId];
return ["org", organizationId, "repo", repoId, "task", taskId, "status-sync", sandboxId, sessionId];
}
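The key helpers above all moved from the `ws` prefix to `org`. A standalone sketch (reproducing two of the helpers) shows the resulting key shapes:

```typescript
// Standalone sketch reproducing two of the key helpers above, to show the
// shapes produced after the ws -> org namespace migration.
type ActorKey = string[];

function organizationKey(organizationId: string): ActorKey {
  return ["org", organizationId];
}

function taskKey(organizationId: string, repoId: string, taskId: string): ActorKey {
  return ["org", organizationId, "repo", repoId, "task", taskId];
}

// Every key starts with the "org" namespace and the organization id, so all
// actors belonging to one organization share a common prefix.
const key = taskKey("default", "sandbox-agent", "task-1");
// key: ["org", "default", "repo", "sandbox-agent", "task", "task-1"]
```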

View file

@@ -6,6 +6,8 @@ export type MockGithubInstallationStatus = "connected" | "install_required" | "r
export type MockGithubSyncStatus = "pending" | "syncing" | "synced" | "error";
export type MockOrganizationKind = "personal" | "organization";
export type MockStarterRepoStatus = "pending" | "starred" | "skipped";
export type MockActorRuntimeStatus = "healthy" | "error";
export type MockActorRuntimeType = "organization" | "repository" | "task" | "history" | "sandbox_instance" | "task_status_sync";
export interface MockFoundryUser {
id: string;
@@ -52,6 +54,27 @@ export interface MockFoundryGithubState {
lastSyncAt: number | null;
}
export interface MockFoundryActorRuntimeIssue {
actorId: string;
actorType: MockActorRuntimeType;
scopeId: string | null;
scopeLabel: string;
message: string;
workflowId: string | null;
stepName: string | null;
attempt: number | null;
willRetry: boolean;
retryDelayMs: number | null;
occurredAt: number;
}
export interface MockFoundryActorRuntimeState {
status: MockActorRuntimeStatus;
errorCount: number;
lastErrorAt: number | null;
issues: MockFoundryActorRuntimeIssue[];
}
export interface MockFoundryOrganizationSettings {
displayName: string;
slug: string;
@@ -67,6 +90,7 @@ export interface MockFoundryOrganization {
kind: MockOrganizationKind;
settings: MockFoundryOrganizationSettings;
github: MockFoundryGithubState;
runtime: MockFoundryActorRuntimeState;
billing: MockFoundryBillingState;
members: MockFoundryOrganizationMember[];
seatAssignments: string[];
@@ -140,6 +164,15 @@ function syncStatusFromLegacy(value: unknown): MockGithubSyncStatus {
}
}
function buildHealthyRuntimeState(): MockFoundryActorRuntimeState {
return {
status: "healthy",
errorCount: 0,
lastErrorAt: null,
issues: [],
};
}
function buildDefaultSnapshot(): MockFoundryAppSnapshot {
return {
auth: {
@@ -203,6 +236,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
lastSyncLabel: "Synced just now",
lastSyncAt: Date.now() - 60_000,
},
runtime: buildHealthyRuntimeState(),
billing: {
planId: "free",
status: "active",
@@ -237,6 +271,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
lastSyncLabel: "Waiting for first import",
lastSyncAt: null,
},
runtime: buildHealthyRuntimeState(),
billing: {
planId: "team",
status: "active",
@@ -279,6 +314,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
lastSyncLabel: "Sync stalled 2 hours ago",
lastSyncAt: Date.now() - 2 * 60 * 60_000,
},
runtime: buildHealthyRuntimeState(),
billing: {
planId: "team",
status: "trialing",
@@ -317,6 +353,7 @@ function buildDefaultSnapshot(): MockFoundryAppSnapshot {
lastSyncLabel: "Synced yesterday",
lastSyncAt: Date.now() - 24 * 60 * 60_000,
},
runtime: buildHealthyRuntimeState(),
billing: {
planId: "free",
status: "active",
@@ -370,6 +407,7 @@ function parseStoredSnapshot(): MockFoundryAppSnapshot | null {
syncStatus: syncStatusFromLegacy(organization.github?.syncStatus ?? organization.repoImportStatus),
lastSyncAt: organization.github?.lastSyncAt ?? null,
},
runtime: organization.runtime ?? buildHealthyRuntimeState(),
})),
};
} catch {

View file

@@ -82,6 +82,35 @@ function toTaskStatus(status: TaskRecord["status"], archived: boolean): TaskReco
return status;
}
function mapWorkbenchTaskStatus(task: TaskWorkbenchSnapshot["tasks"][number]): TaskRecord["status"] {
if (task.status === "archived") {
return "archived";
}
if (task.lifecycle?.state === "error") {
return "error";
}
if (task.status === "idle") {
return "idle";
}
if (task.status === "new") {
return task.lifecycle?.code ?? "init_create_sandbox";
}
return "running";
}
function mapWorkbenchTaskStatusMessage(task: TaskWorkbenchSnapshot["tasks"][number], status: TaskRecord["status"]): string {
if (status === "archived") {
return "archived";
}
if (status === "error") {
return task.lifecycle?.message ?? "mock task initialization failed";
}
if (task.status === "new") {
return task.lifecycle?.message ?? "mock sandbox provisioning";
}
return task.tabs.some((tab) => tab.status === "running") ? "agent responding" : "mock sandbox ready";
}
export function createMockBackendClient(defaultWorkspaceId = "default"): BackendClient {
const workbench = getSharedMockWorkbenchClient();
const listenersBySandboxId = new Map<string, Set<() => void>>();
@@ -121,6 +150,8 @@ export function createMockBackendClient(defaultWorkspaceId = "default"): Backend
const task = requireTask(taskId);
const cwd = mockCwd(task.repoName, task.id);
const archived = task.status === "archived";
const taskStatus = mapWorkbenchTaskStatus(task);
const sandboxAvailable = task.status !== "new" && taskStatus !== "error" && taskStatus !== "archived";
return {
workspaceId: defaultWorkspaceId,
repoId: task.repoId,
@@ -130,21 +161,23 @@
title: task.title,
task: task.title,
providerId: "local",
status: toTaskStatus(archived ? "archived" : "running", archived),
statusMessage: archived ? "archived" : "mock sandbox ready",
activeSandboxId: task.id,
activeSessionId: task.tabs[0]?.sessionId ?? null,
sandboxes: [
{
sandboxId: task.id,
providerId: "local",
sandboxActorId: "mock-sandbox",
switchTarget: `mock://${task.id}`,
cwd,
createdAt: task.updatedAtMs,
updatedAt: task.updatedAtMs,
},
],
status: toTaskStatus(taskStatus, archived),
statusMessage: mapWorkbenchTaskStatusMessage(task, taskStatus),
activeSandboxId: sandboxAvailable ? task.id : null,
activeSessionId: sandboxAvailable ? (task.tabs[0]?.sessionId ?? null) : null,
sandboxes: sandboxAvailable
? [
{
sandboxId: task.id,
providerId: "local",
sandboxActorId: "mock-sandbox",
switchTarget: `mock://${task.id}`,
cwd,
createdAt: task.updatedAtMs,
updatedAt: task.updatedAtMs,
},
]
: [],
agentType: task.tabs[0]?.agent === "Codex" ? "codex" : "claude",
prSubmitted: Boolean(task.pullRequest),
diffStat: task.fileChanges.length > 0 ? `+${task.fileChanges.length}/-${task.fileChanges.length}` : "+0/-0",
@@ -272,7 +305,7 @@
taskId: task.id,
branchName: task.branch,
title: task.title,
status: task.status === "archived" ? "archived" : "running",
status: mapWorkbenchTaskStatus(task),
updatedAt: task.updatedAtMs,
}));
},
@@ -462,6 +495,10 @@
return workbench.getSnapshot();
},
async getWorkbenchTask(_workspaceId: string, taskId: string) {
return requireTask(taskId);
},
subscribeWorkbench(_workspaceId: string, listener: () => void): () => void {
return workbench.subscribe(listener);
},
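The `mapWorkbenchTaskStatus` helper introduced above gives lifecycle errors precedence over the raw workbench status. A simplified standalone sketch (lifecycle fields flattened here for illustration):

```typescript
// Simplified sketch of the precedence implemented by mapWorkbenchTaskStatus:
// archived > lifecycle error > idle > provisioning ("new") > running.
type TaskStatus = "archived" | "error" | "idle" | "init_create_sandbox" | "running";

interface TaskLike {
  status: "archived" | "idle" | "new" | "running";
  lifecycleState?: "error" | "ready" | "starting";
  lifecycleCode?: TaskStatus;
}

function mapStatus(task: TaskLike): TaskStatus {
  if (task.status === "archived") return "archived";
  if (task.lifecycleState === "error") return "error";
  if (task.status === "idle") return "idle";
  if (task.status === "new") return task.lifecycleCode ?? "init_create_sandbox";
  return "running";
}

// A task that errored while provisioning reports "error", never "running".
const status = mapStatus({ status: "new", lifecycleState: "error" });
// status === "error"
```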

View file

@@ -267,6 +267,40 @@ export function removeFileTreePath(nodes: FileTreeNode[], targetPath: string): F
export function buildInitialTasks(): Task[] {
return [
// ── rivet-dev/sandbox-agent ──
{
id: "h0",
repoId: "sandbox-agent",
title: "Recover from sandbox session bootstrap timeout",
status: "idle",
lifecycle: {
code: "error",
state: "error",
label: "Session startup failed",
message: "createSession failed after 3 attempts: upstream 504 Gateway Timeout",
},
repoName: "rivet-dev/sandbox-agent",
updatedAtMs: minutesAgo(1),
branch: "fix/session-bootstrap-timeout",
pullRequest: null,
tabs: [
{
id: "t0",
sessionId: null,
sessionName: "Failed startup",
agent: "Claude",
model: "claude-sonnet-4",
status: "error",
thinkingSinceMs: null,
unread: false,
created: false,
draft: { text: "", attachments: [], updatedAtMs: null },
transcript: [],
},
],
fileChanges: [],
diffs: {},
fileTree: [],
},
{
id: "h1",
repoId: "sandbox-agent",

View file

@@ -3,7 +3,7 @@ import { describe, expect, it } from "vitest";
import type { HistoryEvent, RepoOverview } from "@sandbox-agent/foundry-shared";
import { createBackendClient } from "../../src/backend-client.js";
const RUN_FULL_E2E = process.env.HF_ENABLE_DAEMON_FULL_E2E === "1";
const DEFAULT_E2E_GITHUB_REPO = "rivet-dev/sandbox-agent-testing";
function requiredEnv(name: string): string {
const value = process.env[name]?.trim();
@@ -106,10 +106,10 @@ async function ensureRemoteBranchExists(token: string, fullName: string, branchN
}
describe("e2e(client): full integration stack workflow", () => {
it.skipIf(!RUN_FULL_E2E)("adds repo, loads branch graph, and executes a stack restack action", { timeout: 8 * 60_000 }, async () => {
it("adds repo, loads branch graph, and executes a stack restack action", { timeout: 8 * 60_000 }, async () => {
const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/api/rivet";
const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default";
const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO");
const repoRemote = process.env.HF_E2E_GITHUB_REPO?.trim() || DEFAULT_E2E_GITHUB_REPO;
const githubToken = requiredEnv("GITHUB_TOKEN");
const { fullName } = parseGithubRepo(repoRemote);
const normalizedRepoRemote = `https://github.com/${fullName}.git`;

View file

@@ -2,7 +2,7 @@ import { describe, expect, it } from "vitest";
import type { TaskRecord, HistoryEvent } from "@sandbox-agent/foundry-shared";
import { createBackendClient } from "../../src/backend-client.js";
const RUN_E2E = process.env.HF_ENABLE_DAEMON_E2E === "1";
const DEFAULT_E2E_GITHUB_REPO = "rivet-dev/sandbox-agent-testing";
function requiredEnv(name: string): string {
const value = process.env[name]?.trim();
@@ -143,10 +143,10 @@ async function githubApi(token: string, path: string, init?: RequestInit): Promi
}
describe("e2e: backend -> sandbox-agent -> git -> PR", () => {
it.skipIf(!RUN_E2E)("creates a task, waits for agent to implement, and opens a PR", { timeout: 15 * 60_000 }, async () => {
it("creates a task, waits for agent to implement, and opens a PR", { timeout: 15 * 60_000 }, async () => {
const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/api/rivet";
const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default";
const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO");
const repoRemote = process.env.HF_E2E_GITHUB_REPO?.trim() || DEFAULT_E2E_GITHUB_REPO;
const githubToken = requiredEnv("GITHUB_TOKEN");
const { fullName } = parseGithubRepo(repoRemote);

View file

@@ -4,7 +4,7 @@ import { describe, expect, it } from "vitest";
import type { TaskWorkbenchSnapshot, WorkbenchAgentTab, WorkbenchTask, WorkbenchModelId, WorkbenchTranscriptEvent } from "@sandbox-agent/foundry-shared";
import { createBackendClient } from "../../src/backend-client.js";
const RUN_WORKBENCH_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_E2E === "1";
const DEFAULT_E2E_GITHUB_REPO = "rivet-dev/sandbox-agent-testing";
const execFileAsync = promisify(execFile);
function requiredEnv(name: string): string {
@@ -144,10 +144,11 @@ function transcriptIncludesAgentText(transcript: WorkbenchTranscriptEvent[], exp
}
describe("e2e(client): workbench flows", () => {
it.skipIf(!RUN_WORKBENCH_E2E)("creates a task, adds sessions, exchanges messages, and manages workbench state", { timeout: 20 * 60_000 }, async () => {
it("creates a task, adds sessions, exchanges messages, and manages workbench state", { timeout: 20 * 60_000 }, async () => {
const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/api/rivet";
const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default";
const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO");
const repoRemote = process.env.HF_E2E_GITHUB_REPO?.trim() || DEFAULT_E2E_GITHUB_REPO;
requiredEnv("GITHUB_TOKEN");
const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-4o");
const runId = `wb-${Date.now().toString(36)}`;
const expectedFile = `${runId}.txt`;

View file

@@ -2,7 +2,7 @@ import { describe, expect, it } from "vitest";
import type { TaskWorkbenchSnapshot, WorkbenchAgentTab, WorkbenchTask, WorkbenchModelId, WorkbenchTranscriptEvent } from "@sandbox-agent/foundry-shared";
import { createBackendClient } from "../../src/backend-client.js";
const RUN_WORKBENCH_LOAD_E2E = process.env.HF_ENABLE_DAEMON_WORKBENCH_LOAD_E2E === "1";
const DEFAULT_E2E_GITHUB_REPO = "rivet-dev/sandbox-agent-testing";
function requiredEnv(name: string): string {
const value = process.env[name]?.trim();
@@ -174,10 +174,11 @@ async function measureWorkbenchSnapshot(
}
describe("e2e(client): workbench load", () => {
it.skipIf(!RUN_WORKBENCH_LOAD_E2E)("runs a simple sequential load profile against the real backend", { timeout: 30 * 60_000 }, async () => {
it("runs a simple sequential load profile against the real backend", { timeout: 30 * 60_000 }, async () => {
const endpoint = process.env.HF_E2E_BACKEND_ENDPOINT?.trim() || "http://127.0.0.1:7741/api/rivet";
const workspaceId = process.env.HF_E2E_WORKSPACE?.trim() || "default";
const repoRemote = requiredEnv("HF_E2E_GITHUB_REPO");
const repoRemote = process.env.HF_E2E_GITHUB_REPO?.trim() || DEFAULT_E2E_GITHUB_REPO;
requiredEnv("GITHUB_TOKEN");
const model = workbenchModelEnv("HF_E2E_MODEL", "gpt-4o");
const taskCount = intEnv("HF_LOAD_TASK_COUNT", 3);
const extraSessionCount = intEnv("HF_LOAD_EXTRA_SESSION_COUNT", 2);

View file

@@ -1,21 +1,20 @@
import { describe, expect, it } from "vitest";
import { taskKey, taskStatusSyncKey, historyKey, projectBranchSyncKey, projectKey, projectPrSyncKey, sandboxInstanceKey, workspaceKey } from "../src/keys.js";
import { githubStateKey, historyKey, organizationKey, repositoryKey, sandboxInstanceKey, taskKey, taskStatusSyncKey } from "../src/keys.js";
describe("actor keys", () => {
it("prefixes every key with workspace namespace", () => {
it("prefixes every key with organization namespace", () => {
const keys = [
workspaceKey("default"),
projectKey("default", "repo"),
organizationKey("default"),
repositoryKey("default", "repo"),
githubStateKey("default"),
taskKey("default", "repo", "task"),
sandboxInstanceKey("default", "daytona", "sbx"),
historyKey("default", "repo"),
projectPrSyncKey("default", "repo"),
projectBranchSyncKey("default", "repo"),
taskStatusSyncKey("default", "repo", "task", "sandbox-1", "session-1"),
];
for (const key of keys) {
expect(key[0]).toBe("ws");
expect(key[0]).toBe("org");
expect(key[1]).toBe("default");
}
});

View file

@@ -0,0 +1,8 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
include: ["test/**/*.test.ts"],
exclude: ["test/e2e/**/*.test.ts"],
},
});

View file

@@ -0,0 +1,7 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
include: ["test/e2e/**/*.test.ts"],
},
});

View file

@@ -28,6 +28,7 @@ import {
type ModelId,
} from "./mock-layout/view-model";
import { activeMockOrganization, useMockAppSnapshot } from "../lib/mock-app";
import { backendClient } from "../lib/backend";
import { getTaskWorkbenchClient } from "../lib/workbench";
function firstAgentTabId(task: Task): string | null {
@@ -63,6 +64,39 @@ function sanitizeActiveTabId(task: Task, tabId: string | null | undefined, openD
return openDiffs.length > 0 ? diffTabId(openDiffs[openDiffs.length - 1]!) : lastAgentTabId;
}
function resolvedTaskLifecycle(task: Task) {
return (
task.lifecycle ?? {
code: task.status === "running" ? "running" : task.status === "idle" ? "idle" : task.status === "archived" ? "archived" : "init_create_sandbox",
state: task.status === "running" || task.status === "idle" ? "ready" : task.status === "archived" ? "archived" : "starting",
label:
task.status === "running" ? "Agent running" : task.status === "idle" ? "Task idle" : task.status === "archived" ? "Task archived" : "Creating sandbox",
message: null,
}
);
}
function taskLifecycleAccent(task: Task): string {
switch (resolvedTaskLifecycle(task).state) {
case "error":
return "#ef4444";
case "starting":
return "#f59e0b";
case "ready":
return "#10b981";
case "archived":
case "killed":
return "#94a3b8";
default:
return "#94a3b8";
}
}
function shouldShowTaskLifecycle(task: Task): boolean {
const lifecycle = resolvedTaskLifecycle(task);
return lifecycle.state === "starting" || lifecycle.state === "error";
}
const TranscriptPanel = memo(function TranscriptPanel({
taskWorkbenchClient,
task,
@@ -445,6 +479,7 @@ const TranscriptPanel = memo(function TranscriptPanel({
activeAgentTab?.status === "running" && activeAgentTab.thinkingSinceMs !== null
? formatThinkingDuration(timerNowMs - activeAgentTab.thinkingSinceMs)
: null;
const lifecycle = resolvedTaskLifecycle(task);
return (
<SPanel>
@@ -470,6 +505,37 @@
onToggleRightSidebar={onToggleRightSidebar}
onNavigateToUsage={onNavigateToUsage}
/>
{shouldShowTaskLifecycle(task) ? (
<div
style={{
display: "flex",
flexDirection: "column",
gap: "4px",
padding: "10px 16px",
borderLeft: `3px solid ${taskLifecycleAccent(task)}`,
background: lifecycle.state === "error" ? "rgba(127, 29, 29, 0.35)" : "rgba(120, 53, 15, 0.28)",
borderBottom: "1px solid rgba(255, 255, 255, 0.08)",
}}
>
<div
style={{
display: "flex",
alignItems: "center",
gap: "8px",
fontSize: "12px",
fontWeight: 700,
letterSpacing: "0.04em",
textTransform: "uppercase",
}}
>
<span style={{ color: taskLifecycleAccent(task) }}>{lifecycle.label}</span>
<span style={{ opacity: 0.6 }}>{lifecycle.code}</span>
</div>
<div style={{ fontSize: "13px", color: "#e4e4e7" }}>
{lifecycle.message ?? (lifecycle.state === "starting" ? "Waiting for the sandbox and first session to come online." : "Task startup failed.")}
</div>
</div>
) : null}
<div
style={{
flex: 1,
@@ -530,7 +596,12 @@
}}
>
<h2 style={{ margin: 0, fontSize: "20px", fontWeight: 600 }}>Create the first session</h2>
<p style={{ margin: 0, opacity: 0.75 }}>Sessions are where you chat with the agent. Start one now to send the first prompt on this task.</p>
<p style={{ margin: 0, opacity: 0.75 }}>
{lifecycle.state === "starting"
? `Task startup is still in progress: ${lifecycle.label}.`
: "Sessions are where you chat with the agent. Start one now to send the first prompt on this task."}
</p>
{lifecycle.message ? <p style={{ margin: 0, fontSize: "13px", color: "#d4d4d8" }}>{lifecycle.message}</p> : null}
<button
type="button"
onClick={addTab}
@@ -814,6 +885,72 @@ interface MockLayoutProps {
selectedSessionId?: string | null;
}
function githubStatusPill(organization: ReturnType<typeof activeMockOrganization>) {
if (!organization) {
return null;
}
const label =
organization.github.installationStatus !== "connected"
? "GitHub disconnected"
: organization.github.syncStatus === "syncing"
? "GitHub syncing"
: organization.github.syncStatus === "error"
? "GitHub error"
: organization.github.syncStatus === "pending"
? "GitHub pending"
: "GitHub synced";
const colors =
organization.github.installationStatus !== "connected"
? { background: "rgba(255, 193, 7, 0.18)", color: "#ffe6a6" }
: organization.github.syncStatus === "syncing"
? { background: "rgba(24, 140, 255, 0.18)", color: "#b9d8ff" }
: organization.github.syncStatus === "error"
? { background: "rgba(255, 79, 0, 0.18)", color: "#ffd6c7" }
: { background: "rgba(46, 160, 67, 0.16)", color: "#b7f0c3" };
return (
<span
style={{
border: "1px solid rgba(255,255,255,0.08)",
borderRadius: "999px",
padding: "6px 10px",
background: colors.background,
color: colors.color,
fontSize: "12px",
fontWeight: 700,
}}
>
{label}
</span>
);
}
function actorRuntimePill(organization: ReturnType<typeof activeMockOrganization>) {
if (!organization || organization.runtime.status !== "error") {
return null;
}
const label = organization.runtime.errorCount === 1 ? "1 actor error" : `${organization.runtime.errorCount} actor errors`;
return (
<span
style={{
border: "1px solid rgba(255,255,255,0.08)",
borderRadius: "999px",
padding: "6px 10px",
background: "rgba(255, 79, 0, 0.2)",
color: "#ffd6c7",
fontSize: "12px",
fontWeight: 800,
}}
>
{label}
</span>
);
}
function MockWorkspaceOrgBar() {
const navigate = useNavigate();
const snapshot = useMockAppSnapshot();
@@ -834,6 +971,7 @@ function MockWorkspaceOrgBar() {
fontSize: "13px",
fontWeight: 600,
} satisfies React.CSSProperties;
const latestRuntimeIssue = organization.runtime.issues[0] ?? null;
return (
<div
@@ -851,6 +989,15 @@ function MockWorkspaceOrgBar() {
<strong style={{ fontSize: "14px", fontWeight: 600 }}>{organization.settings.displayName}</strong>
<span style={{ fontSize: "12px", color: t.textMuted }}>{organization.settings.primaryDomain}</span>
</div>
<div style={{ display: "flex", alignItems: "center", gap: "10px", flexWrap: "wrap" }}>
{actorRuntimePill(organization)}
{githubStatusPill(organization)}
<span style={{ fontSize: "12px", color: t.textMuted }}>
{organization.runtime.status === "error" && latestRuntimeIssue
? `${latestRuntimeIssue.scopeLabel}: ${latestRuntimeIssue.message}`
: organization.github.lastSyncLabel}
</span>
</div>
<div style={{ display: "flex", gap: "8px", flexWrap: "wrap" }}>
<button
type="button"
@@ -923,6 +1070,8 @@ export function MockLayout({ workspaceId, selectedTaskId, selectedSessionId }: M
const [lastAgentTabIdByTask, setLastAgentTabIdByTask] = useState<Record<string, string | null>>({});
const [openDiffsByTask, setOpenDiffsByTask] = useState<Record<string, string[]>>({});
const [selectedNewTaskRepoId, setSelectedNewTaskRepoId] = useState("");
const [activeTaskDetail, setActiveTaskDetail] = useState<Task | null>(null);
const [activeTaskDetailLoading, setActiveTaskDetailLoading] = useState(false);
const [leftWidth, setLeftWidth] = useState(() => readStoredWidth(LEFT_WIDTH_STORAGE_KEY, LEFT_SIDEBAR_DEFAULT_WIDTH));
const [rightWidth, setRightWidth] = useState(() => readStoredWidth(RIGHT_WIDTH_STORAGE_KEY, RIGHT_SIDEBAR_DEFAULT_WIDTH));
const leftWidthRef = useRef(leftWidth);
@@ -995,7 +1144,43 @@ export function MockLayout({ workspaceId, selectedTaskId, selectedSessionId }: M
startRightRef.current = rightWidthRef.current;
}, []);
const activeTask = useMemo(() => tasks.find((task) => task.id === selectedTaskId) ?? tasks[0] ?? null, [tasks, selectedTaskId]);
const activeTaskSummary = useMemo(() => tasks.find((task) => task.id === selectedTaskId) ?? tasks[0] ?? null, [tasks, selectedTaskId]);
const activeTask = useMemo(() => {
if (activeTaskSummary && activeTaskDetail?.id === activeTaskSummary.id) {
return activeTaskDetail;
}
return activeTaskSummary;
}, [activeTaskDetail, activeTaskSummary]);
useEffect(() => {
if (!activeTaskSummary) {
setActiveTaskDetail(null);
setActiveTaskDetailLoading(false);
return;
}
let cancelled = false;
setActiveTaskDetailLoading(true);
void backendClient
.getWorkbenchTask(workspaceId, activeTaskSummary.id)
.then((task) => {
if (cancelled) return;
setActiveTaskDetail(task as Task);
})
.catch((error) => {
if (cancelled) return;
console.error("failed to load active task detail", error);
setActiveTaskDetail(null);
})
.finally(() => {
if (cancelled) return;
setActiveTaskDetailLoading(false);
});
return () => {
cancelled = true;
};
}, [activeTaskSummary?.id, workspaceId, viewModel]);
useEffect(() => {
if (activeTask) {
@@ -1091,6 +1276,9 @@ export function MockLayout({ workspaceId, selectedTaskId, selectedSessionId }: M
if (!activeTask) {
return;
}
if (activeTaskDetailLoading) {
return;
}
if (activeTask.tabs.length > 0) {
autoCreatingSessionForTaskRef.current.delete(activeTask.id);
return;
@@ -1113,7 +1301,7 @@ export function MockLayout({ workspaceId, selectedTaskId, selectedSessionId }: M
autoCreatingSessionForTaskRef.current.delete(activeTask.id);
}
})();
}, [activeTask, selectedSessionId, syncRouteSession, taskWorkbenchClient]);
}, [activeTask, activeTaskDetailLoading, selectedSessionId, syncRouteSession, taskWorkbenchClient]);
const createTask = useCallback(() => {
void (async () => {


@@ -122,6 +122,24 @@ export const RightSidebar = memo(function RightSidebar({
observer.observe(node);
}, []);
const pullRequestUrl = task.pullRequest != null ? `https://github.com/${task.repoName}/pull/${task.pullRequest.number}` : null;
const pullRequestStatusLabel =
task.pullRequest?.status === "merged"
? "Merged"
: task.pullRequest?.status === "closed"
? "Closed"
: task.pullRequest?.status === "draft"
? "Draft"
: task.pullRequest?.status === "ready"
? "Open"
: null;
const pullRequestActionLabel =
task.pullRequest?.status === "merged"
? "Open merged PR"
: task.pullRequest?.status === "closed"
? "Open closed PR"
: pullRequestUrl
? "Open PR"
: "Publish PR";
const copyFilePath = useCallback(async (path: string) => {
try {
@@ -155,6 +173,38 @@ export const RightSidebar = memo(function RightSidebar({
<div ref={headerRef} className={css({ display: "flex", alignItems: "center", flex: 1, minWidth: 0, justifyContent: "flex-end", gap: "2px" })}>
{!isTerminal ? (
<div className={css({ display: "flex", alignItems: "center", gap: "2px", flexShrink: 1, minWidth: 0 })}>
{pullRequestStatusLabel ? (
<span
className={css({
display: "inline-flex",
alignItems: "center",
padding: compact ? "3px 6px" : "4px 8px",
borderRadius: "999px",
fontSize: "10px",
fontWeight: 600,
color:
task.pullRequest?.status === "merged"
? "#86efac"
: task.pullRequest?.status === "closed"
? "#fca5a5"
: task.pullRequest?.status === "draft"
? t.accent
: t.textSecondary,
backgroundColor:
task.pullRequest?.status === "merged"
? "rgba(22, 101, 52, 0.35)"
: task.pullRequest?.status === "closed"
? "rgba(127, 29, 29, 0.35)"
: task.pullRequest?.status === "draft"
? "rgba(154, 52, 18, 0.28)"
: t.interactiveHover,
whiteSpace: "nowrap",
flexShrink: 0,
})}
>
{pullRequestStatusLabel}
</span>
) : null}
<button
onClick={() => {
if (pullRequestUrl) {
@@ -188,7 +238,7 @@ export const RightSidebar = memo(function RightSidebar({
})}
>
<GitPullRequest size={12} style={{ flexShrink: 0 }} />
{!compact && <span>{pullRequestUrl ? "Open PR" : "Publish PR"}</span>}
{!compact && <span>{pullRequestActionLabel}</span>}
</button>
<button
className={css({
