Initial companion-offload skill for Claude Code

Commit ec7919c4ba by Harivansh Rathi, 2026-03-09 23:14:32 -07:00
4 changed files with 352 additions and 0 deletions

.gitignore (new file, 7 lines)
node_modules/
.env
.companion_env
__pycache__/
*.pyc
.venv/
.DS_Store

README.md (new file, 62 lines)
# Companion Offload Skill
A [skills.sh](https://skills.sh) skill for Claude Code that offloads tasks to a Companion sandbox in the cloud.
## What it does
When you invoke this skill, Claude will:
1. Use `companion ssh` to connect to your sandbox
2. Sync your entire project, environment variables, and Claude session history
3. Install Claude Code on the sandbox if needed
4. Start a Claude Code session with `--dangerously-skip-permissions` so it runs autonomously
The remote session picks up right where you left off, with full context and your full environment.
## Install
```bash
npx skills add getcompanion-ai/companion-skill
```
## Usage
Inside a Claude Code session, tell Claude to offload:
- "offload this task to my companion"
- "continue this in the cloud"
- "run this on my sandbox"
Or use the standalone script:
```bash
./companion-offload/scripts/offload.sh "finish implementing the API"
```
## Prerequisites
- `companion` CLI installed: `npm install -g @getcompanion/cli`
- Logged in: `companion login`
- A running sandbox: `companion create <handle>`
- `rsync` available locally
## Remote workspace
Files are synced to the first of these paths that exists on the sandbox (the first is created if neither exists):
- `/home/node/.openclaw/workspace`
- `/home/node/.pi/workspace`
## Reconnecting
After offloading, reconnect to the remote session:
```bash
companion ssh
cd /home/node/.openclaw/workspace
claude --continue
```
## License
MIT

companion-offload/SKILL.md (new file, 169 lines)
---
name: companion-offload
description: Offload a Claude Code task to a remote Companion sandbox via SSH. Use when the user wants to continue a task in the cloud, run something remotely, or offload work to their Companion sandbox. Triggers on "offload", "run in cloud", "companion run", "remote task", or "send to sandbox".
compatibility: Requires the companion CLI (npm install -g @getcompanion/cli), an active Companion sandbox, and rsync. Designed for Claude Code.
allowed-tools: Bash(companion:*), Bash(rsync:*), Bash(scp:*), Bash(ssh:*), Bash(cat:*), Bash(env:*), Bash(printf:*), Bash(chmod:*), Bash(rm:*), Read
---
# Companion Offload
Offload the current Claude Code session to a remote Companion sandbox so it can continue running autonomously in the cloud.
## When to use
- The user says "offload this task" or "run this in the cloud"
- A long-running task needs to continue without tying up the local machine
- The user wants to hand off work to their Companion sandbox
## Prerequisites
- The `companion` CLI must be installed and the user must be logged in (`companion login`)
- An active sandbox must exist and be running (`companion ls` to check)
- `rsync` must be available locally
## Instructions
### Step 1: Verify companion CLI is available and user is logged in
```bash
companion whoami
```
If this fails, tell the user to run `companion login` first.
### Step 2: Check that a sandbox is running
```bash
companion ls
```
If no sandbox exists or it's not running, tell the user to create one with `companion create <handle>`.
### Step 3: Inject SSH key into the sandbox
Use `--inject-only` to set up SSH access without connecting:
```bash
companion ssh --inject-only
```
This outputs the SSH key path and the manual SSH command. Parse the output to extract:
- The key path (e.g. `~/.companion/ssh/id_ed25519_<handle>`)
- The sandbox handle
### Step 4: Build the SSH/rsync connection parameters
The Companion SSH gateway uses a ProxyCommand. Because the ProxyCommand contains spaces, write it to a small wrapper script; that keeps `$SSH_OPTS` free of embedded quoting, so the plain `ssh $SSH_OPTS` and `rsync -e "ssh $SSH_OPTS"` invocations in the steps below expand correctly:
```bash
HANDLE="<sandbox-handle-from-step-2>"
KEY_PATH="$HOME/.companion/ssh/id_ed25519_$HANDLE"
SSH_GATEWAY="${COMPANION_SSH_GATEWAY:-ssh.os.companion.ai}"
SSH_PORT="2222"
# Wrapper script for the gateway framing (assumes no spaces in $HOME).
# A ProxyCommand with spaces would break when $SSH_OPTS is expanded unquoted.
PROXY_WRAPPER="$HOME/.companion/ssh/proxy_$HANDLE.sh"
printf '#!/bin/sh\n( printf "COMPANION:%s\\n"; cat ) | nc %s %s\n' \
  "$HANDLE" "$SSH_GATEWAY" "$SSH_PORT" > "$PROXY_WRAPPER"
chmod +x "$PROXY_WRAPPER"
SSH_OPTS="-i $KEY_PATH -o StrictHostKeyChecking=accept-new -o LogLevel=ERROR -o ProxyCommand=$PROXY_WRAPPER"
```
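The `COMPANION:<handle>` framing that the ProxyCommand performs can be sketched locally. This is purely illustrative: the handle is hypothetical, and `cat` stands in for the `nc` connection to the real gateway:

```shell
# Illustrative sketch of the gateway framing (hypothetical handle; `cat`
# stands in for `nc <gateway> <port>`). The gateway reads one routing line,
# then splices the rest of the stream through to the sandbox's sshd.
HANDLE="demo-sandbox"
frame() { ( printf "COMPANION:%s\n" "$HANDLE"; cat ) | cat; }
printf 'SSH-2.0-client-banner\n' | frame
# → COMPANION:demo-sandbox
# → SSH-2.0-client-banner
```

Everything the local `ssh` writes reaches the gateway only after that single routing line, which is what lets one gateway endpoint route connections to the right sandbox.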
### Step 5: Determine the remote workspace path
```bash
ssh $SSH_OPTS node@$HANDLE '[ -d /home/node/.openclaw/workspace ] && echo "/home/node/.openclaw/workspace" || ([ -d /home/node/.pi/workspace ] && echo "/home/node/.pi/workspace" || echo "NONE")'
```
If `NONE`, use `/home/node/.openclaw/workspace` and create it:
```bash
ssh $SSH_OPTS node@$HANDLE 'mkdir -p /home/node/.openclaw/workspace'
```
Store the result as `REMOTE_WORKSPACE`.
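The fallback logic in this step, shorn of SSH, is just "first existing directory wins, otherwise create the default". A local sketch, with throwaway temp directories standing in for the sandbox paths:

```shell
# Pick the first candidate directory that exists; otherwise create and
# use the first one. Mirrors the Step 5 fallback, but runs locally.
pick_workspace() {
  for d in "$@"; do
    [ -d "$d" ] && { echo "$d"; return; }
  done
  mkdir -p "$1" && echo "$1"
}

root=$(mktemp -d)                      # stand-in for the sandbox filesystem
mkdir -p "$root/.pi/workspace"
pick_workspace "$root/.openclaw/workspace" "$root/.pi/workspace"
# → prints the .pi path, since .openclaw does not exist under $root
```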
### Step 6: Install Claude Code on the sandbox if missing
```bash
ssh $SSH_OPTS node@$HANDLE 'command -v claude >/dev/null 2>&1 && echo "installed" || echo "missing"'
```
If missing:
```bash
ssh $SSH_OPTS node@$HANDLE 'curl -fsSL https://claude.ai/install.sh | sh 2>/dev/null || npm install -g @anthropic-ai/claude-code'
```
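The install guard is the standard `command -v` check-then-install pattern; demonstrated here with commands that exist on any POSIX system rather than with `claude` itself:

```shell
# check <name>: prints "installed" if <name> is on PATH, else "missing".
check() { command -v "$1" >/dev/null 2>&1 && echo "installed" || echo "missing"; }
check sh                           # → installed
check definitely-not-a-real-tool   # → missing
```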
### Step 7: Sync the repository to the sandbox
Use rsync over the SSH ProxyCommand to transfer the current project:
```bash
rsync -avz --progress \
--exclude='node_modules' \
--exclude='.git/objects' \
--exclude='__pycache__' \
--exclude='.venv' \
--exclude='venv' \
--exclude='.next' \
--exclude='dist' \
--exclude='build' \
-e "ssh $SSH_OPTS" \
"$(pwd)/" "node@$HANDLE:$REMOTE_WORKSPACE/"
```
### Step 8: Sync the Claude session history
Transfer the local `.claude/` directory (session history, settings, memory) so the remote Claude session has full context:
```bash
rsync -avz --progress \
-e "ssh $SSH_OPTS" \
"$(pwd)/.claude/" "node@$HANDLE:$REMOTE_WORKSPACE/.claude/"
```
### Step 9: Transfer environment variables
Capture relevant env vars (especially `ANTHROPIC_API_KEY`), single-quote each value so the file can be safely sourced on the sandbox, and send it over:
```bash
env | grep -v '^_=' | grep -v '^SHELL=' | grep -v '^TERM_' | grep -v '^SSH_' | \
  grep -v '^DISPLAY=' | grep -v '^HOME=' | grep -v '^USER=' | grep -v '^LOGNAME=' | \
  grep -v '^PATH=' | grep -v '^PWD=' | grep -v '^OLDPWD=' | grep -v '^SHLVL=' | \
  grep '^[A-Za-z_][A-Za-z0-9_]*=' | \
  sed "s/'/'\\\\''/g; s/=/='/; s/\$/'/" \
  > /tmp/companion_env_export.txt
rsync -avz -e "ssh $SSH_OPTS" /tmp/companion_env_export.txt "node@$HANDLE:$REMOTE_WORKSPACE/.companion_env"
rm -f /tmp/companion_env_export.txt
```
The quoting step matters: an unquoted value containing spaces or shell metacharacters would word-split, or even execute, when the file is sourced in Step 10.
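Why the env file needs care: a raw `env` line whose value contains spaces or metacharacters misbehaves when sourced. A minimal sketch of single-quoting the values (the `quote_env` helper name is ours, for illustration only):

```shell
# quote_env: single-quote everything after the first "=", escaping any
# single quotes already in the value, so each line sources verbatim.
quote_env() { sed "s/'/'\\\\''/g; s/=/='/; s/\$/'/"; }

f=$(mktemp)
printf '%s\n' "GREETING=hello world" "QUOTED=it's fine" | quote_env > "$f"
cat "$f"
# → GREETING='hello world'
# → QUOTED='it'\''s fine'
. "$f" && printf '%s\n' "$GREETING"
# → hello world
```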
### Step 10: Launch Claude Code on the sandbox
If the user provided a specific task/prompt to offload, pass it as Claude Code's positional prompt argument (there is no `--prompt` flag):
```bash
ssh -t $SSH_OPTS node@$HANDLE "cd $REMOTE_WORKSPACE && set -a && source .companion_env 2>/dev/null; set +a && claude --dangerously-skip-permissions '${TASK_PROMPT}'"
```
Otherwise, continue the session:
```bash
ssh -t $SSH_OPTS node@$HANDLE "cd $REMOTE_WORKSPACE && set -a && source .companion_env 2>/dev/null; set +a && claude --dangerously-skip-permissions --continue"
```
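The `set -a … set +a` bracket is what turns the plain `NAME=value` lines into exported variables that the launched `claude` process inherits. A self-contained illustration:

```shell
# allexport demo: while `set -a` is active, every assignment made by
# sourcing the file is exported to child processes automatically.
envfile=$(mktemp)
echo "DEMO_VAR='from companion env'" > "$envfile"
set -a; . "$envfile"; set +a
sh -c 'printf "%s\n" "$DEMO_VAR"'
# → from companion env
```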
### Step 11: Confirm to the user
After launching, tell the user:
- The session is now running on their Companion sandbox
- They can reconnect with: `companion ssh` then `cd $REMOTE_WORKSPACE && claude --continue`
- To sync results back locally: `rsync -avz -e "ssh $SSH_OPTS" node@$HANDLE:$REMOTE_WORKSPACE/ ./`
## Important notes
- Always exclude `node_modules`, `.git/objects`, and other large directories from rsync
- The `--dangerously-skip-permissions` flag is required so the remote session runs autonomously
- If the sandbox already has `ANTHROPIC_API_KEY` set, the offloaded session will use it. Otherwise ensure Step 9 transfers it.
- Session history in `.claude/` gives the remote Claude full context of the local conversation
- The SSH gateway at `ssh.os.companion.ai:2222` routes connections via the `COMPANION:<handle>` prefix
## Error handling
- If `companion whoami` fails: user needs to `companion login`
- If `companion ls` shows no sandbox: user needs to `companion create <handle>`
- If SSH key injection fails: sandbox may not be running
- If rsync fails: check disk space on sandbox
- If Claude is not installable: ensure Node.js >= 18 is on the sandbox

companion-offload/scripts/offload.sh (new file, 114 lines)
#!/usr/bin/env bash
# Companion Offload Script
# Syncs local Claude Code environment to a Companion sandbox and starts a Claude session there.
#
# Usage: ./offload.sh [task_prompt]
# Example: ./offload.sh "finish implementing the API endpoint"
#
# Prerequisites:
# - companion CLI installed and logged in (companion login)
# - A running sandbox (companion ls / companion create <handle>)
# - rsync available locally
set -euo pipefail
TASK_PROMPT="${1:-}"
LOCAL_DIR="$(pwd)"
# --- Check companion CLI ---
if ! command -v companion >/dev/null 2>&1; then
  echo "[offload] ERROR: companion CLI not found. Install with: npm install -g @getcompanion/cli"
  exit 1
fi
echo "[offload] Checking authentication..."
companion whoami || { echo "[offload] ERROR: Not logged in. Run 'companion login' first."; exit 1; }
# --- Get sandbox info ---
echo "[offload] Checking sandbox status..."
SANDBOX_OUTPUT=$(companion ls 2>&1) || { echo "[offload] ERROR: No sandbox found. Create one with: companion create <handle>"; exit 1; }
echo "$SANDBOX_OUTPUT"
# Extract handle from companion ls output
HANDLE=$(echo "$SANDBOX_OUTPUT" | grep -i "handle:" | head -1 | awk '{print $NF}' | tr -d '[:space:]')
if [ -z "$HANDLE" ]; then
  echo "[offload] ERROR: Could not determine sandbox handle from 'companion ls' output."
  exit 1
fi
echo "[offload] Sandbox handle: $HANDLE"
# --- Inject SSH key ---
echo "[offload] Setting up SSH key..."
companion ssh --inject-only 2>&1 || { echo "[offload] ERROR: Failed to inject SSH key."; exit 1; }
# --- Build SSH connection parameters ---
KEY_PATH="$HOME/.companion/ssh/id_ed25519_$HANDLE"
SSH_GATEWAY="${COMPANION_SSH_GATEWAY:-ssh.os.companion.ai}"
SSH_PORT="2222"
# The gateway ProxyCommand contains spaces, so write it to a wrapper script;
# this keeps $SSH_CMD / $RSYNC_SSH safe to expand unquoted below
# (assumes no spaces in $HOME).
PROXY_WRAPPER="$HOME/.companion/ssh/proxy_$HANDLE.sh"
printf '#!/bin/sh\n( printf "COMPANION:%s\\n"; cat ) | nc %s %s\n' \
  "$HANDLE" "$SSH_GATEWAY" "$SSH_PORT" > "$PROXY_WRAPPER"
chmod +x "$PROXY_WRAPPER"
SSH_CMD="ssh -i $KEY_PATH -o StrictHostKeyChecking=accept-new -o LogLevel=ERROR -o ProxyCommand=$PROXY_WRAPPER"
RSYNC_SSH="$SSH_CMD"
# --- Determine remote workspace ---
echo "[offload] Checking remote workspace..."
# No eval here: eval would re-parse the quoted remote command and run its
# `&&`/`||` operators locally instead of on the sandbox.
REMOTE_WORKSPACE=$($SSH_CMD node@$HANDLE '[ -d /home/node/.openclaw/workspace ] && echo "/home/node/.openclaw/workspace" || ([ -d /home/node/.pi/workspace ] && echo "/home/node/.pi/workspace" || echo "")' 2>/dev/null || echo "")
if [ -z "$REMOTE_WORKSPACE" ]; then
  REMOTE_WORKSPACE="/home/node/.openclaw/workspace"
  echo "[offload] Creating workspace at $REMOTE_WORKSPACE"
  $SSH_CMD node@$HANDLE "mkdir -p $REMOTE_WORKSPACE"
fi
echo "[offload] Remote workspace: $REMOTE_WORKSPACE"
# --- Install Claude if needed ---
echo "[offload] Checking for Claude Code on sandbox..."
HAS_CLAUDE=$($SSH_CMD node@$HANDLE 'command -v claude >/dev/null 2>&1 && echo "yes" || echo "no"' 2>/dev/null)
if [ "$HAS_CLAUDE" = "no" ]; then
  echo "[offload] Installing Claude Code on sandbox..."
  $SSH_CMD node@$HANDLE 'curl -fsSL https://claude.ai/install.sh | sh 2>/dev/null || npm install -g @anthropic-ai/claude-code' || {
    echo "[offload] ERROR: Could not install Claude Code on sandbox."
    exit 1
  }
fi
# --- Sync project ---
echo "[offload] Syncing project to sandbox..."
rsync -avz --progress \
--exclude='node_modules' \
--exclude='.git/objects' \
--exclude='__pycache__' \
--exclude='.venv' \
--exclude='venv' \
--exclude='.next' \
--exclude='dist' \
--exclude='build' \
-e "$RSYNC_SSH" \
"$LOCAL_DIR/" "node@$HANDLE:$REMOTE_WORKSPACE/"
# --- Sync Claude session ---
if [ -d "$LOCAL_DIR/.claude" ]; then
  echo "[offload] Syncing Claude session data..."
  rsync -avz --progress \
    -e "$RSYNC_SSH" \
    "$LOCAL_DIR/.claude/" "node@$HANDLE:$REMOTE_WORKSPACE/.claude/"
fi
# --- Transfer env vars ---
echo "[offload] Transferring environment variables..."
ENV_TMP=$(mktemp)
# Single-quote each value so the file sources safely on the sandbox;
# unquoted values with spaces or metacharacters would word-split or execute.
env | grep -v '^_=' | grep -v '^SHELL=' | grep -v '^TERM_' | grep -v '^SSH_' | \
  grep -v '^DISPLAY=' | grep -v '^HOME=' | grep -v '^USER=' | grep -v '^LOGNAME=' | \
  grep -v '^PATH=' | grep -v '^PWD=' | grep -v '^OLDPWD=' | grep -v '^SHLVL=' | \
  grep '^[A-Za-z_][A-Za-z0-9_]*=' | \
  sed "s/'/'\\\\''/g; s/=/='/; s/\$/'/" \
  > "$ENV_TMP" 2>/dev/null || true
rsync -avz -e "$RSYNC_SSH" "$ENV_TMP" "node@$HANDLE:$REMOTE_WORKSPACE/.companion_env" 2>/dev/null
rm -f "$ENV_TMP"
# --- Launch Claude ---
echo "[offload] Launching Claude Code on sandbox..."
echo ""
if [ -n "$TASK_PROMPT" ]; then
  # Pass the task as Claude Code's positional prompt argument (no --prompt flag);
  # no eval, so the quoted remote command reaches the sandbox as one string.
  $SSH_CMD -t node@$HANDLE "cd $REMOTE_WORKSPACE && set -a && source .companion_env 2>/dev/null; set +a && claude --dangerously-skip-permissions '$TASK_PROMPT'"
else
  $SSH_CMD -t node@$HANDLE "cd $REMOTE_WORKSPACE && set -a && source .companion_env 2>/dev/null; set +a && claude --dangerously-skip-permissions --continue"
fi