refactor: finish companion rename migration

Complete the remaining pi-to-companion rename across companion-os, web, vm-orchestrator, docker, and archived fixtures.

Verification:
- semantic rg sweeps for Pi/piConfig/getPi/.pi runtime references
- npm run check in apps/companion-os (fails in this worktree: biome not found)

Co-authored-by: Codex <noreply@openai.com>
Author: Harivansh Rathi
Date: 2026-03-10 07:39:32 -05:00
parent e8fe3d54af
commit 536241053c
303 changed files with 3603 additions and 3602 deletions


@@ -46,7 +46,7 @@
 ### Fixed
-- Fixed `continue()` to resume queued steering/follow-up messages when context currently ends in an assistant message, and preserved one-at-a-time steering ordering during assistant-tail resumes ([#1312](https://github.com/badlogic/pi-mono/pull/1312) by [@ferologics](https://github.com/ferologics))
+- Fixed `continue()` to resume queued steering/follow-up messages when context currently ends in an assistant message, and preserved one-at-a-time steering ordering during assistant-tail resumes ([#1312](https://github.com/badlogic/companion-mono/pull/1312) by [@ferologics](https://github.com/ferologics))
 
 ## [0.52.6] - 2026-02-05
@@ -82,7 +82,7 @@
 ### Added
-- Added `maxRetryDelayMs` option to `AgentOptions` to cap server-requested retry delays. Passed through to the underlying stream function. ([#1123](https://github.com/badlogic/pi-mono/issues/1123))
+- Added `maxRetryDelayMs` option to `AgentOptions` to cap server-requested retry delays. Passed through to the underlying stream function. ([#1123](https://github.com/badlogic/companion-mono/issues/1123))
 
 ## [0.50.7] - 2026-01-31
@@ -158,7 +158,7 @@
 ### Added
-- `thinkingBudgets` option on `Agent` and `AgentOptions` to customize token budgets per thinking level ([#529](https://github.com/badlogic/pi-mono/pull/529) by [@melihmucuk](https://github.com/melihmucuk))
+- `thinkingBudgets` option on `Agent` and `AgentOptions` to customize token budgets per thinking level ([#529](https://github.com/badlogic/companion-mono/pull/529) by [@melihmucuk](https://github.com/melihmucuk))
 
 ## [0.37.8] - 2026-01-07
@@ -208,7 +208,7 @@
 ### Breaking Changes
-- **Queue API replaced with steer/followUp**: The `queueMessage()` method has been split into two methods with different delivery semantics ([#403](https://github.com/badlogic/pi-mono/issues/403)):
+- **Queue API replaced with steer/followUp**: The `queueMessage()` method has been split into two methods with different delivery semantics ([#403](https://github.com/badlogic/companion-mono/issues/403)):
   - `steer(msg)`: Interrupts the agent mid-run. Delivered after current tool execution, skips remaining tools.
   - `followUp(msg)`: Waits until the agent finishes. Delivered only when there are no more tool calls or steering messages.
 - **Queue mode renamed**: `queueMode` option renamed to `steeringMode`. Added new `followUpMode` option. Both control whether messages are delivered one-at-a-time or all at once.
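The steer/followUp delivery rules in the changelog entry above can be sketched as a small local model. This is an illustrative simulation of the described semantics only, not the real `Agent` implementation; all names here are hypothetical.

```typescript
// Illustrative model of the delivery semantics: steer() interrupts after
// the current tool call, followUp() waits until the run is otherwise idle.
type Queued = { kind: "steer" | "followUp"; text: string };

class DeliveryModel {
  private steering: string[] = [];
  private followUps: string[] = [];

  steer(text: string): void { this.steering.push(text); }
  followUp(text: string): void { this.followUps.push(text); }

  // Checked after each tool execution; a steering message skips the
  // remaining tool calls for this turn (one-at-a-time by default).
  nextAfterTool(): Queued | undefined {
    const text = this.steering.shift();
    return text === undefined ? undefined : { kind: "steer", text };
  }

  // Checked only when there are no more tool calls and no steering left.
  nextWhenIdle(): Queued | undefined {
    const interrupt = this.nextAfterTool();
    if (interrupt) return interrupt;
    const text = this.followUps.shift();
    return text === undefined ? undefined : { kind: "followUp", text };
  }
}

const q = new DeliveryModel();
q.followUp("summarize when done");
q.steer("switch to the staging config");
console.log(q.nextAfterTool()?.kind); // "steer" -- interrupts mid-run
console.log(q.nextWhenIdle()?.kind);  // "followUp" -- delivered at the end
```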
@@ -241,11 +241,11 @@
 - **`UserMessageWithAttachments` and `Attachment` types removed**: Attachment handling is now the responsibility of the `convertToLlm` function.
-- **Agent loop moved from `@mariozechner/pi-ai`**: The `agentLoop`, `agentLoopContinue`, and related types have moved to this package. Import from `@mariozechner/pi-agent-core` instead.
+- **Agent loop moved from `@mariozechner/companion-ai`**: The `agentLoop`, `agentLoopContinue`, and related types have moved to this package. Import from `@mariozechner/companion-agent-core` instead.
 
 ### Added
 
-- `streamFn` option on `Agent` for custom stream implementations. Default uses `streamSimple` from pi-ai.
+- `streamFn` option on `Agent` for custom stream implementations. Default uses `streamSimple` from companion-ai.
 - `streamProxy()` utility function for browser apps that need to proxy LLM calls through a backend server. Replaces the removed `AppTransport`.
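The `streamFn` / `streamProxy` split described above can be sketched as a pluggable function slot. The signature and names below are assumptions for illustration; the real types live in `companion-agent-core`.

```typescript
// Illustrative sketch: a stream function is just an async-iterable
// producer, so a direct provider call and a backend proxy are drop-in
// replacements for each other. Not the real API.
type StreamFn = (model: string, prompt: string) => AsyncIterable<string>;

// "Direct" variant, stubbed instead of calling a provider.
const direct: StreamFn = async function* (model, prompt) {
  yield `[${model}] `;
  yield prompt.toUpperCase();
};

// Browser variant: forward through a backend, as streamProxy() is
// described to do in the changelog.
function makeProxy(post: (body: string) => Promise<string>): StreamFn {
  return async function* (model, prompt) {
    yield await post(JSON.stringify({ model, prompt }));
  };
}

async function collect(fn: StreamFn, model: string, prompt: string) {
  let out = "";
  for await (const chunk of fn(model, prompt)) out += chunk;
  return out;
}

collect(direct, "demo-model", "hi").then(console.log); // "[demo-model] HI"
```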


@@ -1,18 +1,18 @@
-# @mariozechner/pi-agent-core
+# @mariozechner/companion-agent-core
 
-Stateful agent with tool execution and event streaming. Built on `@mariozechner/pi-ai`.
+Stateful agent with tool execution and event streaming. Built on `@mariozechner/companion-ai`.
 
 ## Installation
 
 ```bash
-npm install @mariozechner/pi-agent-core
+npm install @mariozechner/companion-agent-core
 ```
 
 ## Quick Start
 
 ```typescript
-import { Agent } from "@mariozechner/pi-agent-core";
-import { getModel } from "@mariozechner/pi-ai";
+import { Agent } from "@mariozechner/companion-agent-core";
+import { getModel } from "@mariozechner/companion-ai";
 
 const agent = new Agent({
 	initialState: {
@@ -298,7 +298,7 @@ Follow-up messages are checked only when there are no more tool calls and no ste
 Extend `AgentMessage` via declaration merging:
 
 ```typescript
-declare module "@mariozechner/pi-agent-core" {
+declare module "@mariozechner/companion-agent-core" {
 	interface CustomAgentMessages {
 		notification: { role: "notification"; text: string; timestamp: number };
 	}
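The declaration-merging pattern in the hunk above can be sketched locally without the package module; interface merging behaves the same way. All names below are illustrative stand-ins.

```typescript
// Two declarations of the same interface merge into one -- the same
// mechanism `declare module` uses to extend CustomAgentMessages.
interface CustomMessagesSketch {
  log: { role: "log"; text: string };
}
interface CustomMessagesSketch {
  notification: { role: "notification"; text: string; timestamp: number };
}

// Union over all registered entries, analogous to how the merged
// entries would surface on the package's message type.
type MessageSketch = CustomMessagesSketch[keyof CustomMessagesSketch];

const note: MessageSketch = {
  role: "notification",
  text: "build finished",
  timestamp: Date.now(),
};
console.log(note.role); // "notification"
```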
@@ -378,7 +378,7 @@ Thrown errors are caught by the agent and reported to the LLM as tool errors wit
 For browser apps that proxy through a backend:
 
 ```typescript
-import { Agent, streamProxy } from "@mariozechner/pi-agent-core";
+import { Agent, streamProxy } from "@mariozechner/companion-agent-core";
 
 const agent = new Agent({
 	streamFn: (model, context, options) =>
@@ -395,7 +395,7 @@ const agent = new Agent({
 For direct control without the Agent class:
 
 ```typescript
-import { agentLoop, agentLoopContinue } from "@mariozechner/pi-agent-core";
+import { agentLoop, agentLoopContinue } from "@mariozechner/companion-agent-core";
 
 const context: AgentContext = {
 	systemPrompt: "You are helpful.",


@@ -1,5 +1,5 @@
 {
-	"name": "@mariozechner/pi-agent-core",
+	"name": "@mariozechner/companion-agent-core",
 	"version": "0.56.2",
 	"description": "General-purpose agent with transport abstraction, state management, and attachment support",
 	"type": "module",
@@ -17,7 +17,7 @@
 		"prepublishOnly": "npm run clean && npm run build"
 	},
 	"dependencies": {
-		"@mariozechner/pi-ai": "^0.56.2"
+		"@mariozechner/companion-ai": "^0.56.2"
 	},
 	"keywords": [
 		"ai",


@@ -10,7 +10,7 @@ import {
 	streamSimple,
 	type ToolResultMessage,
 	validateToolArguments,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 import type {
 	AgentContext,
 	AgentEvent,


@@ -12,7 +12,7 @@ import {
 	type TextContent,
 	type ThinkingBudgets,
 	type Transport,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 import { agentLoop, agentLoopContinue } from "./agent-loop.js";
 import type {
 	AgentContext,


@@ -14,7 +14,7 @@ import {
 	type SimpleStreamOptions,
 	type StopReason,
 	type ToolCall,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 
 // Create stream class matching ProxyMessageEventStream
 class ProxyMessageEventStream extends EventStream<


@@ -8,7 +8,7 @@ import type {
 	TextContent,
 	Tool,
 	ToolResultMessage,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 import type { Static, TSchema } from "@sinclair/typebox";
 
 /** Stream function - can return sync or Promise for async config lookup */


@@ -5,7 +5,7 @@ import {
 	type Message,
 	type Model,
 	type UserMessage,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 import { Type } from "@sinclair/typebox";
 import { describe, expect, it } from "vitest";
 import { agentLoop, agentLoopContinue } from "../src/agent-loop.js";


@@ -3,7 +3,7 @@ import {
 	type AssistantMessageEvent,
 	EventStream,
 	getModel,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
 import { describe, expect, it } from "vitest";
 import { Agent } from "../src/index.js";


@@ -9,7 +9,7 @@
  *
  * You can run this test suite with:
  * ```bash
- * $ AWS_REGION=us-east-1 BEDROCK_EXTENSIVE_MODEL_TEST=1 AWS_PROFILE=pi npm test -- ./test/bedrock-models.test.ts
+ * $ AWS_REGION=us-east-1 BEDROCK_EXTENSIVE_MODEL_TEST=1 AWS_PROFILE=companion npm test -- ./test/bedrock-models.test.ts
  * ```
  *
  * ## Known Issues by Category
@@ -21,8 +21,8 @@
  * 5. **Invalid Signature Format**: Model validates signature format (Anthropic newer models).
  */
-import type { AssistantMessage } from "@mariozechner/pi-ai";
-import { getModels } from "@mariozechner/pi-ai";
+import type { AssistantMessage } from "@mariozechner/companion-ai";
+import { getModels } from "@mariozechner/companion-ai";
 import { describe, expect, it } from "vitest";
 import { Agent } from "../src/index.js";
 import { hasBedrockCredentials } from "./bedrock-utils.js";


@@ -3,8 +3,8 @@ import type {
 	Model,
 	ToolResultMessage,
 	UserMessage,
-} from "@mariozechner/pi-ai";
-import { getModel } from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
+import { getModel } from "@mariozechner/companion-ai";
 import { describe, expect, it } from "vitest";
 import { Agent } from "../src/index.js";
 import { hasBedrockCredentials } from "./bedrock-utils.js";