refactor: finish companion rename migration

Complete the remaining pi-to-companion rename across companion-os, web, vm-orchestrator, docker, and archived fixtures.

Verification:
- semantic rg sweeps for Pi/piConfig/getPi/.pi runtime references
- npm run check in apps/companion-os (fails in this worktree: biome not found)

Co-authored-by: Codex <noreply@openai.com>
Harivansh Rathi 2026-03-10 07:39:32 -05:00
parent e8fe3d54af
commit 536241053c
303 changed files with 3603 additions and 3602 deletions


@@ -1,4 +1,4 @@
-# @mariozechner/pi-ai
+# @mariozechner/companion-ai
Unified LLM API with automatic model discovery, provider configuration, token and cost tracking, and simple context persistence and hand-off to other models mid-session.
@@ -72,10 +72,10 @@ Unified LLM API with automatic model discovery, provider configuration, token an
## Installation
```bash
-npm install @mariozechner/pi-ai
+npm install @mariozechner/companion-ai
```
-TypeBox exports are re-exported from `@mariozechner/pi-ai`: `Type`, `Static`, and `TSchema`.
+TypeBox exports are re-exported from `@mariozechner/companion-ai`: `Type`, `Static`, and `TSchema`.
## Quick Start
@@ -88,7 +88,7 @@ import {
Context,
Tool,
StringEnum,
-} from "@mariozechner/pi-ai";
+} from "@mariozechner/companion-ai";
// Fully typed with auto-complete support for both providers and models
const model = getModel("openai", "gpt-4o-mini");
@@ -223,7 +223,7 @@ Tools enable LLMs to interact with external systems. This library uses TypeBox s
### Defining Tools
```typescript
-import { Type, Tool, StringEnum } from "@mariozechner/pi-ai";
+import { Type, Tool, StringEnum } from "@mariozechner/companion-ai";
// Define tool parameters with TypeBox
const weatherTool: Tool = {
@@ -356,7 +356,7 @@ When using `agentLoop`, tool arguments are automatically validated against your
When implementing your own tool execution loop with `stream()` or `complete()`, use `validateToolCall` to validate arguments before passing them to your tools:
```typescript
-import { stream, validateToolCall, Tool } from "@mariozechner/pi-ai";
+import { stream, validateToolCall, Tool } from "@mariozechner/companion-ai";
const tools: Tool[] = [weatherTool, calculatorTool];
const s = stream(model, { messages, tools });
@@ -410,7 +410,7 @@ Models with vision capabilities can process images. You can check if a model sup
```typescript
import { readFileSync } from "fs";
-import { getModel, complete } from "@mariozechner/pi-ai";
+import { getModel, complete } from "@mariozechner/companion-ai";
const model = getModel("openai", "gpt-4o-mini");
@@ -449,7 +449,7 @@ Many models support thinking/reasoning capabilities where they can show their in
### Unified Interface (streamSimple/completeSimple)
```typescript
-import { getModel, streamSimple, completeSimple } from "@mariozechner/pi-ai";
+import { getModel, streamSimple, completeSimple } from "@mariozechner/companion-ai";
// Many models across providers support thinking/reasoning
const model = getModel("anthropic", "claude-sonnet-4-20250514");
@@ -491,7 +491,7 @@ for (const block of response.content) {
For fine-grained control, use the provider-specific options:
```typescript
-import { getModel, complete } from "@mariozechner/pi-ai";
+import { getModel, complete } from "@mariozechner/companion-ai";
// OpenAI Reasoning (o1, o3, gpt-5)
const openaiModel = getModel("openai", "gpt-5-mini");
@@ -578,7 +578,7 @@ if (message.stopReason === "error" || message.stopReason === "aborted") {
The abort signal allows you to cancel in-progress requests. Aborted requests have `stopReason === 'aborted'`:
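As background (not specific to this library), a dependency-free sketch of the standard `AbortController`/`AbortSignal` semantics that the `signal` option builds on:

```typescript
// Background sketch: AbortController/AbortSignal are standard web/Node APIs,
// not part of this library. The signal fires synchronously on abort().
const controller = new AbortController();
const { signal } = controller;

signal.addEventListener("abort", () => {
  // Runs as soon as abort() is called.
  console.log("aborted, reason:", signal.reason);
});

controller.abort("user cancelled");
console.log(signal.aborted); // true
```

Passing this `signal` to an in-flight operation lets the consumer observe the abort and stop work early.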
```typescript
-import { getModel, stream } from "@mariozechner/pi-ai";
+import { getModel, stream } from "@mariozechner/companion-ai";
const model = getModel("openai", "gpt-4o-mini");
const controller = new AbortController();
@@ -682,7 +682,7 @@ A **provider** offers models through a specific API. For example:
### Querying Providers and Models
```typescript
-import { getProviders, getModels, getModel } from "@mariozechner/pi-ai";
+import { getProviders, getModels, getModel } from "@mariozechner/companion-ai";
// Get all available providers
const providers = getProviders();
@@ -708,7 +708,7 @@ console.log(`Using ${model.name} via ${model.api} API`);
You can create custom models for local inference servers or custom endpoints:
```typescript
-import { Model, stream } from "@mariozechner/pi-ai";
+import { Model, stream } from "@mariozechner/companion-ai";
// Example: Ollama using OpenAI-compatible API
const ollamaModel: Model<"openai-completions"> = {
@@ -802,7 +802,7 @@ If `compat` is not set, the library falls back to URL-based detection. If `compa
Models are typed by their API, which keeps the model metadata accurate. Provider-specific option types are enforced when you call the provider functions directly. The generic `stream` and `complete` functions accept `StreamOptions` with additional provider fields.
```typescript
-import { streamAnthropic, type AnthropicOptions } from "@mariozechner/pi-ai";
+import { streamAnthropic, type AnthropicOptions } from "@mariozechner/companion-ai";
// TypeScript knows this is an Anthropic model
const claude = getModel("anthropic", "claude-sonnet-4-20250514");
@@ -831,7 +831,7 @@ When messages from one provider are sent to a different provider, the library au
### Example: Multi-Provider Conversation
```typescript
-import { getModel, complete, Context } from "@mariozechner/pi-ai";
+import { getModel, complete, Context } from "@mariozechner/companion-ai";
// Start with Claude
const claude = getModel("anthropic", "claude-sonnet-4-20250514");
@@ -884,7 +884,7 @@ This enables flexible workflows where you can:
The `Context` object can be easily serialized and deserialized using standard JSON methods, making it simple to persist conversations, implement chat history, or transfer contexts between services:
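As a library-free illustration of the same idea, any plain-data context survives a JSON round trip. The field names below are simplified assumptions for the sketch, not the exact `Context` shape:

```typescript
// Library-free sketch: a context made of plain data survives
// JSON.stringify/JSON.parse unchanged. Field names are simplified
// assumptions, not the exact Context shape.
const context = {
  systemPrompt: "You are a helpful assistant.",
  messages: [
    { role: "user", content: "What is 2+2?" },
    { role: "assistant", content: "4" },
  ],
};

const serialized = JSON.stringify(context);
const restored = JSON.parse(serialized) as typeof context;

console.log(restored.messages.length); // 2
```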
```typescript
-import { Context, getModel, complete } from "@mariozechner/pi-ai";
+import { Context, getModel, complete } from "@mariozechner/companion-ai";
// Create and use a context
const context: Context = {
@@ -922,7 +922,7 @@ const continuation = await complete(newModel, restored);
The library supports browser environments. You must pass the API key explicitly since environment variables are not available in browsers:
```typescript
-import { getModel, complete } from "@mariozechner/pi-ai";
+import { getModel, complete } from "@mariozechner/companion-ai";
// API key must be passed explicitly in browser
const model = getModel("anthropic", "claude-3-5-haiku-20241022");
@@ -943,7 +943,7 @@ const response = await complete(
### Browser Compatibility Notes
- Amazon Bedrock (`bedrock-converse-stream`) is not supported in browser environments.
-- OAuth login flows are not supported in browser environments. Use the `@mariozechner/pi-ai/oauth` entry point in Node.js.
+- OAuth login flows are not supported in browser environments. Use the `@mariozechner/companion-ai/oauth` entry point in Node.js.
- In browser builds, Bedrock can still appear in model lists. Calls to Bedrock models fail at runtime.
- Use a server-side proxy or backend service if you need Bedrock or OAuth-based auth from a web app.
@@ -985,17 +985,17 @@ const response = await complete(model, context, {
#### Antigravity Version Override
-Set `PI_AI_ANTIGRAVITY_VERSION` to override the Antigravity User-Agent version when Google updates their requirements:
+Set `COMPANION_AI_ANTIGRAVITY_VERSION` to override the Antigravity User-Agent version when Google updates their requirements:
```bash
-export PI_AI_ANTIGRAVITY_VERSION="1.23.0"
+export COMPANION_AI_ANTIGRAVITY_VERSION="1.23.0"
```
#### Cache Retention
-Set `PI_CACHE_RETENTION=long` to extend prompt cache retention:
+Set `COMPANION_CACHE_RETENTION=long` to extend prompt cache retention:
-| Provider | Default | With `PI_CACHE_RETENTION=long` |
+| Provider | Default | With `COMPANION_CACHE_RETENTION=long` |
| --------- | --------- | ------------------------------ |
| Anthropic | 5 minutes | 1 hour |
| OpenAI | in-memory | 24 hours |
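For example, opting into the longer retention for a shell session (the variable name comes from the table above; everything beyond setting the variable is up to the library):

```shell
# Enable extended prompt-cache retention for this shell session
export COMPANION_CACHE_RETENTION=long
printenv COMPANION_CACHE_RETENTION  # prints: long
```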
@@ -1007,7 +1007,7 @@ This only affects direct API calls to `api.anthropic.com` and `api.openai.com`.
### Checking Environment Variables
```typescript
-import { getEnvApiKey } from "@mariozechner/pi-ai";
+import { getEnvApiKey } from "@mariozechner/companion-ai";
// Check if an API key is set in environment variables
const key = getEnvApiKey("openai"); // checks OPENAI_API_KEY
@@ -1047,7 +1047,7 @@ export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```
```typescript
-import { getModel, complete } from "@mariozechner/pi-ai";
+import { getModel, complete } from "@mariozechner/companion-ai";
(async () => {
const model = getModel("google-vertex", "gemini-2.5-flash");
@@ -1068,16 +1068,16 @@ Official docs: [Application Default Credentials](https://cloud.google.com/docs/a
The quickest way to authenticate:
```bash
-npx @mariozechner/pi-ai login # interactive provider selection
-npx @mariozechner/pi-ai login anthropic # login to specific provider
-npx @mariozechner/pi-ai list # list available providers
+npx @mariozechner/companion-ai login # interactive provider selection
+npx @mariozechner/companion-ai login anthropic # login to specific provider
+npx @mariozechner/companion-ai list # list available providers
```
Credentials are saved to `auth.json` in the current directory.
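A sketch of reading that file back afterwards. The `{ [provider]: credentials }` layout assumed here is hypothetical, not a documented schema:

```typescript
// Hypothetical helper: list which providers have stored credentials in an
// auth.json written by the login command. The { [provider]: credentials }
// layout is an assumption, not a documented schema.
import { readFileSync } from "node:fs";

function listStoredProviders(path = "auth.json"): string[] {
  const auth = JSON.parse(readFileSync(path, "utf8")) as Record<string, unknown>;
  return Object.keys(auth);
}
```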
### Programmatic OAuth
-The library provides login and token refresh functions via the `@mariozechner/pi-ai/oauth` entry point. Credential storage is the caller's responsibility.
+The library provides login and token refresh functions via the `@mariozechner/companion-ai/oauth` entry point. Credential storage is the caller's responsibility.
```typescript
import {
@@ -1095,13 +1095,13 @@ import {
// Types
type OAuthProvider, // 'anthropic' | 'openai-codex' | 'github-copilot' | 'google-gemini-cli' | 'google-antigravity'
type OAuthCredentials,
-} from "@mariozechner/pi-ai/oauth";
+} from "@mariozechner/companion-ai/oauth";
```
### Login Flow Example
```typescript
-import { loginGitHubCopilot } from "@mariozechner/pi-ai/oauth";
+import { loginGitHubCopilot } from "@mariozechner/companion-ai/oauth";
import { writeFileSync } from "fs";
const credentials = await loginGitHubCopilot({
@@ -1125,8 +1125,8 @@ writeFileSync("auth.json", JSON.stringify(auth, null, 2));
Use `getOAuthApiKey()` to get an API key, automatically refreshing if expired:
```typescript
-import { getModel, complete } from "@mariozechner/pi-ai";
-import { getOAuthApiKey } from "@mariozechner/pi-ai/oauth";
+import { getModel, complete } from "@mariozechner/companion-ai";
+import { getOAuthApiKey } from "@mariozechner/companion-ai/oauth";
import { readFileSync, writeFileSync } from "fs";
// Load your stored credentials