chore: rebrand companion-os to clanker-agent

- Rename all package names from companion-* to clanker-*
- Update npm scopes from @mariozechner to @harivansh-afk
- Rename config directories .companion -> .clanker
- Rename environment variables COMPANION_* -> CLANKER_*
- Update all documentation, README files, and install scripts
- Rename package directories (companion-channels, companion-grind, companion-teams)
- Update GitHub URLs to harivansh-afk/clanker-agent
- Preserve full git history from companion-cloud monorepo
Harivansh Rathi 2026-03-26 16:22:52 -04:00
parent f93fe7d1a0
commit 67168d8289
356 changed files with 2249 additions and 10223 deletions


@ -1,4 +1,4 @@
# @mariozechner/companion-ai
# @mariozechner/clanker-ai
Unified LLM API with automatic model discovery, provider configuration, token and cost tracking, and simple context persistence and hand-off to other models mid-session.
@ -72,10 +72,10 @@ Unified LLM API with automatic model discovery, provider configuration, token an
## Installation
```bash
npm install @mariozechner/companion-ai
npm install @mariozechner/clanker-ai
```
TypeBox exports are re-exported from `@mariozechner/companion-ai`: `Type`, `Static`, and `TSchema`.
TypeBox exports are re-exported from `@mariozechner/clanker-ai`: `Type`, `Static`, and `TSchema`.
## Quick Start
@ -88,7 +88,7 @@ import {
Context,
Tool,
StringEnum,
} from "@mariozechner/companion-ai";
} from "@mariozechner/clanker-ai";
// Fully typed with auto-complete support for both providers and models
const model = getModel("openai", "gpt-4o-mini");
@ -223,7 +223,7 @@ Tools enable LLMs to interact with external systems. This library uses TypeBox s
### Defining Tools
```typescript
import { Type, Tool, StringEnum } from "@mariozechner/companion-ai";
import { Type, Tool, StringEnum } from "@mariozechner/clanker-ai";
// Define tool parameters with TypeBox
const weatherTool: Tool = {
@ -356,7 +356,7 @@ When using `agentLoop`, tool arguments are automatically validated against your
When implementing your own tool execution loop with `stream()` or `complete()`, use `validateToolCall` to validate arguments before passing them to your tools:
```typescript
import { stream, validateToolCall, Tool } from "@mariozechner/companion-ai";
import { stream, validateToolCall, Tool } from "@mariozechner/clanker-ai";
const tools: Tool[] = [weatherTool, calculatorTool];
const s = stream(model, { messages, tools });
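// The validation loop itself is truncated in this diff. The gist of pre-validation
// can be shown standalone; this is a sketch of the idea, NOT the library's
// validateToolCall (its real signature and event shapes are not visible here):

```typescript
// Standalone sketch: check required params and basic types before executing a tool.
type ParamSpec = { type: "string" | "number"; required?: boolean };
type ToolSketch = { name: string; params: Record<string, ParamSpec> };

function validateArgs(tool: ToolSketch, args: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const [key, spec] of Object.entries(tool.params)) {
    const value = args[key];
    if (value === undefined) {
      if (spec.required) errors.push(`missing required param: ${key}`);
      continue;
    }
    if (typeof value !== spec.type) {
      errors.push(`param ${key}: expected ${spec.type}, got ${typeof value}`);
    }
  }
  return errors;
}

const weatherSketch: ToolSketch = {
  name: "get_weather",
  params: { location: { type: "string", required: true } },
};
```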
@ -410,7 +410,7 @@ Models with vision capabilities can process images. You can check if a model sup
```typescript
import { readFileSync } from "fs";
import { getModel, complete } from "@mariozechner/companion-ai";
import { getModel, complete } from "@mariozechner/clanker-ai";
const model = getModel("openai", "gpt-4o-mini");
@ -449,7 +449,7 @@ Many models support thinking/reasoning capabilities where they can show their in
### Unified Interface (streamSimple/completeSimple)
```typescript
import { getModel, streamSimple, completeSimple } from "@mariozechner/companion-ai";
import { getModel, streamSimple, completeSimple } from "@mariozechner/clanker-ai";
// Many models across providers support thinking/reasoning
const model = getModel("anthropic", "claude-sonnet-4-20250514");
@ -491,7 +491,7 @@ for (const block of response.content) {
For fine-grained control, use the provider-specific options:
```typescript
import { getModel, complete } from "@mariozechner/companion-ai";
import { getModel, complete } from "@mariozechner/clanker-ai";
// OpenAI Reasoning (o1, o3, gpt-5)
const openaiModel = getModel("openai", "gpt-5-mini");
@ -578,7 +578,7 @@ if (message.stopReason === "error" || message.stopReason === "aborted") {
The abort signal allows you to cancel in-progress requests. Aborted requests have `stopReason === 'aborted'`:
```typescript
import { getModel, stream } from "@mariozechner/companion-ai";
import { getModel, stream } from "@mariozechner/clanker-ai";
const model = getModel("openai", "gpt-4o-mini");
const controller = new AbortController();
@ -682,7 +682,7 @@ A **provider** offers models through a specific API. For example:
### Querying Providers and Models
```typescript
import { getProviders, getModels, getModel } from "@mariozechner/companion-ai";
import { getProviders, getModels, getModel } from "@mariozechner/clanker-ai";
// Get all available providers
const providers = getProviders();
@ -708,7 +708,7 @@ console.log(`Using ${model.name} via ${model.api} API`);
You can create custom models for local inference servers or custom endpoints:
```typescript
import { Model, stream } from "@mariozechner/companion-ai";
import { Model, stream } from "@mariozechner/clanker-ai";
// Example: Ollama using OpenAI-compatible API
const ollamaModel: Model<"openai-completions"> = {
@ -802,7 +802,7 @@ If `compat` is not set, the library falls back to URL-based detection. If `compa
Models are typed by their API, which keeps the model metadata accurate. Provider-specific option types are enforced when you call the provider functions directly. The generic `stream` and `complete` functions accept `StreamOptions` with additional provider fields.
```typescript
import { streamAnthropic, type AnthropicOptions } from "@mariozechner/companion-ai";
import { streamAnthropic, type AnthropicOptions } from "@mariozechner/clanker-ai";
// TypeScript knows this is an Anthropic model
const claude = getModel("anthropic", "claude-sonnet-4-20250514");
@ -831,7 +831,7 @@ When messages from one provider are sent to a different provider, the library au
### Example: Multi-Provider Conversation
```typescript
import { getModel, complete, Context } from "@mariozechner/companion-ai";
import { getModel, complete, Context } from "@mariozechner/clanker-ai";
// Start with Claude
const claude = getModel("anthropic", "claude-sonnet-4-20250514");
@ -884,7 +884,7 @@ This enables flexible workflows where you can:
The `Context` object can be easily serialized and deserialized using standard JSON methods, making it simple to persist conversations, implement chat history, or transfer contexts between services:
```typescript
import { Context, getModel, complete } from "@mariozechner/companion-ai";
import { Context, getModel, complete } from "@mariozechner/clanker-ai";
// Create and use a context
const context: Context = {
@ -922,7 +922,7 @@ const continuation = await complete(newModel, restored);
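The persistence claim above can be demonstrated with plain JSON methods. A standalone sketch, using a deliberately simplified context shape (the real `Context` type carries more fields than this):

```typescript
// Simplified stand-in for the library's Context; plain data round-trips losslessly.
type MiniContext = {
  systemPrompt?: string;
  messages: Array<{ role: "user" | "assistant"; content: string }>;
};

const ctx: MiniContext = {
  systemPrompt: "You are helpful.",
  messages: [{ role: "user", content: "Hello" }],
};

// Persist and restore with standard JSON methods.
const saved = JSON.stringify(ctx);
const restored: MiniContext = JSON.parse(saved);
```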
The library supports browser environments. You must pass the API key explicitly since environment variables are not available in browsers:
```typescript
import { getModel, complete } from "@mariozechner/companion-ai";
import { getModel, complete } from "@mariozechner/clanker-ai";
// API key must be passed explicitly in browser
const model = getModel("anthropic", "claude-3-5-haiku-20241022");
@ -943,7 +943,7 @@ const response = await complete(
### Browser Compatibility Notes
- Amazon Bedrock (`bedrock-converse-stream`) is not supported in browser environments.
- OAuth login flows are not supported in browser environments. Use the `@mariozechner/companion-ai/oauth` entry point in Node.js.
- OAuth login flows are not supported in browser environments. Use the `@mariozechner/clanker-ai/oauth` entry point in Node.js.
- In browser builds, Bedrock can still appear in model lists. Calls to Bedrock models fail at runtime.
- Use a server-side proxy or backend service if you need Bedrock or OAuth-based auth from a web app.
@ -985,17 +985,17 @@ const response = await complete(model, context, {
#### Antigravity Version Override
Set `COMPANION_AI_ANTIGRAVITY_VERSION` to override the Antigravity User-Agent version when Google updates their requirements:
Set `CLANKER_AI_ANTIGRAVITY_VERSION` to override the Antigravity User-Agent version when Google updates their requirements:
```bash
export COMPANION_AI_ANTIGRAVITY_VERSION="1.23.0"
export CLANKER_AI_ANTIGRAVITY_VERSION="1.23.0"
```
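The resolution behind this override, as it appears in the provider source later in this commit, boils down to "environment variable wins, otherwise a pinned default":

```typescript
// Mirrors the version resolution inside getAntigravityHeaders in this commit.
const DEFAULT_ANTIGRAVITY_VERSION = "1.18.3";

function resolveAntigravityVersion(): string {
  return process.env.CLANKER_AI_ANTIGRAVITY_VERSION || DEFAULT_ANTIGRAVITY_VERSION;
}
```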
#### Cache Retention
Set `COMPANION_CACHE_RETENTION=long` to extend prompt cache retention:
Set `CLANKER_CACHE_RETENTION=long` to extend prompt cache retention:
| Provider | Default | With `COMPANION_CACHE_RETENTION=long` |
| Provider | Default | With `CLANKER_CACHE_RETENTION=long` |
| --------- | --------- | ------------------------------ |
| Anthropic | 5 minutes | 1 hour |
| OpenAI | in-memory | 24 hours |
@ -1007,7 +1007,7 @@ This only affects direct API calls to `api.anthropic.com` and `api.openai.com`.
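The flag is resolved with a simple precedence rule, matching the `resolveCacheRetention` helper that appears in the provider sources elsewhere in this commit: an explicit option wins, then `CLANKER_CACHE_RETENTION=long`, else `"short"`:

```typescript
type CacheRetention = "short" | "long";

// Same precedence as the resolveCacheRetention helper in the provider sources.
function resolveCacheRetention(cacheRetention?: CacheRetention): CacheRetention {
  if (cacheRetention) {
    return cacheRetention;
  }
  if (
    typeof process !== "undefined" &&
    process.env.CLANKER_CACHE_RETENTION === "long"
  ) {
    return "long";
  }
  return "short";
}
```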
### Checking Environment Variables
```typescript
import { getEnvApiKey } from "@mariozechner/companion-ai";
import { getEnvApiKey } from "@mariozechner/clanker-ai";
// Check if an API key is set in environment variables
const key = getEnvApiKey("openai"); // checks OPENAI_API_KEY
@ -1047,7 +1047,7 @@ export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```
```typescript
import { getModel, complete } from "@mariozechner/companion-ai";
import { getModel, complete } from "@mariozechner/clanker-ai";
(async () => {
const model = getModel("google-vertex", "gemini-2.5-flash");
@ -1068,16 +1068,16 @@ Official docs: [Application Default Credentials](https://cloud.google.com/docs/a
The quickest way to authenticate:
```bash
npx @mariozechner/companion-ai login # interactive provider selection
npx @mariozechner/companion-ai login anthropic # login to specific provider
npx @mariozechner/companion-ai list # list available providers
npx @mariozechner/clanker-ai login # interactive provider selection
npx @mariozechner/clanker-ai login anthropic # login to specific provider
npx @mariozechner/clanker-ai list # list available providers
```
Credentials are saved to `auth.json` in the current directory.
### Programmatic OAuth
The library provides login and token refresh functions via the `@mariozechner/companion-ai/oauth` entry point. Credential storage is the caller's responsibility.
The library provides login and token refresh functions via the `@mariozechner/clanker-ai/oauth` entry point. Credential storage is the caller's responsibility.
```typescript
import {
@ -1095,13 +1095,13 @@ import {
// Types
type OAuthProvider, // 'anthropic' | 'openai-codex' | 'github-copilot' | 'google-gemini-cli' | 'google-antigravity'
type OAuthCredentials,
} from "@mariozechner/companion-ai/oauth";
} from "@mariozechner/clanker-ai/oauth";
```
### Login Flow Example
```typescript
import { loginGitHubCopilot } from "@mariozechner/companion-ai/oauth";
import { loginGitHubCopilot } from "@mariozechner/clanker-ai/oauth";
import { writeFileSync } from "fs";
const credentials = await loginGitHubCopilot({
@ -1125,8 +1125,8 @@ writeFileSync("auth.json", JSON.stringify(auth, null, 2));
Use `getOAuthApiKey()` to get an API key, automatically refreshing if expired:
```typescript
import { getModel, complete } from "@mariozechner/companion-ai";
import { getOAuthApiKey } from "@mariozechner/companion-ai/oauth";
import { getModel, complete } from "@mariozechner/clanker-ai";
import { getOAuthApiKey } from "@mariozechner/clanker-ai/oauth";
import { readFileSync, writeFileSync } from "fs";
// Load your stored credentials


@ -1,5 +1,5 @@
{
"name": "@mariozechner/companion-ai",
"name": "@harivansh-afk/clanker-ai",
"version": "0.56.2",
"description": "Unified LLM API with automatic model discovery and provider configuration",
"type": "module",
@ -20,7 +20,7 @@
}
},
"bin": {
"companion-ai": "./dist/cli.js"
"clanker-ai": "./dist/cli.js"
},
"files": [
"dist",
@ -66,7 +66,7 @@
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/getcompanion-ai/co-mono.git",
"url": "git+https://github.com/harivansh-afk/clanker-agent.git",
"directory": "packages/ai"
},
"engines": {


@ -78,7 +78,7 @@ async function main(): Promise<void> {
const providerList = PROVIDERS.map(
(p) => ` ${p.id.padEnd(20)} ${p.name}`,
).join("\n");
console.log(`Usage: npx @mariozechner/companion-ai <command> [provider]
console.log(`Usage: npx @mariozechner/clanker-ai <command> [provider]
Commands:
login [provider] Login to an OAuth provider
@ -88,9 +88,9 @@ Providers:
${providerList}
Examples:
npx @mariozechner/companion-ai login # interactive provider selection
npx @mariozechner/companion-ai login anthropic # login to specific provider
npx @mariozechner/companion-ai list # list providers
npx @mariozechner/clanker-ai login # interactive provider selection
npx @mariozechner/clanker-ai login anthropic # login to specific provider
npx @mariozechner/clanker-ai list # list providers
`);
return;
}
@ -131,7 +131,7 @@ Examples:
if (!PROVIDERS.some((p) => p.id === provider)) {
console.error(`Unknown provider: ${provider}`);
console.error(
`Use 'npx @mariozechner/companion-ai list' to see available providers`,
`Use 'npx @mariozechner/clanker-ai list' to see available providers`,
);
process.exit(1);
}
@ -142,7 +142,7 @@ Examples:
}
console.error(`Unknown command: ${command}`);
console.error(`Use 'npx @mariozechner/companion-ai --help' for usage`);
console.error(`Use 'npx @mariozechner/clanker-ai --help' for usage`);
process.exit(1);
}


@ -514,7 +514,7 @@ function mapThinkingLevelToEffort(
/**
* Resolve cache retention preference.
* Defaults to "short" and uses COMPANION_CACHE_RETENTION for backward compatibility.
* Defaults to "short" and uses CLANKER_CACHE_RETENTION for backward compatibility.
*/
function resolveCacheRetention(
cacheRetention?: CacheRetention,
@ -524,7 +524,7 @@ function resolveCacheRetention(
}
if (
typeof process !== "undefined" &&
process.env.COMPANION_CACHE_RETENTION === "long"
process.env.CLANKER_CACHE_RETENTION === "long"
) {
return "long";
}


@ -40,7 +40,7 @@ import { transformMessages } from "./transform-messages.js";
/**
* Resolve cache retention preference.
* Defaults to "short" and uses COMPANION_CACHE_RETENTION for backward compatibility.
* Defaults to "short" and uses CLANKER_CACHE_RETENTION for backward compatibility.
*/
function resolveCacheRetention(
cacheRetention?: CacheRetention,
@ -50,7 +50,7 @@ function resolveCacheRetention(
}
if (
typeof process !== "undefined" &&
process.env.COMPANION_CACHE_RETENTION === "long"
process.env.CLANKER_CACHE_RETENTION === "long"
) {
return "long";
}


@ -88,7 +88,7 @@ const DEFAULT_ANTIGRAVITY_VERSION = "1.18.3";
function getAntigravityHeaders() {
const version =
process.env.COMPANION_AI_ANTIGRAVITY_VERSION || DEFAULT_ANTIGRAVITY_VERSION;
process.env.CLANKER_AI_ANTIGRAVITY_VERSION || DEFAULT_ANTIGRAVITY_VERSION;
return {
"User-Agent": `antigravity/${version} darwin/arm64`,
};
@ -1040,8 +1040,8 @@ export function buildRequest(
model: model.id,
request,
...(isAntigravity ? { requestType: "agent" } : {}),
userAgent: isAntigravity ? "antigravity" : "companion-coding-agent",
requestId: `${isAntigravity ? "agent" : "companion"}-${Date.now()}-${Math.random().toString(36).slice(2, 11)}`,
userAgent: isAntigravity ? "antigravity" : "clanker-coding-agent",
requestId: `${isAntigravity ? "agent" : "clanker"}-${Date.now()}-${Math.random().toString(36).slice(2, 11)}`,
};
}
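The `requestId` construction above can be isolated into a small helper; a sketch that mirrors the template literal in `buildRequest` (the function name here is local, not something the library exports):

```typescript
// Same shape as the requestId in buildRequest above:
// "<agent|clanker>-<epoch millis>-<up to 9 base36 chars>".
function makeRequestId(isAntigravity: boolean): string {
  const prefix = isAntigravity ? "agent" : "clanker";
  return `${prefix}-${Date.now()}-${Math.random().toString(36).slice(2, 11)}`;
}
```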


@ -997,10 +997,10 @@ function buildHeaders(
headers.set("Authorization", `Bearer ${token}`);
headers.set("chatgpt-account-id", accountId);
headers.set("OpenAI-Beta", "responses=experimental");
headers.set("originator", "companion");
headers.set("originator", "clanker");
const userAgent = _os
? `companion (${_os.platform()} ${_os.release()}; ${_os.arch()})`
: "companion (browser)";
? `clanker (${_os.platform()} ${_os.release()}; ${_os.arch()})`
: "clanker (browser)";
headers.set("User-Agent", userAgent);
headers.set("accept", "text/event-stream");
headers.set("content-type", "application/json");


@ -33,7 +33,7 @@ const OPENAI_TOOL_CALL_PROVIDERS = new Set([
/**
* Resolve cache retention preference.
* Defaults to "short" and uses COMPANION_CACHE_RETENTION for backward compatibility.
* Defaults to "short" and uses CLANKER_CACHE_RETENTION for backward compatibility.
*/
function resolveCacheRetention(
cacheRetention?: CacheRetention,
@ -43,7 +43,7 @@ function resolveCacheRetention(
}
if (
typeof process !== "undefined" &&
process.env.COMPANION_CACHE_RETENTION === "long"
process.env.CLANKER_CACHE_RETENTION === "long"
) {
return "long";
}


@ -283,7 +283,7 @@ export interface OpenAICompletionsCompat {
supportsDeveloperRole?: boolean;
/** Whether the provider supports `reasoning_effort`. Default: auto-detected from URL. */
supportsReasoningEffort?: boolean;
/** Optional mapping from companion-ai reasoning levels to provider/model-specific `reasoning_effort` values. */
/** Optional mapping from clanker-ai reasoning levels to provider/model-specific `reasoning_effort` values. */
reasoningEffortMap?: Partial<Record<ThinkingLevel, string>>;
/** Whether the provider supports `stream_options: { include_usage: true }` for token usage in streaming responses. Default: true. */
supportsUsageInStreaming?: boolean;


@ -216,7 +216,7 @@ async function refreshAccessToken(refreshToken: string): Promise<TokenResult> {
}
async function createAuthorizationFlow(
originator: string = "companion",
originator: string = "clanker",
): Promise<{ verifier: string; state: string; url: string }> {
const { verifier, challenge } = await generatePKCE();
const state = createState();
@ -337,7 +337,7 @@ function getAccountId(accessToken: string): string | null {
* @param options.onManualCodeInput - Optional promise that resolves with user-pasted code.
* Races with browser callback - whichever completes first wins.
* Useful for showing paste input immediately alongside browser flow.
* @param options.originator - OAuth originator parameter (defaults to "companion")
* @param options.originator - OAuth originator parameter (defaults to "clanker")
*/
export async function loginOpenAICodex(options: {
onAuth: (info: { url: string; instructions?: string }) => void;


@ -71,8 +71,8 @@ describe.skipIf(!oauthToken)("Anthropic OAuth tool name normalization", () => {
expect(toolCallName).toBe("todowrite");
});
it("should handle companion's built-in tools (read, write, edit, bash)", async () => {
// Companion's tools use lowercase names, CC uses PascalCase
it("should handle clanker's built-in tools (read, write, edit, bash)", async () => {
// Clanker's tools use lowercase names, CC uses PascalCase
const readTool: Tool = {
name: "read",
description: "Read a file",
@ -116,7 +116,7 @@ describe.skipIf(!oauthToken)("Anthropic OAuth tool name normalization", () => {
});
it("should NOT map find to Glob - find is not a CC tool name", async () => {
// Companion has a "find" tool, CC has "Glob" - these are DIFFERENT tools
// Clanker has a "find" tool, CC has "Glob" - these are DIFFERENT tools
// The old code incorrectly mapped find -> Glob, which broke the round-trip
// because there's no tool named "glob" in context.tools
const findTool: Tool = {


@ -3,18 +3,18 @@ import { getModel } from "../src/models.js";
import { stream } from "../src/stream.js";
import type { Context } from "../src/types.js";
describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
const originalEnv = process.env.COMPANION_CACHE_RETENTION;
describe("Cache Retention (CLANKER_CACHE_RETENTION)", () => {
const originalEnv = process.env.CLANKER_CACHE_RETENTION;
beforeEach(() => {
delete process.env.COMPANION_CACHE_RETENTION;
delete process.env.CLANKER_CACHE_RETENTION;
});
afterEach(() => {
if (originalEnv !== undefined) {
process.env.COMPANION_CACHE_RETENTION = originalEnv;
process.env.CLANKER_CACHE_RETENTION = originalEnv;
} else {
delete process.env.COMPANION_CACHE_RETENTION;
delete process.env.CLANKER_CACHE_RETENTION;
}
});
@ -25,7 +25,7 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
describe("Anthropic Provider", () => {
it.skipIf(!process.env.ANTHROPIC_API_KEY)(
"should use default cache TTL (no ttl field) when COMPANION_CACHE_RETENTION is not set",
"should use default cache TTL (no ttl field) when CLANKER_CACHE_RETENTION is not set",
async () => {
const model = getModel("anthropic", "claude-3-5-haiku-20241022");
let capturedPayload: any = null;
@ -51,9 +51,9 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
);
it.skipIf(!process.env.ANTHROPIC_API_KEY)(
"should use 1h cache TTL when COMPANION_CACHE_RETENTION=long",
"should use 1h cache TTL when CLANKER_CACHE_RETENTION=long",
async () => {
process.env.COMPANION_CACHE_RETENTION = "long";
process.env.CLANKER_CACHE_RETENTION = "long";
const model = getModel("anthropic", "claude-3-5-haiku-20241022");
let capturedPayload: any = null;
@ -79,7 +79,7 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
);
it("should not add ttl when baseUrl is not api.anthropic.com", async () => {
process.env.COMPANION_CACHE_RETENTION = "long";
process.env.CLANKER_CACHE_RETENTION = "long";
// Create a model with a different baseUrl (simulating a proxy)
const baseModel = getModel("anthropic", "claude-3-5-haiku-20241022");
@ -210,7 +210,7 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
describe("OpenAI Responses Provider", () => {
it.skipIf(!process.env.OPENAI_API_KEY)(
"should not set prompt_cache_retention when COMPANION_CACHE_RETENTION is not set",
"should not set prompt_cache_retention when CLANKER_CACHE_RETENTION is not set",
async () => {
const model = getModel("openai", "gpt-4o-mini");
let capturedPayload: any = null;
@ -232,9 +232,9 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
);
it.skipIf(!process.env.OPENAI_API_KEY)(
"should set prompt_cache_retention to 24h when COMPANION_CACHE_RETENTION=long",
"should set prompt_cache_retention to 24h when CLANKER_CACHE_RETENTION=long",
async () => {
process.env.COMPANION_CACHE_RETENTION = "long";
process.env.CLANKER_CACHE_RETENTION = "long";
const model = getModel("openai", "gpt-4o-mini");
let capturedPayload: any = null;
@ -255,7 +255,7 @@ describe("Cache Retention (COMPANION_CACHE_RETENTION)", () => {
);
it("should not set prompt_cache_retention when baseUrl is not api.openai.com", async () => {
process.env.COMPANION_CACHE_RETENTION = "long";
process.env.CLANKER_CACHE_RETENTION = "long";
// Create a model with a different baseUrl (simulating a proxy)
const baseModel = getModel("openai", "gpt-4o-mini");


@ -683,7 +683,7 @@ describe("Context overflow error handling", () => {
// Check if ollama is installed and local LLM tests are enabled
let ollamaInstalled = false;
if (!process.env.COMPANION_NO_LOCAL_LLM) {
if (!process.env.CLANKER_NO_LOCAL_LLM) {
try {
execSync("which ollama", { stdio: "ignore" });
ollamaInstalled = true;
@ -785,7 +785,7 @@ describe("Context overflow error handling", () => {
// =============================================================================
let lmStudioRunning = false;
if (!process.env.COMPANION_NO_LOCAL_LLM) {
if (!process.env.CLANKER_NO_LOCAL_LLM) {
try {
execSync(
"curl -s --max-time 1 http://localhost:1234/v1/models > /dev/null",


@ -227,7 +227,7 @@ function dumpFailurePayload(params: {
payload?: unknown;
messages: Message[];
}): void {
const filename = `/tmp/companion-handoff-${params.label}-${Date.now()}.json`;
const filename = `/tmp/clanker-handoff-${params.label}-${Date.now()}.json`;
const body = {
label: params.label,
error: params.error,


@ -765,7 +765,7 @@ describe("AI Providers Empty Message Tests", () => {
);
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// =========================================================================
describe("Anthropic OAuth Provider Empty Messages", () => {


@ -476,7 +476,7 @@ describe("Tool Results with Images", () => {
);
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// =========================================================================
describe("Anthropic OAuth Provider (claude-sonnet-4-5)", () => {
@ -584,7 +584,7 @@ describe("Tool Results with Images", () => {
},
);
/** These two don't work, the model simply won't call the tool, works in companion
/** These two don't work, the model simply won't call the tool, works in clanker
it.skipIf(!antigravityToken)(
"claude-sonnet-4-5 - should handle tool result with only image",
{ retry: 3, timeout: 30000 },


@ -1,5 +1,5 @@
/**
* Test helper for resolving API keys from ~/.companion/agent/auth.json
* Test helper for resolving API keys from ~/.clanker/agent/auth.json
*
* Supports both API key and OAuth credentials.
* OAuth tokens are automatically refreshed if expired and saved back to auth.json.
@ -20,7 +20,7 @@ import type {
OAuthProvider,
} from "../src/utils/oauth/types.js";
const AUTH_PATH = join(homedir(), ".companion", "agent", "auth.json");
const AUTH_PATH = join(homedir(), ".clanker", "agent", "auth.json");
type ApiKeyCredential = {
type: "api_key";
@ -57,7 +57,7 @@ function saveAuthStorage(storage: AuthStorage): void {
}
/**
* Resolve API key for a provider from ~/.companion/agent/auth.json
* Resolve API key for a provider from ~/.clanker/agent/auth.json
*
* For API key credentials, returns the key directly.
* For OAuth credentials, returns the access token (refreshing if expired and saving back).


@ -6,22 +6,22 @@ import { streamOpenAICodexResponses } from "../src/providers/openai-codex-respon
import type { Context, Model } from "../src/types.js";
const originalFetch = global.fetch;
const originalAgentDir = process.env.COMPANION_CODING_AGENT_DIR;
const originalAgentDir = process.env.CLANKER_CODING_AGENT_DIR;
afterEach(() => {
global.fetch = originalFetch;
if (originalAgentDir === undefined) {
delete process.env.COMPANION_CODING_AGENT_DIR;
delete process.env.CLANKER_CODING_AGENT_DIR;
} else {
process.env.COMPANION_CODING_AGENT_DIR = originalAgentDir;
process.env.CLANKER_CODING_AGENT_DIR = originalAgentDir;
}
vi.restoreAllMocks();
});
describe("openai-codex streaming", () => {
it("streams SSE responses into AssistantMessageEventStream", async () => {
const tempDir = mkdtempSync(join(tmpdir(), "companion-codex-stream-"));
process.env.COMPANION_CODING_AGENT_DIR = tempDir;
const tempDir = mkdtempSync(join(tmpdir(), "clanker-codex-stream-"));
process.env.CLANKER_CODING_AGENT_DIR = tempDir;
const payload = Buffer.from(
JSON.stringify({
@ -95,7 +95,7 @@ describe("openai-codex streaming", () => {
expect(headers?.get("Authorization")).toBe(`Bearer ${token}`);
expect(headers?.get("chatgpt-account-id")).toBe("acc_test");
expect(headers?.get("OpenAI-Beta")).toBe("responses=experimental");
expect(headers?.get("originator")).toBe("companion");
expect(headers?.get("originator")).toBe("clanker");
expect(headers?.get("accept")).toBe("text/event-stream");
expect(headers?.has("x-api-key")).toBe(false);
return new Response(stream, {
@ -149,8 +149,8 @@ describe("openai-codex streaming", () => {
});
it("sets conversation_id/session_id headers and prompt_cache_key when sessionId is provided", async () => {
const tempDir = mkdtempSync(join(tmpdir(), "companion-codex-stream-"));
process.env.COMPANION_CODING_AGENT_DIR = tempDir;
const tempDir = mkdtempSync(join(tmpdir(), "clanker-codex-stream-"));
process.env.CLANKER_CODING_AGENT_DIR = tempDir;
const payload = Buffer.from(
JSON.stringify({
@ -272,8 +272,8 @@ describe("openai-codex streaming", () => {
it.each(["gpt-5.3-codex", "gpt-5.4"])(
"clamps %s minimal reasoning effort to low",
async (modelId) => {
const tempDir = mkdtempSync(join(tmpdir(), "companion-codex-stream-"));
process.env.COMPANION_CODING_AGENT_DIR = tempDir;
const tempDir = mkdtempSync(join(tmpdir(), "clanker-codex-stream-"));
process.env.CLANKER_CODING_AGENT_DIR = tempDir;
const payload = Buffer.from(
JSON.stringify({
@ -393,8 +393,8 @@ describe("openai-codex streaming", () => {
);
it("does not set conversation_id/session_id headers when sessionId is not provided", async () => {
const tempDir = mkdtempSync(join(tmpdir(), "companion-codex-stream-"));
process.env.COMPANION_CODING_AGENT_DIR = tempDir;
const tempDir = mkdtempSync(join(tmpdir(), "clanker-codex-stream-"));
process.env.CLANKER_CODING_AGENT_DIR = tempDir;
const payload = Buffer.from(
JSON.stringify({


@ -1048,7 +1048,7 @@ describe("Generate E2E Tests", () => {
);
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// Tokens are resolved at module level (see oauthTokens above)
// =========================================================================
@ -1800,7 +1800,7 @@ describe("Generate E2E Tests", () => {
// Check if ollama is installed and local LLM tests are enabled
let ollamaInstalled = false;
if (!process.env.COMPANION_NO_LOCAL_LLM) {
if (!process.env.CLANKER_NO_LOCAL_LLM) {
try {
execSync("which ollama", { stdio: "ignore" });
ollamaInstalled = true;


@ -294,7 +294,7 @@ describe("Token Statistics on Abort", () => {
);
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// =========================================================================
describe("Anthropic OAuth Provider", () => {


@ -7,7 +7,7 @@
* OpenAI Responses API generates IDs in format: {call_id}|{id}
* where {id} can be 400+ chars with special characters (+, /, =).
*
* Regression test for: https://github.com/badlogic/companion-mono/issues/1022
* Regression test for: https://github.com/badlogic/clanker-mono/issues/1022
*/
import { Type } from "@sinclair/typebox";
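Given the `{call_id}|{id}` format described above, where the `{id}` part can be 400+ chars and contain `+`, `/`, and `=`, splitting must happen on the first pipe only. A standalone sketch of that parsing (the helper name is illustrative, not the library's):

```typescript
// Split "{call_id}|{id}" on the FIRST "|" only, since the id part
// may itself contain special characters, including further pipes.
function splitCompositeId(composite: string): { callId: string; id: string } {
  const i = composite.indexOf("|");
  return i === -1
    ? { callId: composite, id: "" }
    : { callId: composite.slice(0, i), id: composite.slice(i + 1) };
}
```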


@ -324,7 +324,7 @@ describe("Tool Call Without Result Tests", () => {
});
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// =========================================================================
describe("Anthropic OAuth Provider", () => {


@ -472,7 +472,7 @@ describe("AI Providers Unicode Surrogate Pair Tests", () => {
);
// =========================================================================
// OAuth-based providers (credentials from ~/.companion/agent/oauth.json)
// OAuth-based providers (credentials from ~/.clanker/agent/oauth.json)
// =========================================================================
describe("Anthropic OAuth Provider Unicode Handling", () => {