
Providers

Pi supports subscription-based providers via OAuth, and API-key providers via environment variables or an auth file.

Subscriptions

Use /login in interactive mode, then select a provider:

  • Claude Pro/Max
  • ChatGPT Plus/Pro (Codex)
  • GitHub Copilot
  • Google Gemini CLI
  • Google Antigravity

Use /logout to clear credentials. Tokens are stored in ~/.pi/agent/auth.json and auto-refresh when expired.
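The expiry check behind auto-refresh can be sketched as follows. This is illustrative only: the `expires` field name and epoch-seconds format are assumptions, not Pi's actual auth.json schema.

```python
import json
import time


def load_entry(path: str, provider: str):
    """Read one provider's entry from an auth.json-style file."""
    with open(path) as f:
        return json.load(f).get(provider)


def is_expired(entry: dict, skew: int = 60) -> bool:
    """True when the token expires within `skew` seconds.

    Assumes a hypothetical "expires" field holding epoch seconds;
    the skew avoids using a token that dies mid-request.
    """
    return time.time() + skew >= entry.get("expires", 0)
```

A refresher would call `is_expired` before each request and exchange the refresh token for a new access token when it returns true.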

GitHub Copilot

  • Press Enter for github.com, or enter your GitHub Enterprise Server domain
  • If you get "model not supported", enable it in VS Code: Copilot Chat → model selector → select model → "Enable"

Google Providers

  • Gemini CLI: Standard Gemini models via Cloud Code Assist
  • Antigravity: Sandbox with Gemini 3, Claude, and GPT-OSS models
  • Both free with any Google account, subject to rate limits
  • For paid Cloud Code Assist: set GOOGLE_CLOUD_PROJECT env var

OpenAI Codex

  • Requires ChatGPT Plus or Pro subscription
  • Personal use only; for production, use the OpenAI Platform API

API Keys

Set via environment variable:

export ANTHROPIC_API_KEY=sk-ant-...
pi

Provider             Environment Variable
Anthropic            ANTHROPIC_API_KEY
OpenAI               OPENAI_API_KEY
Google Gemini        GEMINI_API_KEY
Mistral              MISTRAL_API_KEY
Groq                 GROQ_API_KEY
Cerebras             CEREBRAS_API_KEY
xAI                  XAI_API_KEY
OpenRouter           OPENROUTER_API_KEY
Vercel AI Gateway    AI_GATEWAY_API_KEY
ZAI                  ZAI_API_KEY
OpenCode Zen         OPENCODE_API_KEY
MiniMax              MINIMAX_API_KEY
MiniMax (China)      MINIMAX_CN_API_KEY

Auth File

Store credentials in ~/.pi/agent/auth.json:

{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." }
}

The file is created with 0600 permissions (user read/write only). Auth file credentials take priority over environment variables.
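A minimal sketch of creating a credentials file with 0600 permissions (illustrative, not Pi's implementation):

```python
import json
import os


def write_auth(path: str, data: dict) -> None:
    """Write credentials, creating the file as 0600 (owner read/write only).

    Note: the mode in os.open applies only when the file is created;
    an existing file keeps its current permissions.
    """
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        json.dump(data, f, indent=2)
```

Passing the mode to `os.open` at creation time avoids the window where a default-permission file briefly exists before a separate `chmod`.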

OAuth credentials are also stored here after /login and managed automatically.

Cloud Providers

Azure OpenAI

export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# or use resource name instead of base URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource

# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
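The deployment-name map above is a comma-separated list of model=deployment pairs. Parsing it might look like this (a sketch under that assumption, not the actual implementation):

```python
def parse_deployment_map(value: str) -> dict[str, str]:
    """Split "model=deployment,model=deployment" into a lookup dict."""
    return dict(
        pair.split("=", 1)          # split on the first "=" only, once per pair
        for pair in value.split(",")
        if pair                     # tolerate a trailing comma
    )
```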

Amazon Bedrock

# Option 1: AWS Profile
export AWS_PROFILE=your-profile

# Option 2: IAM Keys
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...

# Option 3: Bearer Token
export AWS_BEARER_TOKEN_BEDROCK=...

# Optional region (defaults to us-east-1)
export AWS_REGION=us-west-2

Also supports ECS task roles (AWS_CONTAINER_CREDENTIALS_*) and IRSA (AWS_WEB_IDENTITY_TOKEN_FILE).

pi --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0

Google Vertex AI

Uses Application Default Credentials:

gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1

Or set GOOGLE_APPLICATION_CREDENTIALS to a service account key file.

Custom Providers

Via models.json: Add Ollama, LM Studio, vLLM, or any provider that speaks a supported API (OpenAI Completions, OpenAI Responses, Anthropic Messages, Google Generative AI). See models.md.

Via extensions: For providers that need custom API implementations or OAuth flows, create an extension. See custom-provider.md and examples/extensions/custom-provider-gitlab-duo.

Resolution Order

When resolving credentials for a provider:

  1. CLI --api-key flag
  2. auth.json entry (API key or OAuth token)
  3. Environment variable
  4. Custom provider keys from models.json
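The four-step order can be sketched as a first-match lookup. This is a hypothetical helper (the `key`/`apiKey` field names are assumptions), not Pi's real resolver:

```python
import os


def resolve_credential(provider, cli_api_key=None, auth=None,
                       env_var=None, models_json=None):
    """Return the first credential found, following the documented priority."""
    # 1. CLI --api-key flag wins outright
    if cli_api_key:
        return cli_api_key
    # 2. auth.json entry (API key or OAuth token)
    entry = (auth or {}).get(provider, {})
    if entry.get("key"):
        return entry["key"]
    # 3. Environment variable, e.g. ANTHROPIC_API_KEY
    if env_var and os.environ.get(env_var):
        return os.environ[env_var]
    # 4. Custom provider keys from models.json ("apiKey" is an assumed field name)
    return (models_json or {}).get(provider, {}).get("apiKey")
```

Note that this matches the Auth File section above: an auth.json entry shadows the environment variable for the same provider.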