# Providers
Pi supports subscription-based providers via OAuth and API key providers via environment variables or auth file.
## Subscriptions

Use `/login` in interactive mode, then select a provider:
- Claude Pro/Max
- ChatGPT Plus/Pro (Codex)
- GitHub Copilot
- Google Gemini CLI
- Google Antigravity
Use `/logout` to clear credentials. Tokens are stored in `~/.pi/agent/auth.json` and auto-refresh when expired.
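As a rough illustration of the auto-refresh behavior, the sketch below decides whether a stored OAuth entry is due for refresh. The `expires` field and `needs_refresh` helper are assumptions for illustration; the actual `auth.json` schema and refresh logic are Pi's own.

```python
import time

def needs_refresh(entry: dict, skew: int = 60) -> bool:
    # `expires` is a hypothetical Unix-timestamp field; the real
    # auth.json schema may differ.
    expires = entry.get("expires")
    if expires is None:
        return False  # API keys and non-expiring entries never refresh
    # Refresh slightly early (skew seconds) to avoid racing expiry.
    return time.time() >= expires - skew

# A token that expired an hour ago clearly needs a refresh:
print(needs_refresh({"type": "oauth", "expires": time.time() - 3600}))  # True
```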
### GitHub Copilot
- Press Enter for github.com, or enter your GitHub Enterprise Server domain
- If you get "model not supported", enable it in VS Code: Copilot Chat → model selector → select model → "Enable"
### Google Providers
- Gemini CLI: Standard Gemini models via Cloud Code Assist
- Antigravity: Sandbox with Gemini 3, Claude, and GPT-OSS models
- Both free with any Google account, subject to rate limits
- For paid Cloud Code Assist: set the `GOOGLE_CLOUD_PROJECT` environment variable
### OpenAI Codex
- Requires ChatGPT Plus or Pro subscription
- Personal use only; for production, use the OpenAI Platform API
## API Keys

Set via environment variable:

```shell
export ANTHROPIC_API_KEY=sk-ant-...
pi
```
| Provider | Environment Variable |
|---|---|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Google Gemini | GEMINI_API_KEY |
| Mistral | MISTRAL_API_KEY |
| Groq | GROQ_API_KEY |
| Cerebras | CEREBRAS_API_KEY |
| xAI | XAI_API_KEY |
| OpenRouter | OPENROUTER_API_KEY |
| Vercel AI Gateway | AI_GATEWAY_API_KEY |
| ZAI | ZAI_API_KEY |
| OpenCode Zen | OPENCODE_API_KEY |
| MiniMax | MINIMAX_API_KEY |
| MiniMax (China) | MINIMAX_CN_API_KEY |
## Auth File

Store credentials in `~/.pi/agent/auth.json`:

```json
{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." }
}
```
The file is created with 0600 permissions (user read/write only). Auth file credentials take priority over environment variables.
OAuth credentials are also stored here after `/login` and managed automatically.
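The 0600 permission is easy to reproduce: create the file with an explicit mode rather than chmod-ing it afterwards, so it is never readable by others even briefly. This is a minimal sketch of that pattern, not Pi's actual implementation; `write_auth` is a hypothetical helper.

```python
import json
import os
import stat
import tempfile

def write_auth(path: str, creds: dict) -> None:
    # O_CREAT with mode 0o600 makes the file owner read/write only
    # from the moment it exists.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        json.dump(creds, f, indent=2)

path = os.path.join(tempfile.mkdtemp(), "auth.json")
write_auth(path, {"anthropic": {"type": "api_key", "key": "sk-ant-..."}})
print(oct(stat.S_IMODE(os.stat(path).st_mode)))  # 0o600
```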
## Cloud Providers
### Azure OpenAI

```shell
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# or use the resource name instead of a base URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource
# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
```
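The deployment name map is a comma-separated list of `model=deployment` pairs. A sketch of how such a value parses into a lookup table (an illustrative parser, not Pi's own code):

```python
def parse_deployment_map(value: str) -> dict[str, str]:
    # Split "model=deployment,model=deployment" into a dict,
    # tolerating stray whitespace around either side.
    pairs = (item.split("=", 1) for item in value.split(",") if item)
    return {model.strip(): deployment.strip() for model, deployment in pairs}

print(parse_deployment_map("gpt-4=my-gpt4,gpt-4o=my-gpt4o"))
# {'gpt-4': 'my-gpt4', 'gpt-4o': 'my-gpt4o'}
```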
### Amazon Bedrock

```shell
# Option 1: AWS Profile
export AWS_PROFILE=your-profile
# Option 2: IAM Keys
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...
# Option 3: Bearer Token
export AWS_BEARER_TOKEN_BEDROCK=...
# Optional region (defaults to us-east-1)
export AWS_REGION=us-west-2
```
Also supports ECS task roles (`AWS_CONTAINER_CREDENTIALS_*`) and IRSA (`AWS_WEB_IDENTITY_TOKEN_FILE`).

```shell
pi --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0
```
### Google Vertex AI

Uses Application Default Credentials:

```shell
gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1
```
Or set `GOOGLE_APPLICATION_CREDENTIALS` to a service account key file.
## Resolution Order
When resolving credentials for a provider:
1. CLI `--api-key` flag
2. `auth.json` entry (API key or OAuth token)
3. Environment variable
4. Custom provider keys from `models.json`
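The lookup order above can be sketched as a simple first-match chain. The env-var names and the auth-file-over-environment priority follow this page; the `resolve_api_key` function itself is hypothetical, not Pi's code.

```python
import os

def resolve_api_key(provider: str, cli_key=None, auth_file=None,
                    env=None, custom_keys=None):
    env = os.environ if env is None else env
    if cli_key:                        # 1. --api-key flag
        return cli_key
    entry = (auth_file or {}).get(provider)
    if entry and entry.get("key"):     # 2. auth.json entry
        return entry["key"]
    env_names = {"anthropic": "ANTHROPIC_API_KEY", "openai": "OPENAI_API_KEY"}
    key = env.get(env_names.get(provider, ""))
    if key:                            # 3. environment variable
        return key
    return (custom_keys or {}).get(provider)  # 4. models.json custom keys

# auth.json wins over the environment:
print(resolve_api_key(
    "anthropic",
    auth_file={"anthropic": {"type": "api_key", "key": "from-file"}},
    env={"ANTHROPIC_API_KEY": "from-env"},
))  # from-file
```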