# Providers

Pi supports subscription-based providers via OAuth, and API-key providers via environment variables or an auth file. For each provider, pi knows all available models; the list is updated with every pi release.

## Table of Contents

- [Subscriptions](#subscriptions)
- [API Keys](#api-keys)
- [Auth File](#auth-file)
- [Cloud Providers](#cloud-providers)
- [Custom Providers](#custom-providers)
- [Resolution Order](#resolution-order)

## Subscriptions

Use `/login` in interactive mode, then select a provider:

- Claude Pro/Max
- ChatGPT Plus/Pro (Codex)
- GitHub Copilot
- Google Gemini CLI
- Google Antigravity

Use `/logout` to clear credentials. Tokens are stored in `~/.pi/agent/auth.json` and auto-refresh when expired.

### GitHub Copilot

- Press Enter for github.com, or enter your GitHub Enterprise Server domain
- If you get "model not supported", enable it in VS Code: Copilot Chat → model selector → select model → "Enable"

### Google Providers

- **Gemini CLI**: Standard Gemini models via Cloud Code Assist
- **Antigravity**: Sandbox with Gemini 3, Claude, and GPT-OSS models
- Both are free with any Google account, subject to rate limits
- For paid Cloud Code Assist: set the `GOOGLE_CLOUD_PROJECT` env var

### OpenAI Codex

- Requires a ChatGPT Plus or Pro subscription
- Personal use only; for production, use the OpenAI Platform API

## API Keys

Set via environment variable:

```bash
export ANTHROPIC_API_KEY=sk-ant-...
pi
```

| Provider | Environment Variable |
|----------|----------------------|
| Anthropic | `ANTHROPIC_API_KEY` |
| OpenAI | `OPENAI_API_KEY` |
| Google Gemini | `GEMINI_API_KEY` |
| Mistral | `MISTRAL_API_KEY` |
| Groq | `GROQ_API_KEY` |
| Cerebras | `CEREBRAS_API_KEY` |
| xAI | `XAI_API_KEY` |
| OpenRouter | `OPENROUTER_API_KEY` |
| Vercel AI Gateway | `AI_GATEWAY_API_KEY` |
| ZAI | `ZAI_API_KEY` |
| OpenCode Zen | `OPENCODE_API_KEY` |
| Hugging Face | `HF_TOKEN` |
| Kimi For Coding | `KIMI_API_KEY` |
| MiniMax | `MINIMAX_API_KEY` |
| MiniMax (China) | `MINIMAX_CN_API_KEY` |

## Auth File

Store credentials in `~/.pi/agent/auth.json`:

```json
{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." },
  "opencode": { "type": "api_key", "key": "..." }
}
```

The file is created with `0600` permissions (user read/write only). Auth file credentials take priority over environment variables.

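If you create the file by hand instead of letting pi write it, it is worth matching those permissions yourself. A minimal sketch (the path comes from this document; the JSON body is an illustrative placeholder):

```shell
# Create ~/.pi/agent/auth.json with user-only (0600) permissions
mkdir -p "$HOME/.pi/agent"
umask 077   # new files get no group/other access
cat > "$HOME/.pi/agent/auth.json" <<'EOF'
{ "anthropic": { "type": "api_key", "key": "sk-ant-..." } }
EOF
chmod 600 "$HOME/.pi/agent/auth.json"
```

Setting `umask 077` before writing avoids a window where the file briefly exists with wider permissions.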
### Key Resolution

The `key` field supports three formats:

- **Shell command:** `"!command"` executes the command and uses its stdout (cached for the process lifetime)

  ```json
  { "type": "api_key", "key": "!security find-generic-password -ws 'anthropic'" }
  { "type": "api_key", "key": "!op read 'op://vault/item/credential'" }
  ```

- **Environment variable:** uses the value of the named variable

  ```json
  { "type": "api_key", "key": "MY_ANTHROPIC_KEY" }
  ```

- **Literal value:** used directly

  ```json
  { "type": "api_key", "key": "sk-ant-..." }
  ```

OAuth credentials are also stored here after `/login` and managed automatically.

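The three formats amount to a first-match check on the `key` string. A rough shell sketch of that decision logic (illustrative only — pi's actual resolver is internal, and this sketch omits the caching of command output):

```shell
# Illustrative sketch of the key-resolution rules described above
resolve_key() {
  key="$1"
  case "$key" in
    '!'*)
      # Shell command: run it and use its stdout
      sh -c "${key#!}" ;;
    *)
      # If an environment variable with this exact name is set, use its
      # value; otherwise treat the string as a literal API key
      if printenv "$key" >/dev/null 2>&1; then
        printenv "$key"
      else
        printf '%s\n' "$key"
      fi ;;
  esac
}
```

For example, `resolve_key '!echo hi'` runs the command, `resolve_key MY_ANTHROPIC_KEY` reads the environment, and anything else passes through unchanged.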
## Cloud Providers
### Azure OpenAI

```bash
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# or use the resource name instead of a base URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource

# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
```

### Amazon Bedrock

```bash
# Option 1: AWS profile
export AWS_PROFILE=your-profile

# Option 2: IAM keys
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...

# Option 3: Bearer token
export AWS_BEARER_TOKEN_BEDROCK=...

# Optional region (defaults to us-east-1)
export AWS_REGION=us-west-2
```

Also supports ECS task roles (`AWS_CONTAINER_CREDENTIALS_*`) and IRSA (`AWS_WEB_IDENTITY_TOKEN_FILE`).

```bash
pi --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0
```

### Google Vertex AI

Uses Application Default Credentials:

```bash
gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1
```

Or set `GOOGLE_APPLICATION_CREDENTIALS` to a service account key file.

## Custom Providers

**Via models.json:** Add Ollama, LM Studio, vLLM, or any provider that speaks a supported API (OpenAI Completions, OpenAI Responses, Anthropic Messages, Google Generative AI). See [models.md](models.md).

**Via extensions:** For providers that need custom API implementations or OAuth flows, create an extension. See [custom-provider.md](custom-provider.md) and [examples/extensions/custom-provider-gitlab-duo](../examples/extensions/custom-provider-gitlab-duo/).

## Resolution Order

When resolving credentials for a provider:

1. CLI `--api-key` flag
2. `auth.json` entry (API key or OAuth token)
3. Environment variable
4. Custom provider keys from `models.json`
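
That precedence is a first-non-empty lookup. A hedged shell sketch (the positional arguments and function name are illustrative, not pi's actual internals):

```shell
# First-match credential lookup mirroring the order above (sketch):
# $1 = --api-key flag, $2 = auth.json entry,
# $3 = environment variable, $4 = models.json custom-provider key
pick_credential() {
  for candidate in "$@"; do
    if [ -n "$candidate" ]; then
      printf '%s\n' "$candidate"
      return 0
    fi
  done
  return 1   # no credential found
}
```

So if both an `auth.json` entry and an environment variable are set, the `auth.json` entry wins, matching the note in the Auth File section.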