mirror of
https://github.com/getcompanion-ai/co-mono.git
synced 2026-04-16 18:03:50 +00:00
feat(coding-agent): support env vars and shell commands in headers
Header values in models.json now resolve using the same logic as apiKey:

- Environment variable names are resolved to their values
- Shell commands prefixed with ! are executed
- Literal values are used directly

This is a minor breaking change: if a header value accidentally matches an env var name, it will now resolve to that env var's value.

Fixes #909
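The resolution order described above can be sketched as follows (a minimal illustration of the documented behavior, not the repository's actual code; the function name is hypothetical):

```python
import os
import subprocess


def resolve_value(value: str) -> str:
    """Resolve an apiKey/header value: shell command, env var, or literal."""
    if value.startswith("!"):
        # "!command" - execute the command and use its trimmed stdout
        result = subprocess.run(
            value[1:], shell=True, capture_output=True, text=True, check=True
        )
        return result.stdout.strip()
    if value in os.environ:
        # Environment variable name - use the variable's value
        return os.environ[value]
    # Literal value - used directly
    return value
```

Note how the env-var check explains the breaking change: a header whose literal text happens to match a set environment variable now resolves to that variable's value instead.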
parent
7af1919d31
commit
e0742d8217
3 changed files with 40 additions and 20 deletions
````diff
@@ -681,10 +681,10 @@ Add custom models (Ollama, vLLM, LM Studio, etc.) via `~/.pi/agent/models.json`:
 
 **Supported APIs:** `openai-completions`, `openai-responses`, `openai-codex-responses`, `anthropic-messages`, `google-generative-ai`
 
-**API key resolution:** The `apiKey` field supports three formats:
+**Value resolution:** The `apiKey` and `headers` fields support three formats for their values:
 
 - `"!command"` - Executes the command and uses stdout (e.g., `"!security find-generic-password -ws 'anthropic'"` for macOS Keychain, `"!op read 'op://vault/item/credential'"` for 1Password)
 - Environment variable name (e.g., `"MY_API_KEY"`) - Uses the value of the environment variable
-- Literal value - Used directly as the API key
+- Literal value - Used directly
 
 **API override:** Set `api` at provider level (default for all models) or model level (override per model).
````
````diff
@@ -695,11 +695,11 @@ Add custom models (Ollama, vLLM, LM Studio, etc.) via `~/.pi/agent/models.json`:
   "providers": {
     "custom-proxy": {
       "baseUrl": "https://proxy.example.com/v1",
-      "apiKey": "YOUR_API_KEY",
+      "apiKey": "MY_API_KEY",
       "api": "anthropic-messages",
       "headers": {
-        "User-Agent": "Mozilla/5.0 ...",
-        "X-Custom-Auth": "token"
+        "x-portkey-api-key": "PORTKEY_API_KEY",
+        "x-secret": "!op read 'op://vault/item/secret'"
       },
       "models": [...]
     }
````
````diff
@@ -707,6 +707,8 @@ Add custom models (Ollama, vLLM, LM Studio, etc.) via `~/.pi/agent/models.json`:
   }
 }
 ```
 
+Header values use the same resolution as `apiKey`: environment variables, shell commands (`!`), or literal values.
+
 **Overriding built-in providers:**
 
 To route a built-in provider (anthropic, openai, google, etc.) through a proxy without redefining all models, just specify the `baseUrl`:
````
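For reference, the built-in-provider override described in that last hunk might look like the following minimal `models.json` fragment (a sketch under the schema shown above; the proxy URL is a placeholder):

```json
{
  "providers": {
    "anthropic": {
      "baseUrl": "https://proxy.example.com/v1"
    }
  }
}
```

All of anthropic's built-in models remain available, but requests go through the proxy.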