mirror of
https://github.com/getcompanion-ai/co-mono.git
synced 2026-04-16 14:01:06 +00:00
Enhance provider override to support baseUrl-only mode
Builds on #406 to support a simpler proxy use case:
- Override just `baseUrl` to route a built-in provider through a proxy
- All built-in models preserved, no need to redefine them
- Full replacement still works when a `models` array is provided
This commit is contained in:
parent
243104fa18
commit
d747ec6e23
4 changed files with 238 additions and 25 deletions
@@ -465,6 +465,37 @@ Add custom models (Ollama, vLLM, LM Studio, etc.) via `~/.pi/agent/models.json`:
}
```

**Overriding built-in providers:**

To route a built-in provider (anthropic, openai, google, etc.) through a proxy without redefining all of its models, just specify the `baseUrl`:

```json
{
  "providers": {
    "anthropic": {
      "baseUrl": "https://my-proxy.example.com/v1"
    }
  }
}
```

All built-in Anthropic models remain available with the new endpoint. Existing OAuth or API key auth continues to work.

To fully replace a built-in provider with custom models, include the `models` array:

```json
{
  "providers": {
    "anthropic": {
      "baseUrl": "https://my-proxy.example.com/v1",
      "apiKey": "ANTHROPIC_API_KEY",
      "api": "anthropic-messages",
      "models": [...]
    }
  }
}
```
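The two override modes described above can be sketched as a merge step. This is a hypothetical illustration, not the repository's actual implementation; the `Provider` shape, `applyOverride` name, and placeholder model ids are all assumptions:

```typescript
// Hypothetical provider shape, loosely mirroring the models.json fields.
interface Provider {
  baseUrl: string;
  apiKey?: string;
  api?: string;
  models?: string[];
}

// Sketch of the override logic: an override that includes a `models`
// array fully replaces the built-in definition; a baseUrl-only override
// reroutes requests while preserving the built-in model list.
function applyOverride(builtIn: Provider, override: Partial<Provider>): Provider {
  if (override.models) {
    // Full replacement: override fields (including models) win.
    return { ...builtIn, ...override };
  }
  // baseUrl-only mode: keep everything built-in except the endpoint.
  return { ...builtIn, baseUrl: override.baseUrl ?? builtIn.baseUrl };
}

// Placeholder built-in entry (model ids are illustrative only).
const anthropic: Provider = {
  baseUrl: "https://api.anthropic.com",
  models: ["model-a", "model-b"],
};

const proxied = applyOverride(anthropic, {
  baseUrl: "https://my-proxy.example.com/v1",
});
console.log(proxied.baseUrl, proxied.models);
```

With this shape, the baseUrl-only case keeps both placeholder models while routing traffic to the proxy endpoint.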
**Authorization header:** Set `authHeader: true` to add `Authorization: Bearer <apiKey>` automatically.
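
For example, a custom provider entry using the bearer header might look like the following sketch; the provider name `my-proxy` and the key name `MY_PROXY_API_KEY` are hypothetical, while `baseUrl`, `apiKey`, and `authHeader` follow the fields shown above:

```json
{
  "providers": {
    "my-proxy": {
      "baseUrl": "https://my-proxy.example.com/v1",
      "apiKey": "MY_PROXY_API_KEY",
      "authHeader": true
    }
  }
}
```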
**OpenAI compatibility (`compat` field):**