# @mariozechner/pi-proxy
CORS and authentication proxy for pi-ai. Enables browser clients to access OAuth-protected endpoints.
## Usage

### CORS Proxy

Zero-config CORS proxy for development:
```bash
# Run directly with tsx
npx tsx packages/proxy/src/cors-proxy.ts 3001

# Or use the npm script
npm run dev -w @mariozechner/pi-proxy

# Or install globally and use the CLI
npm install -g @mariozechner/pi-proxy
pi-proxy 3001
```
The proxy forwards requests to any URL:
```js
// Instead of:
fetch('https://api.anthropic.com/v1/messages', { ... })

// Use:
fetch('http://localhost:3001?url=https://api.anthropic.com/v1/messages', { ... })
```
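The rewrite above can be wrapped in a small helper. This is a hypothetical convenience function, not part of pi-proxy; it assumes the proxy listens on `localhost:3001` and accepts a percent-encoded target in the `?url=` parameter (encoding keeps any query string in the target URL intact).

```ts
// Hypothetical helper (not part of pi-proxy): rewrite a target URL into
// its proxied form. Assumes the proxy runs on localhost:3001 and accepts
// a percent-encoded target.
const PROXY_BASE = "http://localhost:3001";

function proxied(targetUrl: string): string {
	// Encode the target so its own query string survives as one parameter.
	return `${PROXY_BASE}?url=${encodeURIComponent(targetUrl)}`;
}

// Usage:
// fetch(proxied("https://api.anthropic.com/v1/messages"), { ... })
```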
### OAuth Integration
For Anthropic OAuth tokens, configure your client to use the proxy:
```ts
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
	apiKey: 'oauth_token_here',
	baseURL: 'http://localhost:3001?url=https://api.anthropic.com'
});
```
## Future Proxy Types
- BunnyCDN Edge Function: Deploy as edge function
- Managed Proxy: Self-hosted with provider key management and credential auth
- Cloudflare Worker: Deploy as CF worker
## Architecture
The proxy:

- Accepts requests with a `?url=<target>` query parameter
- Forwards all headers (except `host` and `origin`)
- Forwards the request body for non-GET/HEAD requests
- Returns the response with CORS headers enabled
- Strips CORS headers from the upstream response
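The forwarding rules above can be sketched as a few pure functions. This is an illustrative sketch under stated assumptions, not the actual pi-proxy source: the exact header set and the permissive `access-control-allow-origin: *` value are assumptions.

```ts
// Sketch of the forwarding rules (not the real pi-proxy implementation).

// Assumed: request headers dropped before forwarding to the target.
const REQUEST_HEADERS_TO_STRIP = new Set(["host", "origin"]);

// Assumed: upstream CORS headers share this prefix and are stripped so the
// proxy's own CORS headers are the only ones the browser sees.
const CORS_HEADER_PREFIX = "access-control-";

function filterRequestHeaders(headers: Record<string, string>): Record<string, string> {
	return Object.fromEntries(
		Object.entries(headers).filter(
			([name]) => !REQUEST_HEADERS_TO_STRIP.has(name.toLowerCase())
		)
	);
}

function filterResponseHeaders(headers: Record<string, string>): Record<string, string> {
	const filtered = Object.fromEntries(
		Object.entries(headers).filter(
			([name]) => !name.toLowerCase().startsWith(CORS_HEADER_PREFIX)
		)
	);
	// Re-apply permissive CORS headers for the browser client (assumed value).
	return { ...filtered, "access-control-allow-origin": "*" };
}

function shouldForwardBody(method: string): boolean {
	// Bodies are forwarded only for non-GET/HEAD requests.
	return method !== "GET" && method !== "HEAD";
}
```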
## Development

```bash
npm install
npm run build
npm run check
```