co-mono/packages/pods/package.json
Mario Zechner 0c5cbd0068 v0.7.12: Custom models/providers support via models.json
- Add ~/.pi/agent/models.json config for custom providers (Ollama, vLLM, etc.)
- Support all 4 API types (openai-completions, openai-responses, anthropic-messages, google-generative-ai)
- Live reload models.json on /model selector open
- Smart model defaults per provider (claude-sonnet-4-5, gpt-5.1-codex, etc.)
- Graceful session fallback when saved model missing or no API key
- Validation errors show precise file/field info in CLI and TUI
- Agent knows its own README.md path for self-documentation
- Added gpt-5.1-codex (400k context, 128k output, reasoning)

Fixes #21
2025-11-16 22:56:24 +01:00
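As an illustration of the `~/.pi/agent/models.json` config described in the commit message, a custom provider entry might look roughly like the sketch below. The field names (`providers`, `baseUrl`, `api`, `models`, `contextWindow`) are assumptions for illustration only, not the documented schema — the package's README is the authoritative reference.

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434/v1",
      "api": "openai-completions",
      "models": [
        {
          "id": "llama3.1:8b",
          "contextWindow": 131072
        }
      ]
    }
  }
}
```

Per the commit message, the `api` value would be one of the four supported API types: `openai-completions`, `openai-responses`, `anthropic-messages`, or `google-generative-ai`.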

{
  "name": "@mariozechner/pi",
  "version": "0.7.12",
  "description": "CLI tool for managing vLLM deployments on GPU pods",
  "type": "module",
  "bin": {
    "pi": "dist/cli.js"
  },
  "scripts": {
    "clean": "rm -rf dist",
    "build": "tsgo -p tsconfig.build.json && chmod +x dist/cli.js && cp src/models.json dist/ && cp -r scripts dist/",
    "check": "biome check --write .",
    "prepublishOnly": "npm run clean && npm run build"
  },
  "files": [
    "dist",
    "scripts"
  ],
  "keywords": [
    "llm",
    "vllm",
    "gpu",
    "ai",
    "cli"
  ],
  "author": "Mario Zechner",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/badlogic/pi-mono.git",
    "directory": "packages/pods"
  },
  "engines": {
    "node": ">=20.0.0"
  },
  "dependencies": {
    "@mariozechner/pi-agent": "^0.7.12",
    "chalk": "^5.5.0"
  },
  "devDependencies": {}
}