mirror of
https://github.com/getcompanion-ai/co-mono.git
synced 2026-04-17 07:03:25 +00:00
feat(ai): Create unified AI package with OpenAI, Anthropic, and Gemini support
- Set up @mariozechner/ai package structure following monorepo patterns
- Install OpenAI, Anthropic, and Google Gemini SDK dependencies
- Document comprehensive API investigation for all three providers
- Design minimal unified API with streaming-first architecture
- Add models.dev integration for pricing and capabilities
- Implement automatic caching strategy for all providers
- Update project documentation with package creation guide
This commit is contained in:
parent 2c03724862
commit f064ea0e14
14 changed files with 7437 additions and 21 deletions
packages/ai/README.md (new file, 62 additions)
@@ -0,0 +1,62 @@
# @mariozechner/ai

Unified API for OpenAI, Anthropic, and Google Gemini LLM providers. This package provides a common interface for working with multiple LLM providers, handling their differences transparently while exposing a consistent, minimal API.
## Features (Planned)

- **Unified Interface**: Single API for OpenAI, Anthropic, and Google Gemini
- **Streaming Support**: Real-time response streaming with delta events
- **Tool Calling**: Consistent tool/function calling across providers
- **Reasoning/Thinking**: Support for reasoning tokens where available
- **Session Management**: Serializable conversation state across providers
- **Token Tracking**: Unified token counting (input, output, cached, reasoning)
- **Interrupt Handling**: Graceful cancellation of requests
- **Provider Detection**: Automatic configuration based on endpoint
- **Caching Support**: Provider-specific caching strategies
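Taken together, the streaming, tool-calling, and token-tracking features above suggest a single discriminated-union event type for all providers. A minimal sketch of what that could look like — the names (`StreamEvent`, `TokenUsage`, `collectText`) are illustrative assumptions, not the package's published exports:

```typescript
// Hypothetical event shapes for the unified streaming API.
// These names are illustrative, not actual exports of @mariozechner/ai.
interface TokenUsage {
  input: number;
  output: number;
  cached: number;    // tokens served from a provider-side cache
  reasoning: number; // thinking tokens; 0 for providers without them
}

type StreamEvent =
  | { type: 'content'; text: string }                 // text delta
  | { type: 'reasoning'; text: string }               // thinking delta, where supported
  | { type: 'toolCall'; name: string; args: unknown } // normalized tool/function call
  | { type: 'done'; usage: TokenUsage };              // final event with token counts

// Example consumer: fold content deltas back into a full response string.
function collectText(events: StreamEvent[]): string {
  return events
    .filter((e): e is Extract<StreamEvent, { type: 'content' }> => e.type === 'content')
    .map((e) => e.text)
    .join('');
}
```

A discriminated union like this lets consumers `switch` on `event.type` with full type narrowing, regardless of which provider produced the events.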
## Installation

```bash
npm install @mariozechner/ai
```

## Quick Start (Coming Soon)

```typescript
import { createClient } from '@mariozechner/ai';

// Automatically detects provider from configuration
const client = createClient({
  provider: 'openai',
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-4'
});

// Same API works for all providers
const response = await client.complete({
  messages: [
    { role: 'user', content: 'Hello!' }
  ],
  stream: true
});

for await (const event of response) {
  if (event.type === 'content') {
    process.stdout.write(event.text);
  }
}
```
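The commit message also mentions models.dev integration for pricing; combined with the unified token counts, cost accounting could look like the following sketch. The rates shown are placeholders (not real provider prices), and treating cached tokens as a subset of input tokens is an assumption:

```typescript
// Hypothetical cost calculation from unified token usage.
// Rates are illustrative placeholders in USD per million tokens.
interface Rates {
  input: number;       // $/1M fresh input tokens
  output: number;      // $/1M output tokens
  cachedInput: number; // $/1M cache-hit input tokens (usually discounted)
}

// Assumption: `cached` counts a subset of `input` that was served
// from the provider's cache at the discounted rate.
function costUsd(
  usage: { input: number; output: number; cached: number },
  rates: Rates,
): number {
  const freshInput = usage.input - usage.cached;
  return (
    (freshInput * rates.input +
      usage.cached * rates.cachedInput +
      usage.output * rates.output) / 1_000_000
  );
}
```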

## Supported Providers

- **OpenAI**: GPT-3.5, GPT-4, o1, o3 models
- **Anthropic**: Claude models via native SDK
- **Google Gemini**: Gemini models with thinking support

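The "Provider Detection" feature could, for instance, infer which of the providers above to configure from an endpoint URL. A hypothetical sketch — the function name and the exact matching rules are assumptions, though the hostnames are the providers' public API hosts:

```typescript
// Hypothetical provider detection from an API endpoint URL.
// Illustrative only; not part of the package's published API.
type Provider = 'openai' | 'anthropic' | 'google' | 'unknown';

function detectProvider(endpoint: string): Provider {
  const host = new URL(endpoint).hostname;
  if (host.endsWith('api.openai.com')) return 'openai';
  if (host.endsWith('api.anthropic.com')) return 'anthropic';
  if (host.endsWith('generativelanguage.googleapis.com')) return 'google';
  return 'unknown';
}
```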
## Development
This package is part of the pi monorepo. See the main README for development instructions.
## License
MIT