Add Mistral as AI provider

- Add Mistral to KnownProvider type and model generation
- Implement Mistral-specific compat handling in openai-completions:
  - requiresToolResultName: tool results need a name field
  - requiresAssistantAfterToolResult: insert a synthetic assistant message between tool result and user messages
  - requiresThinkingAsText: emit thinking blocks as <thinking> text
  - requiresMistralToolIds: tool call IDs must be exactly 9 alphanumeric characters
- Add MISTRAL_API_KEY environment variable support
- Add Mistral tests across all test files
- Update documentation (README, CHANGELOG) for both ai and coding-agent packages
- Remove client IDs from gemini.md, reference upstream source instead
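Two of the compat fixups above can be sketched roughly as follows. This is a hypothetical illustration, not the package's actual implementation; the function names and the `Message` shape are assumptions:

```typescript
type Role = "user" | "assistant" | "tool";

interface Message {
    role: Role;
    content: string;
    name?: string;
}

// requiresMistralToolIds: Mistral rejects tool call IDs that are not
// exactly 9 alphanumeric characters, so strip everything else and
// pad/truncate to length 9.
function toMistralToolId(id: string): string {
    const alnum = id.replace(/[^a-zA-Z0-9]/g, "");
    return alnum.padEnd(9, "0").slice(0, 9);
}

// requiresAssistantAfterToolResult: some OpenAI-compatible backends
// reject a tool-result message followed directly by a user message,
// so insert an empty synthetic assistant turn between them.
function insertAssistantAfterToolResult(messages: Message[]): Message[] {
    const out: Message[] = [];
    for (let i = 0; i < messages.length; i++) {
        out.push(messages[i]);
        if (messages[i].role === "tool" && messages[i + 1]?.role === "user") {
            out.push({ role: "assistant", content: "" });
        }
    }
    return out;
}
```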

Closes #165
Mario Zechner 2025-12-10 20:36:19 +01:00
parent a248e2547a
commit 99b4b1aca0
31 changed files with 1856 additions and 282 deletions

@@ -9,6 +9,7 @@ Unified LLM API with automatic model discovery, provider configuration, token an
- **OpenAI**
- **Anthropic**
- **Google**
- **Mistral**
- **Groq**
- **Cerebras**
- **xAI**
@@ -564,7 +565,7 @@ A **provider** offers models through a specific API. For example:
- **Anthropic** models use the `anthropic-messages` API
- **Google** models use the `google-generative-ai` API
- **OpenAI** models use the `openai-responses` API
- **xAI, Cerebras, Groq, etc.** models use the `openai-completions` API (OpenAI-compatible)
- **Mistral, xAI, Cerebras, Groq, etc.** models use the `openai-completions` API (OpenAI-compatible)
### Querying Providers and Models
@@ -1036,6 +1037,7 @@ In Node.js environments, you can set environment variables to avoid passing API
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...
MISTRAL_API_KEY=...
GROQ_API_KEY=gsk_...
CEREBRAS_API_KEY=csk-...
XAI_API_KEY=xai-...
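
A minimal sketch of how such an environment-variable fallback typically works in Node.js. The function name is hypothetical, not the library's actual API:

```typescript
// Hypothetical helper: an explicitly passed key wins; otherwise fall
// back to the MISTRAL_API_KEY environment variable, mirroring the
// behavior documented for the other providers above.
function resolveApiKey(explicit?: string): string | undefined {
    return explicit ?? process.env["MISTRAL_API_KEY"];
}
```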