fix(ai): detect context_length_exceeded overflow

This commit is contained in:
Mario Zechner 2026-01-08 03:16:59 +01:00
parent b1f32b9c8d
commit 946efe4b45
2 changed files with 3 additions and 1 deletion

@@ -13,6 +13,8 @@
### Fixed
- Fixed OpenAI Codex context window from 400,000 to 272,000 tokens to match Codex CLI defaults and prevent 400 errors. ([#536](https://github.com/badlogic/pi-mono/pull/536) by [@ghoulr](https://github.com/ghoulr))
- Fixed Codex SSE error events to surface message, code, and status. ([#551](https://github.com/badlogic/pi-mono/pull/551) by [@tmustier](https://github.com/tmustier))
- Fixed context overflow detection for `context_length_exceeded` error codes.
## [0.37.8] - 2026-01-07

@@ -32,7 +32,7 @@ const OVERFLOW_PATTERNS = [
 /exceeds the limit of \d+/i, // GitHub Copilot
 /exceeds the available context size/i, // llama.cpp server
 /greater than the context length/i, // LM Studio
-/context length exceeded/i, // Generic fallback
+/context[_ ]length[_ ]exceeded/i, // Generic fallback
 /too many tokens/i, // Generic fallback
 /token limit exceeded/i, // Generic fallback
];
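
The change above widens the generic fallback so it matches both the prose form ("context length exceeded") and the underscore-separated error code (`context_length_exceeded`) that some APIs return. A minimal sketch of how such a pattern list might be used for detection (the helper name `isContextOverflow` is an assumption for illustration; the actual module in the repo may differ):

```typescript
// Assumed helper for illustration, not the repo's actual code.
const OVERFLOW_PATTERNS: RegExp[] = [
	/exceeds the limit of \d+/i, // GitHub Copilot
	/exceeds the available context size/i, // llama.cpp server
	/greater than the context length/i, // LM Studio
	/context[_ ]length[_ ]exceeded/i, // Generic fallback: matches "context length exceeded" and "context_length_exceeded"
	/too many tokens/i, // Generic fallback
	/token limit exceeded/i, // Generic fallback
];

// Returns true if any known overflow pattern appears in the error message.
function isContextOverflow(message: string): boolean {
	return OVERFLOW_PATTERNS.some((pattern) => pattern.test(message));
}
```

With the old pattern `/context length exceeded/i`, a message containing the bare error code `context_length_exceeded` would not match; the `[_ ]` character class accepts either separator.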