co-mono/packages
Mario Zechner 39fa25eb67 fix(ai): clean up openai-codex models and token limits
- Remove model aliases (gpt-5, gpt-5-mini, gpt-5-nano, codex-mini-latest, gpt-5-codex, gpt-5.1-codex, gpt-5.1-chat-latest)
- Fix context window from 400k to 272k tokens to match Codex CLI defaults
- Keep maxTokens at 128k (original value)
- Simplify reasoning effort clamping

closes #536
2026-01-07 20:39:46 +01:00
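The "simplify reasoning effort clamping" item above could look something like the following minimal sketch. This is an illustration only, not the repository's actual code: the type name, the effort levels, and the `clampEffort` function are all hypothetical, assuming the common pattern of restricting a requested effort level to the range a given model supports.

```typescript
// Hypothetical sketch of reasoning-effort clamping; names and levels
// are illustrative assumptions, not the repository's actual API.
type ReasoningEffort = "minimal" | "low" | "medium" | "high";

// Ordered from least to most effort, so clamping is an index comparison.
const EFFORT_ORDER: ReasoningEffort[] = ["minimal", "low", "medium", "high"];

// Clamp a requested effort into the [min, max] range a model supports.
function clampEffort(
  requested: ReasoningEffort,
  min: ReasoningEffort,
  max: ReasoningEffort,
): ReasoningEffort {
  const idx = EFFORT_ORDER.indexOf(requested);
  const lo = EFFORT_ORDER.indexOf(min);
  const hi = EFFORT_ORDER.indexOf(max);
  return EFFORT_ORDER[Math.min(Math.max(idx, lo), hi)];
}
```

With this shape, a model that only supports "low" through "medium" would map a "minimal" request up to "low" and a "high" request down to "medium", which keeps the clamping logic to a single ordered lookup.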
agent Add [Unreleased] section for next cycle 2026-01-07 01:33:34 +01:00
ai fix(ai): clean up openai-codex models and token limits 2026-01-07 20:39:46 +01:00
coding-agent Merge pull request #513 from austinm911/fix/async-extension-factories 2026-01-07 11:48:33 +01:00
mom Add [Unreleased] section for next cycle 2026-01-07 01:33:34 +01:00
pods Release v0.37.8 2026-01-07 01:32:53 +01:00
tui Add [Unreleased] section for next cycle 2026-01-07 01:33:34 +01:00
web-ui Add [Unreleased] section for next cycle 2026-01-07 01:33:34 +01:00