feat: enhance model cycling with thinking levels and --thinking flag

PR #47 enhancements:
- Add thinking level syntax to --models (e.g., --models sonnet:high,haiku:low)
- First model in scope is used as the initial model when starting a new session
- Auto-apply thinking level when cycling models with Ctrl+P
- Save both model and thinking to session AND settings for persistence
- Simplify UX by removing autoThinkingDisabled flag
- Fix model matching to prioritize exact matches over partial
- Support provider/modelId format (e.g., openrouter/openai/gpt-5.1-codex)

Issue #45:
- Add --thinking CLI flag to set thinking level directly
- Takes precedence over all other thinking level sources

Closes #45
Mario Zechner 2025-11-21 21:32:30 +01:00
parent df3af27288
commit ba8c073ed2
5 changed files with 117 additions and 31 deletions

@@ -5,9 +5,12 @@
### Added
- **`/clear` Command**: New slash command to reset the conversation context and start a fresh session. Aborts any in-flight agent work, clears all messages, and creates a new session file. ([#48](https://github.com/badlogic/pi-mono/pull/48))
- **Model Cycling with Thinking Levels**: The `--models` flag now supports thinking level syntax (e.g., `--models sonnet:high,haiku:low`). When cycling models with `Ctrl+P`, the associated thinking level is automatically applied. The first model in the scope is used as the initial model when starting a new session. Both model and thinking level changes are now saved to session and settings for persistence. ([#47](https://github.com/badlogic/pi-mono/pull/47))
- **`--thinking` Flag**: New CLI flag to set the thinking level directly (e.g., `--thinking high`). Valid values: `off`, `minimal`, `low`, `medium`, `high`. Takes precedence over all other thinking level sources. ([#45](https://github.com/badlogic/pi-mono/issues/45))
### Fixed
- **Model Matching Priority**: The `--models` flag now prioritizes exact matches over partial matches. Supports `provider/modelId` format (e.g., `openrouter/openai/gpt-5.1-codex`) for precise selection. Exact ID matches are tried before partial matching, so `--models gpt-5.1-codex` correctly selects `gpt-5.1-codex` instead of `openai/gpt-5.1-codex-mini`.
- **Markdown Link Rendering**: Fixed links with identical text and href (e.g., `https://github.com/badlogic/pi-mono/pull/48/files`) being rendered twice. Now correctly compares raw text instead of styled text (which contains ANSI codes) when determining if link text matches href.
## [0.8.5] - 2025-11-21

@@ -670,12 +670,27 @@ Continue the most recent session
Select a session to resume (opens interactive selector)
**--models <patterns>**
Comma-separated model patterns for quick cycling with `Ctrl+P`. Patterns match against model IDs and names (case-insensitive). When multiple versions exist, prefers aliases over dated versions (e.g., `claude-sonnet-4-5` over `claude-sonnet-4-5-20250929`). Without this flag, `Ctrl+P` cycles through all available models.
Comma-separated model patterns for quick cycling with `Ctrl+P`. Matching priority:
1. `provider/modelId` exact match (e.g., `openrouter/openai/gpt-5.1-codex`)
2. Exact model ID match (e.g., `gpt-5.1-codex`)
3. Partial match against model IDs and names (case-insensitive)
When multiple partial matches exist, prefers aliases over dated versions (e.g., `claude-sonnet-4-5` over `claude-sonnet-4-5-20250929`). Without this flag, `Ctrl+P` cycles through all available models.
Each pattern can optionally include a thinking level suffix: `pattern:level` where level is one of `off`, `minimal`, `low`, `medium`, or `high`. When cycling models, the associated thinking level is automatically applied. The first model in the list is used as the initial model when starting a new session.
Examples:
- `--models claude-sonnet,gpt-4o` - Scope to Claude Sonnet and GPT-4o
- `--models sonnet,haiku` - Match any model containing "sonnet" or "haiku"
- `--models gemini` - All Gemini models
- `--models openrouter/openai/gpt-5.1-codex` - Exact provider/model match
- `--models gpt-5.1-codex` - Exact ID match (not `openai/gpt-5.1-codex-mini`)
- `--models sonnet:high,haiku:low` - Sonnet with high thinking, Haiku with low thinking
- `--models sonnet,haiku` - Partial match for any model containing "sonnet" or "haiku"
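The `pattern:level` syntax above can be handled by splitting on the last colon. The sketch below is illustrative, not pi's actual implementation; in particular, the `"off"` default for unsuffixed patterns is an assumption.

```typescript
type ThinkingLevel = "off" | "minimal" | "low" | "medium" | "high";
const LEVELS: readonly string[] = ["off", "minimal", "low", "medium", "high"];

// Split a `pattern:level` entry into its model pattern and optional
// thinking level. A trailing segment that is not a known level is
// treated as part of the pattern itself.
function parseModelPattern(entry: string): { pattern: string; thinkingLevel: ThinkingLevel } {
	const colonIndex = entry.lastIndexOf(":");
	if (colonIndex !== -1) {
		const suffix = entry.slice(colonIndex + 1);
		if (LEVELS.includes(suffix)) {
			return { pattern: entry.slice(0, colonIndex), thinkingLevel: suffix as ThinkingLevel };
		}
	}
	// Assumed default when no suffix is given.
	return { pattern: entry, thinkingLevel: "off" };
}
```

Splitting on the *last* colon keeps provider-qualified patterns such as `openrouter/openai/gpt-5.1-codex:high` intact, since the provider and model segments contain no colons.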
**--thinking <level>**
Set the thinking level for reasoning-capable models. Valid values: `off`, `minimal`, `low`, `medium`, `high`. Takes precedence over all other thinking level sources (saved settings, `--models` pattern levels, session restore).
Examples:
- `--thinking high` - Start with high thinking level
- `--thinking off` - Disable thinking even if saved setting was different
**--help, -h**
Show help message
@@ -707,6 +722,13 @@ pi --provider openai --model gpt-4o "Help me refactor this code"
# Limit model cycling to specific models
pi --models claude-sonnet,claude-haiku,gpt-4o
# Now Ctrl+P cycles only through those models
# Model cycling with thinking levels
pi --models sonnet:high,haiku:low
# Starts with sonnet at high thinking, Ctrl+P switches to haiku at low thinking
# Start with specific thinking level
pi --thinking high "Solve this complex algorithm problem"
```
## Tools

@ -39,6 +39,7 @@ interface Args {
model?: string;
apiKey?: string;
systemPrompt?: string;
thinking?: ThinkingLevel;
continue?: boolean;
resume?: boolean;
help?: boolean;
@@ -82,6 +83,17 @@ function parseArgs(args: string[]): Args {
result.session = args[++i];
} else if (arg === "--models" && i + 1 < args.length) {
result.models = args[++i].split(",").map((s) => s.trim());
} else if (arg === "--thinking" && i + 1 < args.length) {
const level = args[++i];
if (level === "off" || level === "minimal" || level === "low" || level === "medium" || level === "high") {
result.thinking = level;
} else {
console.error(
chalk.yellow(
`Warning: Invalid thinking level "${level}". Valid values: off, minimal, low, medium, high`,
),
);
}
} else if (!arg.startsWith("-")) {
result.messages.push(arg);
}
@@ -107,6 +119,7 @@ ${chalk.bold("Options:")}
--session <path> Use specific session file
--no-session Don't save session (ephemeral)
--models <patterns> Comma-separated model patterns for quick cycling with Ctrl+P
--thinking <level> Set thinking level: off, minimal, low, medium, high
--help, -h Show this help
${chalk.bold("Examples:")}
@@ -131,6 +144,9 @@ ${chalk.bold("Examples:")}
# Cycle models with fixed thinking levels
pi --models sonnet:high,haiku:low
# Start with a specific thinking level
pi --thinking high "Solve this complex problem"
${chalk.bold("Environment Variables:")}
ANTHROPIC_API_KEY - Anthropic Claude API key
ANTHROPIC_OAUTH_TOKEN - Anthropic OAuth token (alternative to API key)
@@ -370,7 +386,38 @@ async function resolveModelScope(
}
}
// Find all models matching this pattern (case-insensitive partial match)
// Check for provider/modelId format (provider is everything before the first /)
const slashIndex = modelPattern.indexOf("/");
if (slashIndex !== -1) {
const provider = modelPattern.substring(0, slashIndex);
const modelId = modelPattern.substring(slashIndex + 1);
const providerMatch = availableModels.find(
(m) => m.provider.toLowerCase() === provider.toLowerCase() && m.id.toLowerCase() === modelId.toLowerCase(),
);
if (providerMatch) {
if (
!scopedModels.find(
(sm) => sm.model.id === providerMatch.id && sm.model.provider === providerMatch.provider,
)
) {
scopedModels.push({ model: providerMatch, thinkingLevel });
}
continue;
}
// No exact provider/model match - fall through to other matching
}
// Check for exact ID match (case-insensitive)
const exactMatch = availableModels.find((m) => m.id.toLowerCase() === modelPattern.toLowerCase());
if (exactMatch) {
// Exact match found - use it directly
if (!scopedModels.find((sm) => sm.model.id === exactMatch.id && sm.model.provider === exactMatch.provider)) {
scopedModels.push({ model: exactMatch, thinkingLevel });
}
continue;
}
// No exact match - fall back to partial matching
const matches = availableModels.filter(
(m) =>
m.id.toLowerCase().includes(modelPattern.toLowerCase()) ||
@@ -637,6 +684,12 @@ export async function main(args: string[]) {
process.exit(1);
}
initialModel = model;
// Also load the saved thinking level if we're using the saved model
const savedThinking = settingsManager.getDefaultThinkingLevel();
if (savedThinking) {
initialThinking = savedThinking;
}
}
}
@@ -769,6 +822,11 @@ export async function main(args: string[]) {
}
}
// CLI --thinking flag takes highest priority
if (parsed.thinking) {
initialThinking = parsed.thinking;
}
// Create agent (initialModel can be null in interactive mode)
const agent = new Agent({
initialState: {
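Taken together, the two snippets above form a precedence chain for the initial thinking level: the `--thinking` flag always wins, then the remaining sources fill in. A minimal sketch; the relative order of the non-CLI sources is an assumption for illustration, and the real code interleaves them with session-restore logic not shown here.

```typescript
type ThinkingLevel = "off" | "minimal" | "low" | "medium" | "high";

interface ThinkingSources {
	cliFlag?: ThinkingLevel; // --thinking, always wins
	savedDefault?: ThinkingLevel; // settings.defaultThinkingLevel
	patternLevel?: ThinkingLevel; // level from the matched --models pattern
}

// First defined source wins, falling back to "off".
function resolveThinkingLevel(sources: ThinkingSources): ThinkingLevel {
	return sources.cliFlag ?? sources.savedDefault ?? sources.patternLevel ?? "off";
}
```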

@@ -6,6 +6,7 @@ export interface Settings {
lastChangelogVersion?: string;
defaultProvider?: string;
defaultModel?: string;
defaultThinkingLevel?: "off" | "minimal" | "low" | "medium" | "high";
queueMode?: "all" | "one-at-a-time";
theme?: string;
}
@@ -98,4 +99,13 @@
this.settings.theme = theme;
this.save();
}
getDefaultThinkingLevel(): "off" | "minimal" | "low" | "medium" | "high" | undefined {
return this.settings.defaultThinkingLevel;
}
setDefaultThinkingLevel(level: "off" | "minimal" | "low" | "medium" | "high"): void {
this.settings.defaultThinkingLevel = level;
this.save();
}
}
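The accessor pair above follows the save-on-write pattern used throughout `SettingsManager`: every setter persists immediately, so a crash never loses a recorded preference. A self-contained stand-in (class name and file handling are illustrative, not pi's actual code):

```typescript
import * as fs from "node:fs";

type ThinkingLevel = "off" | "minimal" | "low" | "medium" | "high";

interface Settings {
	defaultThinkingLevel?: ThinkingLevel;
}

// Stand-in for the SettingsManager pattern: mutate, then write the
// whole settings object back to disk in one step.
class MiniSettings {
	private constructor(
		private file: string,
		private settings: Settings,
	) {}

	static load(file: string): MiniSettings {
		try {
			return new MiniSettings(file, JSON.parse(fs.readFileSync(file, "utf8")));
		} catch {
			// Missing or unreadable file: start from empty settings.
			return new MiniSettings(file, {});
		}
	}

	getDefaultThinkingLevel(): ThinkingLevel | undefined {
		return this.settings.defaultThinkingLevel;
	}

	setDefaultThinkingLevel(level: ThinkingLevel): void {
		this.settings.defaultThinkingLevel = level;
		fs.writeFileSync(this.file, JSON.stringify(this.settings, null, 2));
	}
}
```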

@@ -91,9 +91,6 @@ export class TuiRenderer {
// Model scope for quick cycling
private scopedModels: Array<{ model: Model<any>; thinkingLevel: ThinkingLevel }> = [];
// Track if user manually changed thinking (disables auto-thinking from model cycling)
private autoThinkingDisabled = false;
// Tool output expansion state
private toolOutputExpanded = false;
@@ -789,11 +786,9 @@
// Apply the new thinking level
this.agent.setThinkingLevel(nextLevel);
// Disable auto-thinking since user manually changed it
this.autoThinkingDisabled = true;
// Save thinking level change to session
// Save thinking level change to session and settings
this.sessionManager.saveThinkingLevelChange(nextLevel);
this.settingsManager.setDefaultThinkingLevel(nextLevel);
// Update border color
this.updateEditorBorderColor();
@@ -840,17 +835,16 @@
// Switch model
this.agent.setModel(nextModel);
// Apply thinking level if not disabled and model supports it
if (!this.autoThinkingDisabled && nextModel.reasoning) {
this.agent.setThinkingLevel(nextThinking);
this.sessionManager.saveThinkingLevelChange(nextThinking);
this.updateEditorBorderColor();
} else if (!this.autoThinkingDisabled && !nextModel.reasoning && nextThinking !== "off") {
// Model doesn't support thinking but user requested it - silently ignore
this.agent.setThinkingLevel("off");
this.sessionManager.saveThinkingLevelChange("off");
this.updateEditorBorderColor();
}
// Save model change to session and settings
this.sessionManager.saveModelChange(nextModel.provider, nextModel.id);
this.settingsManager.setDefaultModelAndProvider(nextModel.provider, nextModel.id);
// Apply thinking level (silently use "off" if model doesn't support thinking)
const effectiveThinking = nextModel.reasoning ? nextThinking : "off";
this.agent.setThinkingLevel(effectiveThinking);
this.sessionManager.saveThinkingLevelChange(effectiveThinking);
this.settingsManager.setDefaultThinkingLevel(effectiveThinking);
this.updateEditorBorderColor();
// Show notification
this.chatContainer.addChild(new Spacer(1));
@@ -874,7 +868,7 @@
if (availableModels.length === 1) {
this.chatContainer.addChild(new Spacer(1));
this.chatContainer.addChild(new Text(theme.fg("dim", "Only one model in scope"), 1, 0));
this.chatContainer.addChild(new Text(theme.fg("dim", "Only one model available"), 1, 0));
this.ui.requestRender();
return;
}
@@ -902,6 +896,10 @@
// Switch model
this.agent.setModel(nextModel);
// Save model change to session and settings
this.sessionManager.saveModelChange(nextModel.provider, nextModel.id);
this.settingsManager.setDefaultModelAndProvider(nextModel.provider, nextModel.id);
// Show notification
this.chatContainer.addChild(new Spacer(1));
this.chatContainer.addChild(new Text(theme.fg("dim", `Switched to ${nextModel.name || nextModel.id}`), 1, 0));
@@ -949,11 +947,9 @@
// Apply the selected thinking level
this.agent.setThinkingLevel(level);
// Disable auto-thinking since user manually changed it
this.autoThinkingDisabled = true;
// Save thinking level change to session
// Save thinking level change to session and settings
this.sessionManager.saveThinkingLevelChange(level);
this.settingsManager.setDefaultThinkingLevel(level);
// Update border color
this.updateEditorBorderColor();
@@ -1107,9 +1103,6 @@
// Apply the selected model
this.agent.setModel(model);
// Clear scoped models since user manually selected a model
this.scopedModels = [];
// Save model change to session
this.sessionManager.saveModelChange(model.provider, model.id);