Mirror of https://github.com/getcompanion-ai/co-mono.git (synced 2026-04-15 18:01:22 +00:00)
Session File Format
Sessions are stored as JSONL (JSON Lines) files. Each line is a JSON object with a `type` field.
File Location
~/.pi/agent/sessions/--<path>--/<timestamp>_<uuid>.jsonl
Where `<path>` is the working directory with `/` replaced by `-`.
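As a rough sketch of the naming scheme, a hypothetical helper (not part of the source; it assumes the `--` wrappers in the template above are literal and the leading `/` of the cwd is dropped) might look like:

```typescript
// Hypothetical helper illustrating the session-directory naming scheme:
// drop the leading "/", replace each remaining "/" with "-", and wrap
// the result in the literal "--" shown in the path template.
function sessionDirName(cwd: string): string {
  return `--${cwd.replace(/^\//, "").replace(/\//g, "-")}--`;
}

console.log(sessionDirName("/path/to/project")); // → "--path-to-project--"
```

The exact encoding (e.g. handling of the leading slash) may differ in the actual implementation; consult `src/session-manager.ts` for the authoritative logic.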
Type Definitions
`src/session-manager.ts` - Session entry types (`SessionHeader`, `SessionMessageEntry`, etc.)
`packages/agent/src/types.ts` - `AppMessage`, `Attachment`, `ThinkingLevel`
`packages/ai/src/types.ts` - `UserMessage`, `AssistantMessage`, `ToolResultMessage`, `Usage`, `ToolCall`
Entry Types
SessionHeader
First line of the file. Defines session metadata.
{"type":"session","id":"uuid","timestamp":"2024-12-03T14:00:00.000Z","cwd":"/path/to/project","provider":"anthropic","modelId":"claude-sonnet-4-5","thinkingLevel":"off"}
For branched sessions, includes the source session path:
{"type":"session","id":"uuid","timestamp":"2024-12-03T14:00:00.000Z","cwd":"/path/to/project","provider":"anthropic","modelId":"claude-sonnet-4-5","thinkingLevel":"off","branchedFrom":"/path/to/original/session.jsonl"}
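A sketch of the header shape, inferred from the two examples above (the authoritative definition lives in `src/session-manager.ts`; field types here are assumptions):

```typescript
// Inferred from the JSON examples; not the authoritative definition.
interface SessionHeader {
  type: "session";
  id: string;            // session UUID
  timestamp: string;     // ISO 8601
  cwd: string;
  provider: string;
  modelId: string;
  thinkingLevel: string; // e.g. "off", "high"
  branchedFrom?: string; // present only on branched sessions
}

const header: SessionHeader = JSON.parse(
  '{"type":"session","id":"uuid","timestamp":"2024-12-03T14:00:00.000Z","cwd":"/path/to/project","provider":"anthropic","modelId":"claude-sonnet-4-5","thinkingLevel":"off"}'
);
console.log(header.provider, header.modelId); // → anthropic claude-sonnet-4-5
```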
SessionMessageEntry
A message in the conversation. The message field contains an AppMessage (see rpc.md).
{"type":"message","timestamp":"2024-12-03T14:00:01.000Z","message":{"role":"user","content":"Hello","timestamp":1733234567890}}
{"type":"message","timestamp":"2024-12-03T14:00:02.000Z","message":{"role":"assistant","content":[{"type":"text","text":"Hi!"}],"api":"anthropic-messages","provider":"anthropic","model":"claude-sonnet-4-5","usage":{...},"stopReason":"stop","timestamp":1733234567891}}
{"type":"message","timestamp":"2024-12-03T14:00:03.000Z","message":{"role":"toolResult","toolCallId":"call_123","toolName":"bash","content":[{"type":"text","text":"output"}],"isError":false,"timestamp":1733234567900}}
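Because each entry is a single JSON object per line, appending to a session is a one-line write. A minimal sketch (the `appendEntry` helper and the `demo-session.jsonl` filename are hypothetical, not from the source):

```typescript
import { appendFileSync, readFileSync } from "fs";

// Hypothetical helper: serialize one entry per line, matching the JSONL
// layout described above.
function appendEntry(path: string, entry: object): void {
  appendFileSync(path, JSON.stringify(entry) + "\n");
}

appendEntry("demo-session.jsonl", {
  type: "message",
  timestamp: new Date().toISOString(),
  message: { role: "user", content: "Hello", timestamp: Date.now() },
});

// Round-trip: the last line parses back into the same entry shape.
const last = readFileSync("demo-session.jsonl", "utf8").trim().split("\n").pop()!;
console.log(JSON.parse(last).message.role); // → user
```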
ModelChangeEntry
Emitted when the user switches models mid-session.
{"type":"model_change","timestamp":"2024-12-03T14:05:00.000Z","provider":"openai","modelId":"gpt-4o"}
ThinkingLevelChangeEntry
Emitted when the user changes the thinking/reasoning level.
{"type":"thinking_level_change","timestamp":"2024-12-03T14:06:00.000Z","thinkingLevel":"high"}
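The four entry types form a natural discriminated union on `type`, which is how a TypeScript consumer would typically model them. The union below is inferred from the examples in this document, not copied from `src/session-manager.ts`:

```typescript
// Inferred discriminated union over the four entry types shown above.
type SessionEntry =
  | { type: "session"; id: string; timestamp: string; cwd: string;
      provider: string; modelId: string; thinkingLevel: string; branchedFrom?: string }
  | { type: "message"; timestamp: string; message: unknown }
  | { type: "model_change"; timestamp: string; provider: string; modelId: string }
  | { type: "thinking_level_change"; timestamp: string; thinkingLevel: string };

const entry: SessionEntry = JSON.parse(
  '{"type":"thinking_level_change","timestamp":"2024-12-03T14:06:00.000Z","thinkingLevel":"high"}'
);
if (entry.type === "thinking_level_change") {
  // The "type" discriminant narrows the union, so thinkingLevel is accessible.
  console.log(entry.thinkingLevel); // → high
}
```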
Parsing Example
import { readFileSync } from "fs";

const lines = readFileSync("session.jsonl", "utf8").trim().split("\n");
for (const line of lines) {
  const entry = JSON.parse(line);
  switch (entry.type) {
    case "session":
      console.log(`Session: ${entry.id}, Model: ${entry.provider}/${entry.modelId}`);
      break;
    case "message":
      console.log(`${entry.message.role}: ${JSON.stringify(entry.message.content)}`);
      break;
    case "model_change":
      console.log(`Switched to: ${entry.provider}/${entry.modelId}`);
      break;
    case "thinking_level_change":
      console.log(`Thinking: ${entry.thinkingLevel}`);
      break;
  }
}