
Conversation

@jif-oai (Contributor) commented Sep 11, 2025

Compact feature (a rough sketch follows the list):

  1. Stops the model when the context window becomes too large
  2. Adds a user turn asking the model to summarize
  3. Builds a bridge that contains all the previous user messages plus the summary, rendered from a template
  4. Starts sampling again from a clean conversation with only that bridge
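
A minimal, hypothetical sketch of that flow in Rust; the `Session` and `Message` types, `SUMMARIZE_PROMPT`, and the bridge template below are illustrative stand-ins, not the actual codex-rs code.

```rust
// A minimal sketch of the compaction flow, assuming hypothetical `Session`
// and `Message` types; the real prompt and bridge template live in codex-rs.

const SUMMARIZE_PROMPT: &str =
    "Summarize the conversation so far, keeping open tasks and key decisions.";

struct Message {
    role: String,
    content: String,
}

struct Session {
    history: Vec<Message>,
    token_count: usize,
    token_limit: usize,
}

impl Session {
    /// 1. Stop normal sampling once the context window grows too large.
    fn should_compact(&self) -> bool {
        self.token_count >= self.token_limit
    }

    /// Steps 2-4: ask for a summary, build the bridge, restart from it.
    fn compact(&mut self, summarize: impl Fn(&[Message]) -> String) {
        // Remember the user messages from before the summarization turn.
        let previous_user_messages: Vec<String> = self
            .history
            .iter()
            .filter(|m| m.role == "user")
            .map(|m| m.content.clone())
            .collect();

        // 2. Add a user turn asking the model to summarize.
        self.history.push(Message {
            role: "user".into(),
            content: SUMMARIZE_PROMPT.into(),
        });
        let summary = summarize(&self.history);

        // 3. Build the bridge: previous user messages plus the summary,
        //    rendered here from a hard-coded stand-in template.
        let bridge = format!(
            "Earlier user messages:\n{}\n\nSummary of prior work:\n{}",
            previous_user_messages.join("\n"),
            summary
        );

        // 4. Start sampling again from a clean conversation containing
        //    only that bridge.
        self.history = vec![Message { role: "user".into(), content: bridge }];
        self.token_count = 0; // recomputed when the next request is sent
    }
}
```

In practice the caller would check `should_compact()` after each turn and call `compact()` with a model-backed summarizer.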

@chatgpt-codex-connector bot (Contributor) left a comment


Codex Review: Here are some suggestions.

Reply with @codex fix comments to fix any unresolved comments.

About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you open a pull request for review, mark a draft as ready, or comment "@codex review". If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex fix this CI failure" or "@codex address that feedback".

@jif-oai requested a review from bolinfest September 11, 2025 01:51
```rust
    sess: Arc<Session>,
    turn_context: Arc<TurnContext>,
) {
    let sub_id = sess.next_internal_sub_id();
```
Collaborator
How much do we care about this being an ordered id vs. just appending a random suffix (for the sake of keeping session state smaller)?
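
As a rough, hypothetical illustration of the trade-off (none of this is the actual codex-rs implementation): an ordered id needs a counter kept on the session, while a random suffix needs no stored state but loses ordering.

```rust
use std::sync::atomic::{AtomicU64, Ordering};

// Illustrative only: the real Session and its id scheme live in codex-rs.
struct Session {
    // This counter is the extra piece of session state the comment asks about.
    internal_sub_id_counter: AtomicU64,
}

impl Session {
    /// Ordered ids: they sort by creation order, at the cost of one counter.
    fn next_ordered_sub_id(&self) -> String {
        let n = self.internal_sub_id_counter.fetch_add(1, Ordering::Relaxed);
        format!("auto-compact-{n}")
    }

    /// Random suffix: nothing extra stored on the session, but no ordering.
    fn next_random_sub_id(&self) -> String {
        // Time-derived to keep the sketch dependency-free; a real version
        // would more likely use a UUID or a proper RNG.
        let nanos = std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .map(|d| d.subsec_nanos())
            .unwrap_or(0);
        format!("auto-compact-{nanos:08x}")
    }
}
```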

@pakrym-oai (Collaborator)
We are including a copy of the environment information and AGENTS.md in the summary message.

@pakrym-oai (Collaborator)
If compaction doesn't get the session under the limit (large user input), the agent will try to compact again.

```rust
    });
    break;
}
sess.enqueue_pending_responses(responses);
```
Collaborator
Can't figure out why this had to change?

@jif-oai (Contributor, Author)
I don't understand how it was working before lol

@pakrym-oai (Collaborator) left a comment

We should avoid duplicating the environment context and AGENTS.md between the summary and the normal session prefix.

If possible, can we reduce the amount of state we store on SessionState? I think we can calculate token_limit_reached inline and get away without a compaction counter.

We should keep total token usage across compactions.

If compaction doesn't get the session under the limit (large user input), the agent will try to compact again and again until it runs out of context.
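
A small, hypothetical sketch of those last two points: token_limit_reached computed inline from existing fields rather than stored, and a guard that refuses to compact twice in a row so an oversized user input cannot trigger an endless loop. The field and method names are illustrative, not the codex-rs API.

```rust
struct TokenUsage {
    /// Cumulative total across the whole session; preserved across compactions.
    total_tokens: u64,
    /// Tokens currently occupying the model's context window.
    context_tokens: u64,
}

struct SessionState {
    usage: TokenUsage,
    context_window: u64,
    just_compacted: bool,
}

impl SessionState {
    /// Derived on the fly from existing fields; nothing extra to store.
    fn token_limit_reached(&self) -> bool {
        self.usage.context_tokens >= self.context_window
    }

    /// Only compact when we are over the limit and the previous attempt
    /// actually helped; otherwise surface an error instead of looping.
    fn should_compact(&self) -> bool {
        self.token_limit_reached() && !self.just_compacted
    }

    fn record_turn(&mut self, tokens: u64) {
        self.usage.total_tokens += tokens;
        self.usage.context_tokens += tokens;
        self.just_compacted = false;
    }

    fn record_compaction(&mut self, new_context_tokens: u64) {
        // Keep the cumulative total; only the live context shrinks.
        self.usage.context_tokens = new_context_tokens;
        self.just_compacted = true;
    }
}
```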

@pakrym-oai (Collaborator) left a comment

LGTM! The last question I have is about #3446 (comment)

@jif-oai merged commit ea225df into main Sep 12, 2025
19 checks passed
@jif-oai deleted the jif/compact branch September 12, 2025 20:07
@github-actions bot locked and limited conversation to collaborators Sep 12, 2025