
How Amazon Q Developer CLI Manages Context Files

A look behind the scenes of the context management system of Q CLI


Ever wondered how Amazon Q CLI seamlessly incorporates your project files into AI conversations? Let us decode what’s happening behind the scenes of the /context command.

The Context Management System of Q CLI

Q CLI ships with an internal context management system built around a few key components. Tracked files fall into two categories:

  • 🤖 Agent paths: Files defined in agent configuration (persistent)

  • 📁 Session paths: Files added via /context add (temporary)

Agent paths are persistent simply because they are defined in the agent configuration file, not because of any special caching mechanism; session context changes do not survive between chat sessions. In both cases, files are read fresh for each request (no caching), and filesystem permissions are respected.
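To make the agent side concrete, here is a sketch of what an agent configuration with persistent context paths might look like. Treat the exact schema (field names like resources and the file:// prefix) as an assumption about Q CLI's agent format rather than a verbatim example:

```json
{
  "name": "docs-helper",
  "description": "Hypothetical agent whose context paths persist across sessions",
  "resources": [
    "file://README.md",
    "file://docs/**/*.md"
  ]
}
```

Because these paths live in the agent's configuration file, they are re-applied every time you start a chat with that agent, unlike paths added with /context add.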

From command to context

When you use the "/context add" command, Q CLI:

  • Validates paths exist (unless --force is used)

  • Expands glob patterns (e.g., *.py, src/**/*.js)

  • Adds paths as session entries
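The steps above can be sketched in a few lines of Rust. This is a simplified, hypothetical model of the flow (the types and method names are not from the Q CLI source): validate that a literal path exists unless --force is given, let glob patterns through for later expansion, and record the path as a session entry:

```rust
use std::path::Path;

/// Hypothetical session context store; not the real Q CLI type.
struct SessionContext {
    paths: Vec<String>,
}

impl SessionContext {
    fn new() -> Self {
        SessionContext { paths: Vec::new() }
    }

    /// Mirrors the /context add steps: validate the path exists
    /// (unless forced or it is a glob pattern), then record it.
    fn add(&mut self, path: &str, force: bool) -> Result<(), String> {
        // Glob patterns are expanded later; skip the existence check here.
        let is_glob = path.contains('*');
        if !force && !is_glob && !Path::new(path).exists() {
            return Err(format!("path does not exist: {}", path));
        }
        self.paths.push(path.to_string());
        Ok(())
    }
}
```

In the real implementation the glob expansion and bookkeeping are richer, but the shape of the flow is the same: validation first, then the path joins the session's entry list.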

Now for the most important part: how files become part of your conversation. Each batch of context files is wrapped between the special CONTEXT_ENTRY_START_HEADER and CONTEXT_ENTRY_END_HEADER markers and injected as specially formatted content into the conversation context sent to the AI model:

--- CONTEXT ENTRY BEGIN ---
[src/main.py] 
def main(): 
print("Hello, World!")

[src/utils.py] 
def helper_function(): 
return "utility" 
--- CONTEXT ENTRY END ---

The implementation can be found in the context_messages method within conversation.rs:

async fn context_messages(
    &mut self,
    os: &Os,
    additional_context: Option<String>,
) -> (Option<Vec<HistoryEntry>>, Vec<(String, String)>) {
    let mut context_content = String::new();
    let mut dropped_context_files = Vec::new();
    if let Some((summary, _)) = &self.latest_summary {
        context_content.push_str(CONTEXT_ENTRY_START_HEADER);
        context_content.push_str("This summary contains ALL relevant information from our previous conversation including tool uses, results, code analysis, and file operations. YOU MUST reference this information when answering questions and explicitly acknowledge specific details from the summary when they're relevant to the current question.\n\n");
        context_content.push_str("SUMMARY CONTENT:\n");
        context_content.push_str(summary);
        context_content.push('\n');
        context_content.push_str(CONTEXT_ENTRY_END_HEADER);
    }

    // Add context files if available
    if let Some(context_manager) = self.context_manager.as_mut() {
        match context_manager.collect_context_files_with_limit(os).await {
            Ok((files_to_use, files_dropped)) => {
                if !files_dropped.is_empty() {
                    dropped_context_files.extend(files_dropped);
                }

                if !files_to_use.is_empty() {
                    context_content.push_str(CONTEXT_ENTRY_START_HEADER);
                    for (filename, content) in files_to_use {
                        context_content.push_str(&format!("[{}]\n{}\n", filename, content));
                    }
                    context_content.push_str(CONTEXT_ENTRY_END_HEADER);
                }
            },
            Err(e) => {
                warn!("Failed to get context files: {}", e);
            },
        }
    }

    if let Some(context) = additional_context {
        context_content.push_str(&context);
    }

    if let Some(agent_prompt) = self.agents.get_active().and_then(|a| a.prompt.as_ref()) {
        context_content.push_str(&format!("Follow this instruction: {}", agent_prompt));
    }

    if !context_content.is_empty() {
        self.context_message_length = Some(context_content.len());
        let user = UserMessage::new_prompt(context_content, None);
        let assistant = AssistantMessage::new_response(None, "I will fully incorporate this information when generating my responses, and explicitly acknowledge relevant parts of the summary when answering questions.".into());
        (
            Some(vec![HistoryEntry {
                user,
                assistant,
                request_metadata: None,
            }]),
            dropped_context_files,
        )
    } else {
        (None, dropped_context_files)
    }
}

Context files are sorted alphabetically and deduplicated by filename. Hooks can also contribute context content using the same header format. Q CLI additionally implements token management for your context files:

  • It uses 75% of the model's context window for files

  • Automatically drops the largest files when limits are exceeded

  • Provides warnings when files are dropped
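The drop-largest-first behavior can be sketched as a small Rust function. This is an illustration of the policy described above, not the actual Q CLI code, and the token counts are stand-ins:

```rust
/// Given files as (filename, estimated token count) pairs and a budget
/// (e.g. 75% of the model's context window), repeatedly drop the
/// largest file until the remaining ones fit. Hypothetical helper,
/// not the real implementation.
fn drop_largest_until_fit(
    mut files: Vec<(String, usize)>,
    budget: usize,
) -> (Vec<(String, usize)>, Vec<String>) {
    let mut dropped = Vec::new();
    while files.iter().map(|(_, n)| n).sum::<usize>() > budget {
        // Find and remove the largest file, recording its name
        // so the caller can warn the user.
        if let Some(i) = (0..files.len()).max_by_key(|&i| files[i].1) {
            let (name, _) = files.remove(i);
            dropped.push(name);
        } else {
            break;
        }
    }
    (files, dropped)
}
```

Dropping the largest files first frees the most budget per eviction, which is why a single oversized file can be removed while several smaller ones survive.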

These are the key implementation details worth knowing when managing your context.

The next time you use /context add, you will know there is a system working behind the scenes to make your files seamlessly available to the underlying AI model - all while keeping your conversation flowing smoothly.