# Context Optimizer MCP Server

## Overview

A Model Context Protocol (MCP) server that provides context optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants. It enables AI assistants to extract targeted information rather than processing large files and command outputs in their entirety.
## Features
- 🔍 File Analysis Tool (`askAboutFile`) - Extract specific information from files without loading entire contents
- 🖥️ Terminal Execution Tool (`runAndExtract`) - Execute commands and extract relevant information using LLM analysis
- ❓ Follow-up Questions Tool (`askFollowUp`) - Continue conversations about previous terminal executions
- 🔬 Research Tools (`researchTopic`, `deepResearch`) - Conduct web research using Exa.ai's API
- 🔒 Security Controls - Path validation, command filtering, and session management
- 🔧 Multi-LLM Support - Works with Google Gemini, Claude (Anthropic), and OpenAI
- ⚙️ Environment Variable Configuration - API key management through system environment variables
- 🏗️ Simple Configuration - Environment variables only, no config files to manage
- 🧪 Comprehensive Testing - Unit tests, integration tests, and security validation
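Since configuration is done entirely through environment variables, setup might look like the sketch below. The variable names shown are assumptions based on the providers listed above, not the server's documented keys — check the project's documentation for the exact names it reads.

```shell
# Hypothetical variable names for the supported providers --
# verify the exact keys against the project's documentation.
export GEMINI_API_KEY="your-gemini-key"        # Google Gemini
export ANTHROPIC_API_KEY="your-anthropic-key"  # Claude (Anthropic)
export OPENAI_API_KEY="your-openai-key"        # OpenAI
export EXA_API_KEY="your-exa-key"              # Exa.ai research tools
```

Putting these in your shell profile (or your MCP client's `env` section) keeps keys out of config files checked into version control.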
## Server Config

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "context-optimizer-mcp"
    }
  }
}
```
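Once the server is registered, an MCP client invokes its tools over the protocol's JSON-RPC `tools/call` method. A sketch of what a call to `askAboutFile` might look like — the argument names (`filePath`, `question`) are illustrative assumptions, not the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "askAboutFile",
    "arguments": {
      "filePath": "src/server.ts",
      "question": "Which environment variables does this file read?"
    }
  }
}
```

The server would answer the question about the file's contents rather than returning the whole file, which is the context saving the tools are designed for.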