Share code context with LLMs via MCP or clipboard
Run the underlying MCP server directly, without Agent-CoreX routing:

npx -y llm-context.py

Agent-CoreX intelligently routes only the tools from cyberchitta/llm-context.py that your AI agent actually needs per request, reducing token usage and cost.
agent-corex.json

{
  "mcpServers": {
    "llm-context.py": {
      "command": "npx",
      "args": [
        "-y",
        "llm-context.py"
      ]
    }
  }
}

Connect in 2 minutes. Only pay for the tools your agent actually uses.
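The config entry above can also be built or merged programmatically, for example when scripting setup across machines. A minimal sketch; the helper name and the in-memory merge are illustrative, not part of Agent-CoreX:

```python
import json

def add_mcp_server(config: dict, name: str, command: str, args: list) -> dict:
    """Insert an MCP server entry in the shape the agent-corex.json example uses."""
    config.setdefault("mcpServers", {})[name] = {
        "command": command,
        "args": args,
    }
    return config

# Build the same entry shown above.
config = add_mcp_server({}, "llm-context.py", "npx", ["-y", "llm-context.py"])
print(json.dumps(config, indent=2))
```

Writing the result back to agent-corex.json (rather than editing by hand) keeps existing server entries intact, since the helper only adds or overwrites the named key.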
Recommended: intelligent tool routing, lower costs
uvx agent-corex mcp add llm-context.py

Agent-CoreX selects only the tools your agent needs, cutting token usage by up to 60%.
Use cyberchitta/llm-context.py directly without routing
npx -y llm-context.py

New to Agent-CoreX? View setup guide →