MCP server to access and manage LLM application prompts created with [Langfuse](https://langfuse.com/docs/prompts/get-started) Prompt Management.
Run the underlying MCP server directly, without Agent-CoreX routing:

```shell
npx -y mcp-server-langfuse
```

Agent-CoreX intelligently routes which tools from langfuse/mcp-server-langfuse your AI agent actually needs per request, reducing token usage and cost.
`agent-corex.json`

```json
{
  "mcpServers": {
    "mcp-server-langfuse": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-server-langfuse"
      ]
    }
  }
}
```

Connect in 2 minutes. Only pay for the tools your agent actually uses.
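The config above only launches the server; it also needs Langfuse API credentials to fetch your prompts. A minimal sketch of the same entry with credentials passed through the environment, assuming the server reads the standard Langfuse SDK variables (`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, `LANGFUSE_BASEURL`) and that the config format supports an `env` field; the key values shown are placeholders, and the exact variable names should be checked against the langfuse/mcp-server-langfuse README:

```json
{
  "mcpServers": {
    "mcp-server-langfuse": {
      "command": "npx",
      "args": ["-y", "mcp-server-langfuse"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-lf-your-public-key",
        "LANGFUSE_SECRET_KEY": "sk-lf-your-secret-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}
```

Self-hosted Langfuse deployments would point `LANGFUSE_BASEURL` at their own instance instead of the cloud URL.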
Recommended: intelligent tool routing, lower costs

```shell
uvx agent-corex mcp add mcp-server-langfuse
```

Agent-CoreX selects only the tools your agent needs, cutting token usage by up to 60%.
Use langfuse/mcp-server-langfuse directly without routing

```shell
npx -y mcp-server-langfuse
```

New to Agent-CoreX? View setup guide →