
jaspertvdm/mcp-server-ollama-bridge


Bridge to a local Ollama LLM server. Run Llama, Mistral, Qwen, and other local models through MCP.
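Because the bridge talks to a locally running Ollama server, it is worth confirming Ollama is reachable before wiring it into MCP. A minimal sketch using Ollama's documented `/api/tags` endpoint (the default host `localhost:11434` is Ollama's standard port; the function name and timeout are illustrative):

```python
import json
import urllib.request


def ollama_models(host="http://localhost:11434", timeout=2):
    """Return the model names a running Ollama server reports via its
    /api/tags endpoint, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            return [m["name"] for m in json.load(resp)["models"]]
    except OSError:  # connection refused, timeout, DNS failure, ...
        return None


if __name__ == "__main__":
    models = ollama_models()
    if models is None:
        print("Ollama is not reachable on localhost:11434")
    else:
        print("Available models:", models)
```

If this returns `None`, start Ollama and pull at least one model before installing the bridge.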

Direct install

Run the underlying MCP server directly, without Agent-CoreX routing.

npx -y mcp-server-ollama-bridge

Use with Agent-CoreX

Agent-CoreX routes only the tools from jaspertvdm/mcp-server-ollama-bridge that your AI agent actually needs for each request, reducing token usage and cost.

agent-corex.json

{
  "mcpServers": {
    "mcp-server-ollama-bridge": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-server-ollama-bridge"
      ]
    }
  }
}
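If your Ollama instance is not listening on the default `http://localhost:11434`, the standard Ollama client convention is the `OLLAMA_HOST` environment variable. Assuming this bridge follows that convention (not confirmed by its documentation), you could pass it through the config's `env` block; the host address below is a placeholder:

```json
{
  "mcpServers": {
    "mcp-server-ollama-bridge": {
      "command": "npx",
      "args": ["-y", "mcp-server-ollama-bridge"],
      "env": {
        "OLLAMA_HOST": "http://192.168.1.50:11434"
      }
    }
  }
}
```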


Installation Guide

Use with Agent-CoreX

Recommended: intelligent tool routing, lower costs

uvx agent-corex mcp add mcp-server-ollama-bridge

Agent-CoreX exposes only the tools your agent needs, cutting token usage by up to 60%.

Direct Install

Use jaspertvdm/mcp-server-ollama-bridge directly without routing

npx -y mcp-server-ollama-bridge

New to Agent-CoreX? View setup guide →
