MCP LLM Bridge

MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs

Description:
A bridge connecting MCP servers to OpenAI-compatible LLMs through function calling

Category: Integration & Connectivity

Overview: This implementation provides a bidirectional protocol translation layer that enables communication between MCP servers and various LLM endpoints. It primarily targets the OpenAI API, but it also works with local endpoints that implement the OpenAI API spec, such as Ollama and LM Studio.

Key features:

  • Converts MCP tool specs into OpenAI function schemas (see the sketch after this list)
  • Maps function invocations to MCP tool executions
  • Compatible with cloud and local LLM implementations
  • Supports multiple LLM endpoints
  • Includes test database functionality
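
To make the translation concrete, below is a minimal sketch of the two mappings in plain Python. The MCP field names (name, description, inputSchema) come from the MCP tool specification, and the tools/tool-call shapes from the OpenAI chat completions API; the helper functions themselves are illustrative, not the bridge's actual internals.

# Sketch: translating between MCP tool specs and OpenAI function-calling schemas.
# Hypothetical helpers; the bridge's real internals may differ.

import json

def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert an MCP tool spec into an OpenAI 'tools' entry."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP input schemas are JSON Schema, which is what OpenAI expects.
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

def openai_call_to_mcp(tool_call: dict) -> tuple[str, dict]:
    """Map an OpenAI tool call back to an MCP tool name plus arguments."""
    fn = tool_call["function"]
    # OpenAI returns arguments as a JSON string, so decode before dispatching.
    return fn["name"], json.loads(fn["arguments"])

# Example with a SQLite-style tool spec:
spec = {
    "name": "read_query",
    "description": "Run a SELECT query against the database",
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}
print(mcp_tool_to_openai(spec))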

Installation:

# Install the uv package manager
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the repository and create a virtual environment
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate

# Install the package in editable mode
uv pip install -e .

Configuration Example:

{
  "mcp-llm-bridge": {
    "command": "uvx",
    "args": ["mcp-server-sqlite", "--db-path", "test.db"],
    "env": {
      "OPENAI_API_KEY": "your_key",
      "OPENAI_MODEL": "gpt-4o"
    }
  }
}
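
In this example, the bridge launches the reference SQLite MCP server (via uvx) against the bundled test.db, while the env block supplies the OpenAI credentials and model name.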

Supported LLM Endpoints:

  • OpenAI API (Primary)
  • Ollama (Local)
  • LM Studio (Local)
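
For a local endpoint, the same configuration shape should carry over with the endpoint's OpenAI-compatible base URL and a local model name. Note that the variable name OPENAI_API_BASE is an assumption here (check the repo for the exact override the bridge expects); the URL itself is Ollama's default OpenAI-compatible endpoint, which accepts any non-empty API key.

{
  "mcp-llm-bridge": {
    "command": "uvx",
    "args": ["mcp-server-sqlite", "--db-path", "test.db"],
    "env": {
      "OPENAI_API_KEY": "ollama",
      "OPENAI_API_BASE": "http://localhost:11434/v1",
      "OPENAI_MODEL": "llama3.1"
    }
  }
}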

Testing:

uv pip install -e ".[test]"
python -m pytest -v tests/
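
As a sanity check beyond the bundled suite, a self-contained test of the schema translation from the earlier sketch might look like the following. The helper is inlined so the file runs on its own; none of these names come from the project's actual test suite.

# test_translation_sketch.py -- hypothetical test, not part of the repo's tests/.

def mcp_tool_to_openai(tool: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

def test_schema_shape():
    spec = {"name": "read_query", "inputSchema": {"type": "object", "properties": {}}}
    converted = mcp_tool_to_openai(spec)
    assert converted["type"] == "function"
    assert converted["function"]["name"] == "read_query"
    assert converted["function"]["parameters"]["type"] == "object"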

Usage:

python -m mcp_llm_bridge.main
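
Assuming the bridge reads its OpenAI settings from the environment (as the env block in the configuration example suggests), a typical launch from the activated virtual environment might look like:

export OPENAI_API_KEY=your_key
export OPENAI_MODEL=gpt-4o
python -m mcp_llm_bridge.main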

Licensed under the MIT License. Compatible with OpenAI models and with local OpenAI-compatible implementations.