MCP Bridge

A middleware that provides an OpenAI-compatible endpoint capable of calling MCP tools


Overview

MCP-Bridge

Description:
A middleware providing OpenAI-compatible endpoints for MCP tool integration

Category: Integration Tools / API Bridge

Overview: MCP-Bridge enables seamless integration between the OpenAI API and MCP tools, allowing developers to use any OpenAI-compatible client with MCP tools. Key capabilities include (a client-side sketch follows this list):

  • Non-streaming and streaming chat completions with MCP
  • Non-streaming completions without MCP
  • Support for inference engines with tool call capabilities (tested with vLLM)
  • OpenAI API compatibility layer
  • Configurable tool management
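
Because the bridge exposes an OpenAI-compatible API, any standard OpenAI SDK should be able to talk to it unchanged. A minimal non-streaming sketch using the Python openai package, assuming the bridge listens on port 9090 as in the configuration example below; the model name and prompt are placeholders:

from openai import OpenAI

# Point the OpenAI SDK at the bridge instead of the upstream API.
# The base_url/port match the configuration example below; "None" mirrors
# the api_key placeholder used there.
client = OpenAI(base_url="http://localhost:9090/v1", api_key="None")

response = client.chat.completions.create(
    model="your-model-name",  # placeholder: whatever your inference engine serves
    messages=[{"role": "user", "content": "Fetch https://example.com and summarize it."}],
)
print(response.choices[0].message.content)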

Installation Options:

  1. Docker Installation (Recommended):
git clone [repository]
# Edit compose.yml
docker-compose up --build -d
  2. Manual Installation:
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
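
Once the bridge is running (for example via Docker Compose), a quick way to confirm it is reachable is to request its built-in documentation endpoint, using the port from the configuration example below. A minimal check with the Python standard library:

import urllib.request

# Assumes the bridge is listening on port 9090 as in the configuration example;
# adjust the URL if you change the network settings.
with urllib.request.urlopen("http://localhost:9090/docs") as resp:
    print(resp.status)  # 200 means the bridge is up and serving its docs page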

Configuration Example:

{
  "inference_server": {
    "base_url": "http://localhost:8000/v1",
    "api_key": "None"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  },
  "network": {
    "host": "0.0.0.0",
    "port": 9090
  }
}
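
In this example, inference_server points the bridge at the backing OpenAI-compatible inference engine (such as a vLLM server on port 8000), mcp_servers declares the MCP servers the bridge should launch and expose as tools (here, the reference fetch server run via uvx), and network controls the host and port the bridge itself listens on.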

Key Features:

  • Tool definitions management
  • Request forwarding to inference engine
  • Tool call handling
  • Response modification and processing
  • Flexible configuration options
  • Built-in documentation endpoint (/docs)
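
Streaming works through the same endpoint by setting stream=True; the intent is that any MCP tool calls are resolved by the bridge rather than the client, which simply consumes the streamed deltas. A rough sketch under the same assumptions as the earlier example (placeholder model name, port 9090 from the configuration above):

from openai import OpenAI

client = OpenAI(base_url="http://localhost:9090/v1", api_key="None")

# Stream a chat completion; tool use, if any, is handled bridge-side.
stream = client.chat.completions.create(
    model="your-model-name",  # placeholder
    messages=[{"role": "user", "content": "What is on the front page of example.com?"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()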

Licensed under the MIT License