MCP Client CLI
A simple CLI to run LLM prompts and act as an MCP client.
Description: A lightweight CLI tool for running LLM prompts with MCP server compatibility.
Category: CLI Tools & Utilities
Overview: This CLI client lets you run LLM prompts from your terminal with support for multiple MCP-compatible servers. Key features:
- Basic prompt handling via command line or pipe input
- Conversation continuity with the 'c' prefix
- Tool integration (Brave Search, YouTube, web fetch)
- Configurable system prompts and LLM providers
- Support for OpenAI, Groq, and local LLM models via llama
Installation:
pip install git+https://github.com/adhikasp/mcp-client-cli.git
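If you prefer to keep the tool isolated from your global Python packages, a standard virtual environment works as well (plain Python tooling, nothing specific to this project):

python -m venv ~/.venvs/mcp-client-cli
~/.venvs/mcp-client-cli/bin/pip install git+https://github.com/adhikasp/mcp-client-cli.git

Activate the environment (or add its bin directory to your PATH) so the llm command is available in your shell.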
Basic Usage:
llm "Your prompt here"
echo "Your prompt" | llm
cat instructions.txt | llm
llm c "continue previous conversation"
Configuration:
Create ~/.llm/config.json:
{
  "systemPrompt": "You are an AI assistant helping a software engineer...",
  "llm": {
    "provider": "openai",
    "model": "gpt-4o-mini",
    "api_key": "your-openai-api-key",
    "temperature": 0.7
  },
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your-brave-api-key"
      }
    }
  }
}
The client supports various MCP servers through simple configuration, making it a versatile tool for different LLM interactions and tool integrations.
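Adding another tool is just another entry under mcpServers in ~/.llm/config.json. The sketch below only illustrates the shape of such an entry; the server name, package, and environment variable are placeholders, so substitute the actual MCP server package you want to run:

"my-tool": {
  "command": "npx",
  "args": ["-y", "<mcp-server-package>"],
  "env": {
    "MY_TOOL_API_KEY": "your-api-key-if-required"
  }
}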