A comprehensive LLM CLI plugin for Model Context Protocol (MCP) integration, enabling seamless interaction between the LLM command-line tool and MCP servers.
- Installation
- Quick Start
- Commands Reference
- Usage Examples
- Common Workflows
- Configuration
- Troubleshooting
## Installation

```bash
pip install llm-mcp-cli
```

Requirements:

- Python 3.11 or higher
- llm >= 0.27.0
- MCP-compatible servers (e.g., `@modelcontextprotocol/server-filesystem`)
## Quick Start

- Add an MCP server:

```bash
llm mcp add filesystem npx @modelcontextprotocol/server-filesystem /path/to/directory
```

- List available tools:

```bash
llm mcp tools --format list
```

- Use tools in LLM conversations:

```bash
llm -m gpt-4 "List all files in my directory" $(llm mcp tools --format commands)
```

If you only want the tools from a specific server, add `--server`:

```bash
llm -m gpt-4 "List all files in my directory" $(llm mcp tools --server fetch --format commands)
```

## Commands Reference

### llm mcp add

Register a new MCP server.
Syntax:

```bash
llm mcp add <name> <command> [args...] [options]
```

Parameters:

- `name` (required) - Unique identifier for the server
- `command` (required) - Command to execute the server (e.g., `npx`, `python`)
- `args` (optional) - Additional arguments for the server command

Options:

- `--env KEY=value` - Set environment variables (can be used multiple times)
- `--description` - Add a description for the server
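Each `--env` argument is a single `KEY=value` pair. Assuming the pair splits at the first `=` (an illustration of the usual convention, not verified against the plugin's parser), values are free to contain `=` themselves, as connection strings often do:

```shell
# Splitting a KEY=value pair at the first "=" (illustrative only)
pair="DATABASE_URL=postgres://user:pass@host/db?sslmode=require"
key="${pair%%=*}"      # everything before the first "="
value="${pair#*=}"     # everything after the first "="
echo "$key"
echo "$value"
```
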
Examples:

```bash
# Add filesystem server
llm mcp add filesystem npx @modelcontextprotocol/server-filesystem /Users/docs

# First store your GitHub token (one-time setup)
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN

# Add GitHub server (API key automatically resolved from LLM storage)
llm mcp add github npx @modelcontextprotocol/server-github

# Add server with description
llm mcp add myserver python /path/to/server.py \
  --description "Custom MCP server for data processing"

# Multiple environment variables
llm mcp add api-server ./server \
  --env API_KEY=secret \
  --env DEBUG=true \
  --env PORT=8080
```

### llm mcp remove

Remove a registered MCP server.
Syntax:

```bash
llm mcp remove <name>
```

Example:

```bash
llm mcp remove filesystem
```

### llm mcp list

List all registered MCP servers.
Syntax:

```bash
llm mcp list [options]
```

Options:

- `--enabled-only` - Show only enabled servers
- `--with-status` - Include connection status information
Output includes:

- Server name with enabled/disabled indicator (✓/✗)
- Command and arguments
- Description (if provided)
- Environment variable count
- Connection status (with `--with-status`)
- Available tools count (with `--with-status`)
Examples:

```bash
# List all servers
llm mcp list

# List only enabled servers with status
llm mcp list --enabled-only --with-status
```

### llm mcp enable

Enable a disabled MCP server.
Syntax:

```bash
llm mcp enable <name>
```

Example:

```bash
llm mcp enable filesystem
```

### llm mcp disable

Disable an MCP server without removing it.
Syntax:

```bash
llm mcp disable <name>
```

Example:

```bash
llm mcp disable filesystem
```

### llm mcp test

Test connectivity to an MCP server.
Syntax:

```bash
llm mcp test <name>
```

Output includes:

- Connection success/failure status
- Available tools count
- First 5 tool names (if available)
- Error messages (if connection fails)

Example:

```bash
llm mcp test filesystem
```

### llm mcp describe

Show detailed information about a specific MCP server.
Syntax:

```bash
llm mcp describe <name>
```

Output includes:

- Server configuration details
- Environment variables (keys only, values hidden)
- Connection status
- Complete list of available tools with descriptions

Example:

```bash
llm mcp describe filesystem
```

### llm mcp start

Manually start an MCP server connection.
Syntax:

```bash
llm mcp start <name>
```

Example:

```bash
llm mcp start filesystem
```

### llm mcp stop

Stop an active MCP server connection.
Syntax:

```bash
llm mcp stop <name>
```

Example:

```bash
llm mcp stop filesystem
```

### llm mcp tools

List all available MCP tools from enabled servers.
Syntax:

```bash
llm mcp tools [options]
```

Options:

- `--server <name>` - Filter tools by specific server
- `--format <type>` - Output format (default: `list`)
  - `list` - Detailed format with descriptions
  - `names` - Tool names only, one per line
  - `commands` - As `-T` flags ready for use with `llm`
- `--names-only` - (Deprecated) Equivalent to `--format names`
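The `commands` format is meant to be expanded unquoted inside an `llm` invocation, so the shell splits the output into separate `-T` flags. A minimal sketch of that word-splitting, with `mcp_tool_flags` as a stub standing in for the real `llm mcp tools --format commands` call:

```shell
# Stub standing in for `llm mcp tools --format commands` (illustrative only)
mcp_tool_flags() {
  echo "-T filesystem__read_file -T filesystem__write_file"
}

# Left unquoted, $(...) output is split on whitespace into separate arguments,
# which is what `llm -m gpt-4 $(llm mcp tools --format commands)` relies on
set -- $(mcp_tool_flags)
echo "argument count: $#"
echo "first pair: $1 $2"
```

Note that quoting the substitution (`"$(...)"`) would pass the whole string as a single argument and the flags would not be recognized.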
Examples:

```bash
# List all tools with descriptions
llm mcp tools

# Get tools from specific server
llm mcp tools --server filesystem

# Get tool names only
llm mcp tools --format names

# Get ready-to-use command flags
llm mcp tools --format commands
# Output: -T filesystem__read_file -T filesystem__write_file ...
```

### llm mcp call-tool

Call a specific MCP tool directly.
Syntax:

```bash
llm mcp call-tool <tool_name> [options]
```

Parameters:

- `tool_name` (required) - Tool name in the format `server__tool`
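Qualified tool names join the server name and the tool's own name with a double underscore, so splitting at the first `__` recovers both parts. A sketch in plain shell (illustrative, and assuming the server name itself contains no `__`):

```shell
# Splitting a qualified tool name at the first "__" (illustrative only)
tool="filesystem__read_file"
server="${tool%%__*}"   # everything before the first "__"
name="${tool#*__}"      # everything after the first "__"
echo "server=$server tool=$name"
```
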
Options:

- `--args <json>` - JSON object with tool arguments (default: `"{}"`)
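The `--args` value must arrive as one valid JSON argument; when parts of it come from shell variables, generating it with a real JSON serializer avoids quoting and escaping mistakes. A sketch using `python3` (illustrative; a tool like `jq` works equally well):

```shell
# Build the --args JSON from a shell variable without hand-escaping
path='/tmp/notes "draft".txt'
args=$(python3 -c 'import json, sys; print(json.dumps({"path": sys.argv[1]}))' "$path")
echo "$args"
# The result can then be passed as:
#   llm mcp call-tool filesystem__read_file --args "$args"
```
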
Examples:

```bash
# Read a file
llm mcp call-tool filesystem__read_file \
  --args '{"path": "/tmp/example.txt"}'

# List directory contents
llm mcp call-tool filesystem__list_directory \
  --args '{"path": "/Users/docs"}'

# Call with complex arguments
llm mcp call-tool github__search_repositories \
  --args '{"query": "language:python stars:>1000", "limit": 10}'
```

### llm mcp status

Show overall MCP plugin status and statistics.
Syntax:

```bash
llm mcp status
```

Output includes:

- Total registered servers count
- Enabled servers count
- Connected servers count
- Available tools count
- Configuration directory path
- Log directory path

Example:
```bash
llm mcp status
```

## Usage Examples

### Basic Setup

```bash
# 1. Add a filesystem server for your documents
llm mcp add docs npx @modelcontextprotocol/server-filesystem ~/Documents

# 2. Store API keys securely (one-time setup)
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN

# 3. Add a GitHub server (API key automatically resolved)
llm mcp add github npx @modelcontextprotocol/server-github

# 4. Verify servers are working
llm mcp test docs
llm mcp test github

# 5. List all available tools
llm mcp tools
```

### Using Tools in Conversations

```bash
# Method 1: Use the tools in a conversation
llm -m gpt-4 \
  $(llm mcp tools --server docs --format commands) \
  "What markdown files are in my Documents folder?"

# Method 2: Specify individual tools
llm -m claude-3-opus \
  -T docs__read_file \
  -T docs__write_file \
  "Update the README.md file to include installation instructions"

# Method 3: Use all available tools
llm -m gpt-4 $(llm mcp tools --format commands) \
  "Analyze the project structure and create a summary"
```

### Direct Tool Calls

```bash
# List files in a directory
llm mcp call-tool docs__list_directory \
  --args '{"path": "/Users/me/Documents"}'

# Read a specific file
llm mcp call-tool docs__read_file \
  --args '{"path": "/Users/me/Documents/notes.md"}'

# Search GitHub repositories
llm mcp call-tool github__search_repositories \
  --args '{"query": "mcp server", "limit": 5}'
```

## Common Workflows

### Document Management

```bash
# Setup filesystem server for documents
llm mcp add documents npx @modelcontextprotocol/server-filesystem \
  ~/Documents ~/Projects

# Use with LLM to organize files
llm -m gpt-4 $(llm mcp tools --server documents --format commands) \
  "Create a summary of all README files in my Projects folder"
```

### Code Analysis

```bash
# Add server for code directory
llm mcp add codebase npx @modelcontextprotocol/server-filesystem \
  /path/to/codebase

# Analyze code structure
llm -m claude-3-opus $(llm mcp tools --server codebase --format commands) \
  "Analyze the Python files and identify potential refactoring opportunities"
```

### Multi-Server Setup

```bash
# Store API keys once
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN

# Add multiple servers (API keys automatically resolved)
llm mcp add docs npx @modelcontextprotocol/server-filesystem ~/Documents
llm mcp add code npx @modelcontextprotocol/server-filesystem ~/Code
llm mcp add github npx @modelcontextprotocol/server-github

# Use all tools together
llm -m gpt-4 $(llm mcp tools --format commands) \
  "Compare my local documentation with similar projects on GitHub"
```

## Configuration

### API Key Management

The plugin automatically resolves common API keys from LLM's secure storage, eliminating the need for `--env` flags:
```bash
# 1. Store API keys securely using LLM's key storage (one-time setup)
llm keys set FIRECRAWL_API_KEY
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN
llm keys set OPENAI_API_KEY

# 2. Add servers without needing to specify --env flags
llm mcp add firecrawl npx -- -y firecrawl-mcp
llm mcp add github npx @modelcontextprotocol/server-github

# 3. Test servers - API keys are automatically resolved
llm mcp test firecrawl  # ✓ Uses FIRECRAWL_API_KEY from storage
llm mcp test github     # ✓ Uses GITHUB_PERSONAL_ACCESS_TOKEN from storage
```

Resolution priority:

1. Environment variable (if already set)
2. LLM key storage (`llm keys get KEY_NAME`)
3. The server reports an error if the key is not found
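The lookup order above can be sketched as a small shell function. This is an illustration of the documented behavior, not the plugin's actual code; `llm_keys_get` is a stub standing in for `llm keys get`:

```shell
# Illustrative sketch of the documented resolution order
llm_keys_get() {            # stub standing in for `llm keys get "$1"`
  echo "stored-token"
}

resolve_key() {
  name="$1"
  # 1. Prefer an environment variable that is already set
  eval "val=\${$name:-}"
  [ -n "$val" ] && { echo "$val"; return 0; }
  # 2. Fall back to LLM key storage
  val=$(llm_keys_get "$name")
  [ -n "$val" ] && { echo "$val"; return 0; }
  # 3. Otherwise report an error, as the server would
  echo "error: $name not found" >&2
  return 1
}

resolve_key GITHUB_PERSONAL_ACCESS_TOKEN
# → prints the stub's stored value when the env var is unset
```
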
### Server Management

```bash
# Check overall status
llm mcp status

# List all servers with their status
llm mcp list --with-status

# Disable unused servers
llm mcp disable old-server

# Test specific server
llm mcp test docs

# Get detailed information
llm mcp describe docs
```

## Troubleshooting

### Server Connection Problems

```bash
# Test the connection
llm mcp test servername

# Check server status
llm mcp describe servername

# Try restarting the server
llm mcp stop servername
llm mcp start servername
```

### Tools Not Showing Up

```bash
# Ensure server is enabled
llm mcp enable servername

# List tools for specific server
llm mcp tools --server servername

# Check server has tools available
llm mcp test servername
```

### Gathering Diagnostic Information

```bash
# Get detailed server information
llm mcp describe servername

# Check overall plugin status
llm mcp status

# View server list with connection status
llm mcp list --with-status

# Test individual server connectivity
llm mcp test servername
```

## Contributing

This is an open-source project. Contributions are welcome!
```bash
# Clone the repository
git clone https://github.com/eugenepyvovarov/llm-mcp.git
cd llm-mcp
```

## License

Apache 2.0 License - see the LICENSE file for details.
## Support

For issues, questions, or suggestions:
- GitHub Issues: github.com/eugenepyvovarov/llm-mcp/issues
- Documentation: This README
- MCP Specification: modelcontextprotocol.io