This project demonstrates a Model Context Protocol (MCP) implementation using Go for the host application and both Go and Python for the MCP servers. The example showcases how different MCP servers can provide specialized tools that are orchestrated by an LLM-powered host application.
The project consists of three main components:
- MCP Host (Go) - An intelligent host application that integrates with an LLM (DeepSeek API) to analyze user queries and call appropriate tools from the MCP servers
- MCP Server Go - Provides user management tools
- MCP Server Python - Provides mathematical and text processing tools
```mermaid
sequenceDiagram
    participant User
    participant Host as MCP Host<br/>(Go)
    participant LLM as DeepSeek API<br/>(LLM)
    participant GoClient as Go MCP Client
    participant PythonClient as Python MCP Client
    participant GoServer as Go MCP Server<br/>(User Management)
    participant PythonServer as Python MCP Server<br/>(Math/Text Tools)

    User->>Host: Natural language query<br/>("Add 5 and 10")
    Host->>LLM: Analyze query and determine<br/>tool + parameters
    LLM-->>Host: Tool plan<br/>{"tool": "add_numbers", "inputs": {"a": 5, "b": 10}}
    Note over Host: Route to appropriate server
    alt Tool found in Go Server
        Host->>GoClient: Call tool request
        GoClient->>GoServer: JSON-RPC call<br/>(tools/call)
        GoServer-->>GoClient: Tool response
        GoClient-->>Host: Result
    else Tool found in Python Server
        Host->>PythonClient: Call tool request
        PythonClient->>PythonServer: JSON-RPC call<br/>(tools/call)
        PythonServer-->>PythonClient: Tool response
        PythonClient-->>Host: Result
    end
    Host-->>User: Final response<br/>("The sum of 5 and 10 is 15")
```
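The routing step in the diagram can be sketched as follows. This is an illustrative sketch, not the host's actual Go code: the tool sets and the `route_tool_plan` helper are assumptions based on the tools the two servers advertise.

```python
# Illustrative sketch of the host's routing step: given the LLM's tool plan,
# pick the server whose tool list contains the requested tool.

GO_SERVER_TOOLS = {"get_user", "create_user"}
PYTHON_SERVER_TOOLS = {"add_numbers", "multiply_numbers", "process_text"}

def route_tool_plan(plan: dict) -> str:
    """Return which MCP server should handle the plan's tool."""
    tool = plan["tool"]
    if tool in GO_SERVER_TOOLS:
        return "go-server"
    if tool in PYTHON_SERVER_TOOLS:
        return "python-server"
    raise ValueError(f"unknown tool: {tool}")

plan = {"tool": "add_numbers", "inputs": {"a": 5, "b": 10}}
print(route_tool_plan(plan))  # prints: python-server
```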
This project uses DeepSeek API as the Large Language Model (LLM) to provide intelligent tool orchestration capabilities. DeepSeek serves as the "brain" of the system, analyzing natural language queries and determining the appropriate tools to call.
- Natural Language Understanding: DeepSeek analyzes user queries written in plain English to understand the intent and extract relevant parameters
- Tool Selection: Based on the available tool schemas from both MCP servers, DeepSeek determines which specific tool should be called to fulfill the user's request
- Parameter Extraction: The LLM extracts and formats the necessary parameters from the user query into the correct data types and structure required by the target tool
- Response Planning: DeepSeek generates a structured JSON response containing:
  - The tool name to be called
  - The input parameters with proper types and values
  - Any validation or preprocessing needed
For a user query like "Add 5 and 10", DeepSeek:

- Understands that this is a mathematical operation
- Identifies that the `add_numbers` tool from the Python server should be used
- Extracts the numbers `5` and `10` as the `a` and `b` parameters
- Returns a structured plan:

```json
{"tool": "add_numbers", "inputs": {"a": 5, "b": 10}}
```
The DeepSeek integration requires an API key that must be set as an environment variable:

```shell
export DEEPSEEK_API_KEY="your-deepseek-api-key"
```

Security Note:
- Keep your API key secure and do not commit it to version control systems (e.g., Git).
- Use environment variable managers or secret management tools to store API keys securely.
- Restrict the permissions of your API key to the minimum necessary for your application.
The host application communicates with DeepSeek via HTTPS API calls to analyze queries and receive tool execution plans, making the system capable of handling complex, multi-step reasoning tasks through natural language interfaces.
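A minimal sketch of such an HTTPS call, assuming DeepSeek's OpenAI-compatible chat completions endpoint; the prompt wording and the helper names (`build_plan_request`, `ask_deepseek`) are illustrative, not the project's actual `llm-client.go` code.

```python
import json
import os
import urllib.request

DEEPSEEK_URL = "https://api.deepseek.com/chat/completions"

def build_plan_request(query: str, tool_schemas: list) -> dict:
    """Build a chat payload asking the LLM to emit a JSON tool plan."""
    system = (
        "You are a tool router. Given the user query and these tool schemas, "
        'reply with JSON only: {"tool": ..., "inputs": {...}}.\n'
        + json.dumps(tool_schemas)
    )
    return {
        "model": "deepseek-chat",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": query},
        ],
    }

def ask_deepseek(query: str, tool_schemas: list) -> dict:
    """POST the payload and parse the returned tool plan (requires network + key)."""
    req = urllib.request.Request(
        DEEPSEEK_URL,
        data=json.dumps(build_plan_request(query, tool_schemas)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['DEEPSEEK_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return json.loads(body["choices"][0]["message"]["content"])
```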
The Go server provides user management functionality with the following tools:

- `get_user` - Retrieve user information by user ID
  - Input: `user_id` (string, required)
  - Returns: User details including name, email, and age
- `create_user` - Create a new user in the system
  - Input: `user_id` (string, required), `name` (string, required), `email` (string, required), `age` (number, optional, minimum: 0)
  - Returns: Confirmation of user creation

The server comes pre-populated with sample users (Alice, Bob, Charlie) and uses in-memory storage.
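A call to one of these tools travels as a JSON-RPC 2.0 `tools/call` request. A sketch of the envelope (the `make_tools_call` helper is illustrative):

```python
# Build the JSON-RPC 2.0 envelope that MCP clients send for a tool call.
def make_tools_call(request_id: int, name: str, arguments: dict) -> dict:
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# e.g. fetching the sample user with ID "2" from the Go server
req = make_tools_call(1, "get_user", {"user_id": "2"})
```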
The Python server provides mathematical and text processing tools:

- `add_numbers` - Add two numbers together
  - Input: `a` (float), `b` (float)
  - Returns: Sum of the two numbers
- `multiply_numbers` - Multiply two numbers together
  - Input: `a` (float), `b` (float)
  - Returns: Product of the two numbers
- `process_text` - Process text with various operations
  - Input: `text` (string), `operation` (string) - Available: "upper", "lower", "reverse"
  - Returns: Processed text based on the operation
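The behavior of the three tools can be mirrored in plain Python. These are illustrative sketches of what the server does, not its actual FastMCP-registered functions:

```python
def add_numbers(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

def multiply_numbers(a: float, b: float) -> float:
    """Multiply two numbers together."""
    return a * b

def process_text(text: str, operation: str) -> str:
    """Apply one of the supported operations: upper, lower, reverse."""
    ops = {
        "upper": str.upper,
        "lower": str.lower,
        "reverse": lambda s: s[::-1],
    }
    if operation not in ops:
        raise ValueError(f"unsupported operation: {operation}")
    return ops[operation](text)
```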
The demo application showcases an intelligent tool orchestration system where:
- User Query Processing: The host receives natural language queries from users
- LLM Analysis: The DeepSeek API analyzes the query to determine which tool to call and with what parameters
- Tool Selection: The host automatically routes the request to the appropriate MCP server (Go or Python)
- Tool Execution: The selected tool is executed with the LLM-determined parameters
- Response: Results are returned to the user
The demo runs three example queries:
- "Read user with Id 2" → Calls `get_user` tool on the Go server
- "Create a new user with Id [timestamp], name 'John Doe', email 'jhondoe at example.com', age 30" → Calls `create_user` tool on the Go server
- "Add two numbers 5 and 10" → Calls `add_numbers` tool on the Python server
The MCP host application features a modern terminal user interface built with the Bubble Tea library, providing an interactive way to communicate with MCP servers.
- 💬 Interactive Query Input: Type natural language queries and get real-time responses
- 📋 OpenAPI Specification Viewer: Browse MCP server documentation directly in the terminal
- ⚡ Real-time Processing: See spinner animations while queries are being processed
- 🎨 Syntax Highlighting: Color-coded responses and error messages
- ⌨️ Keyboard Shortcuts: Efficient navigation with hotkeys
- Enter Queries: Type your natural language request in the input field:

  ```
  Enter your query: > Add 15 and 25
  ```

- Submit: Press `Enter` to send your query to the MCP servers
- View Results: The LLM will analyze your request, select the appropriate tool, and display the response
| Key Combination | Action |
|---|---|
| `Enter` | Submit query to MCP servers |
| `Ctrl+O` | Toggle Go MCP Server OpenAPI specification |
| `Ctrl+Y` | Toggle Python MCP Server OpenAPI specification |
| `Ctrl+R` | Reset/clear current query and response |
| `Q` or `Ctrl+C` | Quit application |
The UI includes built-in viewers for both MCP servers' OpenAPI specifications:

- Go Server Spec (`Ctrl+O`): View user management tools documentation
- Python Server Spec (`Ctrl+Y`): View math and text processing tools documentation

Navigate through the specifications using:

- `↑`/`↓` or `j`/`k`: Scroll up/down
- `Page Up`/`Page Down`: Fast scroll
- `Home`/`End`: Jump to beginning/end
```
# In the UI:
> Create a user with name Alice and email [email protected]
✓ Response: User created successfully with ID 1234567890

> Add 42 and 58
✓ Response: The sum of 42 and 58 is 100

> Convert "hello world" to uppercase
✓ Response: Uppercase: HELLO WORLD

# Press Ctrl+O to view Go server documentation
# Press Ctrl+Y to view Python server documentation
# Press Q to quit
```

The application automatically detects the environment:
- TTY Available: Runs the interactive Bubble Tea UI
- No TTY (debugging/CI): Falls back to command-line examples mode
This ensures the application works in both development and production environments.
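The detection itself boils down to a terminal check. Sketched in Python (the actual host does the equivalent check in Go; the `ui_mode` helper name is illustrative):

```python
import sys

# Run the interactive Bubble Tea-style UI only when both stdin and stdout
# are real terminals; otherwise (pipes, CI) fall back to CLI examples mode.
def ui_mode() -> str:
    return "tui" if sys.stdin.isatty() and sys.stdout.isatty() else "cli"
```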
- Docker and Docker Compose
- DeepSeek API key
- Set your DeepSeek API key:

  ```shell
  export DEEPSEEK_API_KEY="your-api-key-here"
  ```

- Run the demo:

  ```shell
  make run-demo
  ```

  Or manually with Docker Compose:

  ```shell
  DEEPSEEK_API_KEY="your-api-key" make run-demo
  ```
You can also test the MCP servers directly using curl.

Start the Go MCP server if it's not running yet:

```shell
cd mcp-server-go && make run
```

Ask for the available tools:

```shell
curl --request POST \
  --url http://localhost:8090/mcp \
  --header 'Accept: application/json, text/event-stream' \
  --header 'Content-Type: application/json' \
  --data '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/list",
    "params": {}
  }'
```

Create a new user:
```shell
curl --request POST \
  --url http://localhost:8090/mcp \
  --header 'Accept: application/json, text/event-stream' \
  --header 'Content-Type: application/json' \
  --data '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "create_user",
      "arguments": {
        "user_id": "test123",
        "name": "Test User",
        "email": "[email protected]",
        "age": 25
      }
    }
  }'
```

```
mcp-example/
├── docs/                        # Documentation and assets
│   └── example.gif              # Demo video showing UI interaction
├── mcp-host/                    # Go-based MCP host application
│   ├── Dockerfile               # Container build instructions
│   ├── Makefile                 # Build and run tasks
│   ├── go.mod                   # Go module dependencies
│   ├── cmd/main.go              # Main application entry point (with TTY detection)
│   ├── pkg/mcphost/             # Host implementation and LLM integration
│   │   ├── host.go              # Core MCP host logic
│   │   ├── llm-client.go        # DeepSeek API integration
│   │   ├── mcp-client-go.go     # Go MCP server client
│   │   └── mcp-client-python.go # Python MCP server client
│   └── script/run.sh            # Interactive startup script
├── mcp-server-go/               # Go MCP server for user management
│   ├── Dockerfile               # Container build instructions
│   ├── Makefile                 # Build and run tasks
│   ├── go.mod                   # Go module dependencies
│   ├── cmd/main.go              # Server entry point
│   ├── internal/mcpserver/      # User management tools implementation
│   │   ├── http.go              # HTTP server and health endpoints
│   │   ├── mcp-server.go        # MCP protocol implementation
│   │   └── openapi.json         # OpenAPI 3.0 specification
│   └── script/run.sh            # Interactive startup script
├── mcp-server-python/           # Python MCP server for math/text tools
│   ├── Dockerfile               # Container build instructions
│   ├── Makefile                 # Build and run tasks
│   ├── mcp-server.py            # FastMCP-based server (with OpenAPI endpoint)
│   ├── requirements.txt         # Python dependencies
│   └── scripts/                 # Python setup and run scripts
│       ├── run.sh               # Interactive startup script
│       └── setup.sh             # Environment setup
├── scripts/                     # Utility scripts
│   └── run-demo.sh              # Main demo execution script
├── docker-compose.yml           # Multi-container orchestration (with TTY support)
├── Makefile                     # Root-level build tasks
├── README.md                    # This comprehensive documentation
├── LICENSE                      # Project license
└── .gitignore                   # Git ignore patterns
```
- Interactive Terminal UI: Modern Bubble Tea-based interface with real-time query processing
- Multi-language MCP ecosystem: Demonstrates interoperability between Go and Python MCP servers
- LLM-powered tool orchestration: Uses AI to intelligently select and call appropriate tools
- Built-in Documentation Viewer: Browse OpenAPI specifications directly in the terminal
- Automatic server routing: Host application automatically tries both servers to find the right tool
- Environment Detection: Automatically switches between UI and CLI mode based on TTY availability
- Containerized deployment: Full Docker Compose setup for easy testing and deployment
- Health checks: Robust container health monitoring and startup coordination
- Adding multi-step support. For example: "Add 1 to 4 and multiply the result by 2"
- Adding an Authn/Authz layer between the host and the MCP servers
- Improving the UI
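The multi-step idea could look like this: the LLM would emit an ordered list of plan steps, with a placeholder feeding one step's result into the next. This is a hypothetical sketch; the `run_steps` helper and the `$prev` convention are assumptions, not planned APIs.

```python
# Execute plan steps in order, substituting "$prev" with the previous result.
def run_steps(steps, call_tool):
    result = None
    for step in steps:
        inputs = {k: (result if v == "$prev" else v)
                  for k, v in step["inputs"].items()}
        result = call_tool(step["tool"], inputs)
    return result

# "Add 1 to 4 and multiply the result by 2" could become:
steps = [
    {"tool": "add_numbers", "inputs": {"a": 1, "b": 4}},
    {"tool": "multiply_numbers", "inputs": {"a": "$prev", "b": 2}},
]
```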