An intimate REPL environment for testing Hanzo's Model Context Protocol (MCP) tools and AI integration. Think of it as rlwrap
meets Claude Code - direct access to MCP tools with IPython magic.
- Direct MCP Access: All MCP tools available as Python functions
- Integrated Chat: Chat with an AI that can use MCP tools
- IPython Magic: Use `?` and `!` helpers, tab completion
- Live Editing: Edit the REPL source code on the fly
- Built-in Tests: Comprehensive test suite for MCP tools
- Rich Output: Beautiful formatting with the Rich library
```bash
# Setup (one time)
make setup

# Start the REPL
make dev
```
```python
# All MCP tools are available as functions
>>> read_file(file_path="/etc/hosts")
>>> write_file(file_path="test.txt", content="Hello, World!")
>>> search(query="def main", path=".")
>>> run_command(command="ls -la")
```
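Because the tools are bound as ordinary Python functions in the session, standard IPython introspection works on them and their return values can be captured like any other call. A minimal sketch, assuming the tools above return their results as plain Python values (an assumption, not a documented contract):

```python
# Sketch only: assumes read_file() returns the file contents as a string
# and write_file() accepts that string back via its content parameter.
hosts = read_file(file_path="/etc/hosts")
write_file(file_path="hosts_copy.txt", content=hosts)

# IPython helpers work on the tools like any other function:
#   read_file?    -> signature and docstring
#   read_file??   -> source, when available
```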
```python
# Simple chat
>>> chat("What files are in the current directory?")

# AI will use MCP tools to answer
>>> chat("Create a Python script that prints the current time")

# Complex workflows
>>> chat("Find all Python files with 'test' in the name and show their sizes")
```
```python
# Single-line chat
%chat What is the weather today?

# Multi-line chat
%%ai
Can you help me create a web scraper?
I need it to extract titles from a list of URLs.
```
```python
# List available tools
%tools

# Execute specific tool
%tool read_file {"file_path": "README.md"}

# Edit REPL source
%edit_self ipython_repl.py

# Change model
%model claude-3-opus-20240229
```
```python
# Access the MCP server directly
>>> mcp.tools.keys()

# Access the LLM client
>>> llm.get_available_providers()
>>> llm.set_model("gpt-4")

# Access the tool executor
>>> executor.get_context()
```
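The same objects are usable from scripted Python, not just interactively. A short sketch built only from the attributes shown above (`mcp.tools`, `llm.get_available_providers`, `llm.set_model`, `executor.get_context`); the shape of their return values is an assumption:

```python
# Sketch only: assumes mcp.tools is a dict-like mapping of tool names and
# that the llm/executor calls return printable values.
print(f"{len(mcp.tools)} MCP tools registered:")
for name in sorted(mcp.tools.keys()):
    print(f"  - {name}")

print("Providers:", llm.get_available_providers())
llm.set_model("gpt-4")                 # switch models programmatically
print("Context:", executor.get_context())
```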
- `make dev` - Start IPython REPL with MCP tools
- `make test` - Run test suite
- `make demo-file` - Demo file operations
- `make demo-chat` - Demo chat functionality
- `make demo-search` - Demo search operations
- `make clean` - Clean generated files
- `chat(message)` - Chat with AI using MCP tools
- `tools.<tab>` - Tab completion for all tools
- `%tools` - List all available MCP tools
- `%model` - Show/set current LLM model
- `%edit_self` - Edit REPL source code
Set at least one LLM provider API key:
- `OPENAI_API_KEY` - OpenAI API key
- `ANTHROPIC_API_KEY` - Anthropic/Claude API key
- `GROQ_API_KEY` - Groq API key
- Other providers supported by litellm
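A quick way to confirm a key is visible to the REPL process before launching it; the variable names match the list above, and the check itself is only a sketch:

```python
import os

# Warn if none of the provider keys listed above are set.
PROVIDER_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GROQ_API_KEY"]
configured = [key for key in PROVIDER_KEYS if os.environ.get(key)]

if configured:
    print("Configured providers:", ", ".join(configured))
else:
    print("No LLM provider key found - set at least one before `make dev`.")
```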
The REPL can be used as an MCP server for Hanzo Chat:
- Start the REPL: `make dev`
- Configure Hanzo Chat to use the MCP endpoint
- All tools become available in the chat interface
```bash
# Run all tests
make test

# Run integration tests (requires API keys)
make test-integration

# Interactive test in REPL
make test-repl

# Watch for changes and auto-restart
make watch

# Format code
make format

# Lint code
make lint

# Type check
make type-check
```
```
hanzo_repl/
├── ipython_repl.py    # IPython-based REPL with magic commands
├── llm_client.py      # Multi-provider LLM client
├── tool_executor.py   # Execute tools based on LLM responses
├── tests.py           # Comprehensive test suite
└── cli.py             # Command-line interface

mcp_repl.py            # Minimal direct-access REPL
```
This REPL provides a more intimate way to interact with Hanzo's MCP tools:
- No barriers between you and the tools
- Direct Python access to all functionality
- Chat with AI that can modify its own code
- Perfect for testing and development
- Great for learning the MCP API
Think of it as your personal MCP playground where you can experiment, test, and build with immediate feedback.