A solution for real-time streaming output from Temporal workflows using Redis PubSub. This addresses the limitation that Temporal doesn't natively support streaming responses from an LLM.
This project demonstrates how to implement real-time streaming from Temporal workflows by using Redis PubSub as a communication channel. The demonstration focuses on streaming LLM output back to the user in the CLI while the complete result is still returned from the Temporal activity, so the workflow retains the final output.
- A Temporal workflow is defined that executes an activity
- The activity streams its output to a Redis PubSub channel character by character
- A client subscribes to the channel and displays the stream in real-time in the terminal
- The workflow completes once all streaming is finished
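The steps above can be sketched as a minimal, server-free illustration of the activity-side pattern. The channel abstraction, the `stream_response` helper, and the `__END__` sentinel are assumptions for demonstration; in the real activity, `publish` would be `redis_client.publish(channel, message)` inside the LLM token loop:

```python
# Illustrative sketch only: "__END__" and stream_response are hypothetical names,
# not part of this repo. A real activity would call redis.publish() per character.

END_SENTINEL = "__END__"  # assumed marker telling subscribers the stream is done

def stream_response(text: str, publish) -> str:
    """Publish `text` character by character via `publish`, then signal completion.

    `publish` stands in for redis_client.publish(channel, message); injecting it
    keeps the sketch runnable without a Redis server.
    """
    for ch in text:
        publish(ch)           # one PubSub message per character
    publish(END_SENTINEL)     # let the subscriber know streaming is finished
    return text               # full text is also returned so Temporal can persist it

# In-memory stand-in for a Redis channel:
channel: list[str] = []
result = stream_response("hi", channel.append)
# channel now holds ["h", "i", "__END__"] and result == "hi"
```

Returning the full text from the activity is what lets the workflow complete with the same content the client saw streamed.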
- anthropic
- python-dotenv
- temporalio (Temporal Python SDK)
- redis (Redis Python client)
- UV package manager (for dependency management)
- Python 3.10+
- Redis server
- Temporal server
# Create a virtual environment using UV (recommended)
uv venv
# Activate the virtual environment
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies with UV
uv sync
- Start a Redis server in a separate terminal:
redis-server # Validate it's running in another terminal with 'redis-cli ping' --> response should be 'PONG'
- Start a Temporal server in a separate terminal:
temporal server start-dev # Open the Temporal UI on localhost:8233
- Start the Worker from the original virtual environment:
python worker.py
- Start the Workflow from a new terminal with the virtual environment activated:
python starter.py
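The client side of the same protocol can be sketched as a consumer loop that prints characters as they arrive and stops at the end-of-stream marker. The `consume_stream` helper and `"__END__"` sentinel are hypothetical; in `starter.py` the `messages` iterable would come from `redis_client.pubsub().listen()`:

```python
# Illustrative sketch only: consume_stream and "__END__" are assumed names.
import sys

END_SENTINEL = "__END__"  # assumed end-of-stream marker

def consume_stream(messages) -> str:
    """Display each character in real time and return the assembled text.

    `messages` stands in for the payloads received from a Redis PubSub
    subscription, so the sketch runs without a server.
    """
    buffer = []
    for msg in messages:
        if msg == END_SENTINEL:
            break                # streaming finished
        sys.stdout.write(msg)    # show the character immediately
        sys.stdout.flush()
        buffer.append(msg)
    return "".join(buffer)

text = consume_stream(iter(["h", "i", "__END__"]))
# text == "hi"
```

Flushing stdout after each character is what makes the terminal output appear as a live stream rather than a buffered burst.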
Contributions are welcome! Please feel free to submit a Pull Request.