Deploy an elizaOS agent as a Convex backend. Messages are processed through `runtime.messageService.handleMessage` and persisted in Convex's real-time database, giving you reactive queries, automatic caching, and zero-config hosting.
```
┌──────────────┐     ┌──────────────────┐     ┌────────────────────────┐
│    Client    │────▶│   Convex HTTP    │────▶│     Node.js Action     │
│  (curl/app)  │◀────│ Router (http.ts) │◀────│       (agent.ts)       │
└──────────────┘     └──────────────────┘     │                        │
                                              │  AgentRuntime          │
                                              │  messageService        │
                                              │  .handleMessage()      │
                                              └───────────┬────────────┘
                                                          │
                                           ┌──────────────┼──────────────┐
                                           ▼              ▼              ▼
                                     ┌──────────┐  ┌──────────┐  ┌────────────┐
                                     │  Convex  │  │ LLM API  │  │  elizaOS   │
                                     │ Database │  │ (OpenAI  │  │  Plugins   │
                                     │          │  │  etc.)   │  │            │
                                     └──────────┘  └──────────┘  └────────────┘
```
- **HTTP Router** (`convex/http.ts`) — exposes `POST /chat`, `GET /health`, and `GET /messages` endpoints
- **Agent Action** (`convex/agent.ts`) — a `"use node"` action that boots the full elizaOS `AgentRuntime`, calls `messageService.handleMessage`, and returns the response
- **Message Persistence** (`convex/messages.ts`) — internal mutations store every user message and agent response in Convex; public queries let clients subscribe to real-time updates
- **Schema** (`convex/schema.ts`) — typed tables for `messages` and `conversations`
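A minimal sketch of what the schema tables might look like. This is illustrative, not the project's actual `convex/schema.ts` — the field names (`conversationId`, `role`, `text`, `entityId`) are assumptions based on the message shapes shown later in this README:

```ts
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  // One row per conversation; fields are assumed for illustration.
  conversations: defineTable({
    conversationId: v.string(),
    createdAt: v.number(),
  }).index("by_conversationId", ["conversationId"]),

  // One row per user or agent message within a conversation.
  messages: defineTable({
    conversationId: v.string(),
    role: v.string(), // "user" | "assistant"
    text: v.string(),
    entityId: v.string(),
  }).index("by_conversation", ["conversationId"]),
});
```

The index on `conversationId` is what lets the `GET /messages` endpoint and real-time subscriptions fetch a single conversation efficiently.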
- Node.js 18+
- A Convex account (convex.dev)
- At least one LLM API key (OpenAI, Anthropic, xAI, Google GenAI, or Groq)
```sh
cd examples/convex
bun install   # or npm install
npx convex dev
```

On first run this will:
- Prompt you to log in via GitHub
- Create a new Convex project
- Generate the `convex/_generated/` directory
- Print your HTTP Actions URL (e.g. `https://your-deployment.convex.cloud`)
Set your LLM API key as a Convex environment variable:
```sh
npx convex env set OPENAI_API_KEY sk-your-key-here
```

Or for other providers:

```sh
npx convex env set ANTHROPIC_API_KEY your-key
npx convex env set XAI_API_KEY your-key
npx convex env set GOOGLE_GENERATIVE_AI_API_KEY your-key
npx convex env set GROQ_API_KEY your-key
```
```sh
# Health check
curl https://your-deployment.convex.cloud/health

# Send a message
curl -X POST https://your-deployment.convex.cloud/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello Eliza!", "conversationId": "test-1"}'

# Retrieve messages
curl "https://your-deployment.convex.cloud/messages?conversationId=test-1"
```

Or run the interactive test client:

```sh
CONVEX_URL=https://your-deployment.convex.cloud bun run test-client.ts
```

Project layout:

```
examples/convex/
├── README.md          # This file
├── package.json       # Dependencies (convex, @elizaos/core, LLM plugins)
├── tsconfig.json      # TypeScript configuration
├── .env.example       # Environment variable template
├── convex/
│   ├── schema.ts      # Database schema (messages, conversations)
│   ├── messages.ts    # Queries & mutations for message persistence
│   ├── agent.ts       # "use node" action — elizaOS runtime + handleMessage
│   ├── http.ts        # HTTP router (POST /chat, GET /health, GET /messages)
│   └── _generated/    # Auto-generated by Convex (do not edit)
├── test-client.ts     # Interactive CLI test client
└── scripts/
    └── test-curl.sh   # Quick curl smoke tests
```
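Beyond curl and the bundled test client, any HTTP client can call the endpoints documented below. Here is a minimal fetch-based sketch — the `buildChatRequest` and `sendChat` helpers are hypothetical, not part of this example, and assume the request/response shapes from the API section:

```typescript
// Hypothetical helper: builds the request options for POST /chat.
// The body shape matches the /chat request documented in this README.
function buildChatRequest(
  message: string,
  conversationId?: string,
): { method: string; headers: Record<string, string>; body: string } {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, conversationId }),
  };
}

// Hypothetical helper: sends a chat message and returns the parsed reply.
// Requires Node.js 18+ for the global fetch.
async function sendChat(
  convexUrl: string,
  message: string,
  conversationId?: string,
): Promise<{ response: string; conversationId: string }> {
  const res = await fetch(`${convexUrl}/chat`, buildChatRequest(message, conversationId));
  if (!res.ok) throw new Error(`chat request failed: ${res.status}`);
  return res.json();
}

// usage: await sendChat("https://your-deployment.convex.cloud", "Hello Eliza!", "test-1");
```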
`POST /chat` — Send a message to the elizaOS agent.
Request:

```json
{
  "message": "Hello, how are you?",
  "conversationId": "optional-uuid",
  "userId": "optional-user-id"
}
```

Response:
```json
{
  "response": "I'm doing well, thank you!",
  "conversationId": "uuid",
  "agentName": "Eliza",
  "provider": "OpenAI",
  "timestamp": "2025-01-10T12:00:00.000Z"
}
```

`GET /health` — Returns service health status.
```json
{
  "status": "healthy",
  "runtime": "elizaos-convex",
  "version": "2.0.0-alpha"
}
```

`GET /messages` — Retrieve all messages in a conversation (ordered by creation time).
```json
{
  "messages": [
    { "role": "user", "text": "Hello!", "entityId": "...", ... },
    { "role": "assistant", "text": "Hi there!", "entityId": "...", ... }
  ],
  "conversationId": "..."
}
```

| Variable | Required | Description |
|---|---|---|
| `OPENAI_API_KEY` | One of | OpenAI API key |
| `ANTHROPIC_API_KEY` | One of | Anthropic API key |
| `XAI_API_KEY` | One of | xAI (Grok) API key |
| `GOOGLE_GENERATIVE_AI_API_KEY` | One of | Google GenAI (Gemini) API key |
| `GROQ_API_KEY` | One of | Groq API key |

At least one key is required. Set via `npx convex env set KEY value`.
```sh
npx convex deploy
```

This pushes your functions to Convex's production environment. Set environment variables on the production deployment:

```sh
npx convex env set OPENAI_API_KEY sk-your-prod-key --prod
```

The `"use node"` directive tells Convex to run the agent action in a full Node.js environment instead of the default Convex runtime. This is required because `@elizaos/core` and the LLM plugins use Node.js APIs and npm packages.
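A sketch of how such an action might be shaped — the `chat` export name, its argument list, and the `getRuntime` helper are assumptions for illustration; see the actual `convex/agent.ts` for the real implementation:

```ts
"use node"; // must be the first line: opts this file into the Node.js runtime

import { action } from "./_generated/server";
import { v } from "convex/values";

// Hypothetical shape of the chat action. The real file boots the elizaOS
// AgentRuntime and delegates to runtime.messageService.handleMessage.
export const chat = action({
  args: {
    message: v.string(),
    conversationId: v.optional(v.string()),
  },
  handler: async (_ctx, args) => {
    const runtime = await getRuntime(); // hypothetical module-scope cache
    // Build an elizaOS message from args and return the agent's response.
    return runtime.messageService.handleMessage(/* message built from args */);
  },
});
```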
The `AgentRuntime` is cached at module scope so that subsequent invocations on the same Convex isolate reuse the already-initialized runtime, avoiding cold-start overhead.
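The caching pattern itself can be sketched independently of elizaOS. Here `Runtime` and `initRuntime` are stand-ins for the real `AgentRuntime` and its initialization; the key point is caching the *promise*, so concurrent cold-start callers all await the same initialization:

```typescript
// Stand-in for the elizaOS AgentRuntime (assumption for illustration).
type Runtime = { id: number };

let initCount = 0;
let runtimePromise: Promise<Runtime> | null = null;

// Stand-in for the expensive boot: plugin loading, adapters, etc.
async function initRuntime(): Promise<Runtime> {
  initCount += 1;
  return { id: initCount };
}

// Module-scope cache: the first call kicks off initialization; every later
// call on the same isolate reuses the same promise (and resolved runtime).
function getRuntime(): Promise<Runtime> {
  if (!runtimePromise) runtimePromise = initRuntime();
  return runtimePromise;
}
```

Caching the promise rather than the resolved value matters: if two requests arrive during a cold start, both await the same in-flight initialization instead of booting two runtimes.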
Because messages are stored in Convex tables, any client using the Convex React hooks (or the Convex client SDK) can subscribe to `api.messages.list` and get real-time updates as the agent responds.
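For example, with the Convex browser client — a sketch, assuming the `conversationId` argument name and that `api.messages.list` returns an array:

```ts
import { ConvexClient } from "convex/browser";
import { api } from "../convex/_generated/api";

const client = new ConvexClient(process.env.CONVEX_URL!);

// onUpdate re-runs the callback whenever the query result changes, so new
// agent responses arrive without polling.
client.onUpdate(
  api.messages.list,
  { conversationId: "test-1" }, // argument name is an assumption
  (messages) => {
    console.log(`conversation now has ${messages.length} messages`);
  },
);
```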
- elizaOS Documentation
- Convex Documentation
- Convex HTTP Actions
- Convex Node.js Actions
- Chat Example — CLI chat using the same `messageService.handleMessage` pattern