🚀 Deploy AI agents with multiple LLM providers and communication channels in minutes
A Docker-based deployment template for NullClaw - the AI agent framework that works with Claude, GPT, Llama, and more, accessible via Telegram, Discord, Slack, CLI, and other channels.
- Multi-Provider Support: OpenRouter, Anthropic, OpenAI, Groq, xAI, DeepSeek, Mistral, Ollama
- Multiple Channels: Telegram, Discord, Slack, IRC, CLI, and more
- One-Command Deploy: Deploy to Railway.app or run locally with Docker
- Environment-Driven: Configure everything via environment variables
- Production-Ready: Health checks, auto-restart, and audit logging
- Secure: Built-in sandboxing and workspace isolation
- Docker and Docker Compose
- At least one LLM provider API key (OpenRouter recommended)
```bash
# Clone the repository
git clone https://github.com/your-username/nullclaw-railway-template.git
cd nullclaw-railway-template

# Copy environment template
cp .env.example .env

# Edit with your API keys
nano .env
```

```bash
# In .env file - at least one provider required
OPENROUTER_API_KEY=sk-or-v1-xxxxx
# or
ANTHROPIC_API_KEY=sk-ant-xxxxx
# or
OPENAI_API_KEY=sk-xxxxx
```

```bash
# Build and start gateway service
docker compose up -d nullclaw

# Check logs
docker compose logs -f nullclaw

# Test health endpoint
curl http://localhost:3000/health
```

Your AI agent is now running at http://localhost:3000! 🎉
```bash
# Start gateway on port 3000
docker compose up -d nullclaw

# Access the API
curl http://localhost:3000/health
```

```bash
# Start interactive CLI agent
docker compose --profile agent up -d nullclaw-agent

# View agent logs
docker compose logs -f nullclaw-agent
```

```bash
# Add to .env
TELEGRAM_BOT_TOKEN=123456:ABCDEF
TELEGRAM_ALLOW_FROM=123456789   # Your Telegram user ID

# Restart service
docker compose restart nullclaw
```

```bash
# At least one provider API key
OPENROUTER_API_KEY=   # Recommended - supports 100+ models
ANTHROPIC_API_KEY=    # Claude models
OPENAI_API_KEY=       # GPT models
GROQ_API_KEY=         # Fast inference
```

```bash
# Model Selection
DEFAULT_MODEL=openrouter/anthropic/claude-sonnet-4

# Port (Railway convention)
PORT=3000

# Host binding (optional, defaults to 0.0.0.0)
# GATEWAY_HOST=0.0.0.0

# Autonomy & Security
AUTONOMY_LEVEL=supervised   # supervised, semi_autonomous, full
WORKSPACE_ONLY=true
MAX_ACTIONS_PER_HOUR=20
SANDBOX_BACKEND=auto        # auto, landlock, firejail, bubblewrap, docker, none
AUDIT_ENABLED=true

# Memory Backend
MEMORY_BACKEND=sqlite       # sqlite, markdown
```

```bash
TELEGRAM_BOT_TOKEN=123456:ABCDEF
TELEGRAM_ALLOW_FROM=123456789   # Comma-separated user IDs or "*"
```

```bash
DISCORD_TOKEN=your-bot-token
DISCORD_GUILD_ID=123456789
DISCORD_ALLOW_FROM=123456789
```

```bash
SLACK_BOT_TOKEN=xoxb-xxxxx
SLACK_APP_TOKEN=xapp-xxxxx
SLACK_ALLOW_FROM=U123456
```

See GUIDE_ADD_NEW_CHANNELS.md for detailed instructions.
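The `*_ALLOW_FROM` variables accept either a comma-separated list of user IDs or `"*"` for everyone. As an illustration of that matching rule only (NullClaw itself is compiled Zig, so this is not the real implementation), the semantics can be sketched in shell:

```bash
#!/bin/sh
# Sketch of the ALLOW_FROM matching rule described above:
# "*" allows any sender; otherwise the sender ID must appear
# exactly in the comma-separated allow list.
is_allowed() {
  allow_list="$1"   # e.g. "123,456" or "*"
  sender_id="$2"
  [ "$allow_list" = "*" ] && return 0
  case ",$allow_list," in
    *",$sender_id,"*) return 0 ;;
    *) return 1 ;;
  esac
}

is_allowed "123456789,987654321" "987654321" && echo "allowed"
is_allowed "123456789" "555" || echo "denied"
is_allowed "*" "anyone" && echo "allowed"
```

Note the wrapping commas in the `case` pattern: they prevent a partial ID like `555` from matching an allow list containing `15555`.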
| Provider | API Key Variable | Popular Models |
|---|---|---|
| OpenRouter | OPENROUTER_API_KEY | claude-sonnet-4, gpt-4o, llama-3.3-70b |
| Anthropic | ANTHROPIC_API_KEY | claude-3-5-sonnet, claude-3-opus |
| OpenAI | OPENAI_API_KEY | gpt-4o, gpt-4-turbo, o1-preview |
| Groq | GROQ_API_KEY | llama-3.3-70b-versatile, mixtral-8x7b |
| xAI | XAI_API_KEY | grok-beta |
| DeepSeek | DEEPSEEK_API_KEY | deepseek-chat, deepseek-coder |
| Mistral | MISTRAL_API_KEY | mistral-large-latest, codestral-latest |
| Ollama | OLLAMA_API_KEY | (base URL: http://localhost:11434) |
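`DEFAULT_MODEL` selects one of the models above using a provider-prefixed path, as in the `openrouter/anthropic/claude-sonnet-4` example earlier. The exact prefix syntax for other providers is an assumption extrapolated from that example; check NullClaw's `config.example.json` for the authoritative format.

```bash
# Route through OpenRouter (prefix: openrouter/<vendor>/<model>)
DEFAULT_MODEL=openrouter/anthropic/claude-sonnet-4

# Direct-provider paths (assumed syntax - verify against config.example.json)
# DEFAULT_MODEL=anthropic/claude-3-5-sonnet
# DEFAULT_MODEL=groq/llama-3.3-70b-versatile

# Local models via Ollama (assumes a local server on port 11434)
# DEFAULT_MODEL=ollama/llama3
```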
- Push this template to GitHub
- Connect repository to Railway
- Set environment variables in Railway dashboard
- Deploy! Railway will use the `railway.json` configuration
Configuration:
- Builder: Dockerfile
- Replicas: 1
- Restart: ON_FAILURE (max 10 retries)
- Health check: `GET /health` every 30s
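The settings above correspond roughly to a `railway.json` like the following. This is a sketch using Railway's config-as-code field names; the file shipped in this repo is authoritative.

```json
{
  "build": {
    "builder": "DOCKERFILE",
    "dockerfilePath": "Dockerfile"
  },
  "deploy": {
    "numReplicas": 1,
    "restartPolicyType": "ON_FAILURE",
    "restartPolicyMaxRetries": 10,
    "healthcheckPath": "/health"
  }
}
```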
```bash
# Build image
docker compose build nullclaw

# Run with environment file
docker compose up -d nullclaw

# Or with inline environment variables
docker run -d \
  -p 3000:3000 \
  -e OPENROUTER_API_KEY=sk-or-v1-xxxxx \
  -e DEFAULT_MODEL=openrouter/anthropic/claude-sonnet-4 \
  nullclaw-railway:latest
```

```
nullclaw-railway-template/
├── Dockerfile                  # Multi-stage build (Zig compilation)
├── docker-compose.yml          # Service orchestration
├── entrypoint.sh               # Config generation script
├── config.template.json        # Configuration template
├── .env.example                # Environment variable reference
├── railway.json                # Railway.app deployment config
├── GUIDE_ADD_NEW_CHANNELS.md   # Channel extension guide
├── AGENTS.md                   # AI agent guidelines
└── README.md                   # This file
```

- Build Time: the Dockerfile clones NullClaw from GitHub and compiles it with Zig
- Runtime: `entrypoint.sh` generates `config.json` from environment variables
- Startup: the NullClaw binary starts with the generated configuration
- Health Check: an HTTP endpoint at `/health` monitors service status
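The runtime step (environment variables rendered into `config.json`) follows a common entrypoint pattern. The sketch below illustrates that pattern only; it is not the actual `entrypoint.sh`, and the key names are placeholders rather than NullClaw's real schema (see `config.template.json`).

```bash
#!/bin/sh
# Illustrative sketch: render a minimal config.json from env vars.
# Key names are placeholders, not NullClaw's actual schema.
set -eu

CONFIG_DIR="${CONFIG_DIR:-./.nullclaw}"
mkdir -p "$CONFIG_DIR"

cat > "$CONFIG_DIR/config.json" <<EOF
{
  "model": "${DEFAULT_MODEL:-openrouter/anthropic/claude-sonnet-4}",
  "gateway": {
    "host": "${GATEWAY_HOST:-0.0.0.0}",
    "port": ${PORT:-3000}
  },
  "autonomy": "${AUTONOMY_LEVEL:-supervised}"
}
EOF

echo "wrote $CONFIG_DIR/config.json"
```

Because the heredoc delimiter is unquoted, the shell expands each `${VAR:-default}` inline, so every setting has a sane default when the variable is unset.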
- Check NullClaw's `config.example.json`
- Add environment variables to `.env.example`
- Update `build_channels()` in `entrypoint.sh`
- Test with `docker compose up -d nullclaw`
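As a rough illustration of the `build_channels()` step, a new channel block might look like the sketch below. The channel name, variable, and JSON shape here are hypothetical; mirror whatever the existing Telegram/Discord blocks in `entrypoint.sh` actually emit.

```bash
#!/bin/sh
# Hypothetical sketch of extending build_channels() with a new
# channel; the real JSON schema must match config.template.json.
build_channels() {
  channels=""
  # ... existing telegram/discord/slack blocks would go here ...

  # Hypothetical new channel: Matrix
  if [ -n "${MATRIX_ACCESS_TOKEN:-}" ]; then
    channels="${channels}{\"type\":\"matrix\",\"token\":\"${MATRIX_ACCESS_TOKEN}\"}"
  fi
  printf '[%s]' "$channels"
}

export MATRIX_ACCESS_TOKEN=syt_example
build_channels   # emits a JSON array containing the matrix entry
```

The guard on the environment variable keeps unused channels out of the generated config entirely, matching the opt-in behavior described above.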
See GUIDE_ADD_NEW_CHANNELS.md for detailed examples.
```bash
# Check logs
docker compose logs nullclaw

# Verify environment variables
docker compose exec nullclaw env | grep API_KEY

# Check generated config
docker compose exec nullclaw cat /nullclaw-data/.nullclaw/config.json
```

```bash
# Test endpoint manually
curl -v http://localhost:3000/health

# Check if port is in use
lsof -i :3000
```

```bash
# Rebuild container
docker compose down
docker compose build nullclaw
docker compose up -d nullclaw
```

Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is open source and available under the MIT License.
- NullClaw - The AI agent framework
- Railway.app - Simplified deployment platform
- All the amazing LLM providers making AI accessible
Made with ❤️ for the AI community