An MCP (Model Context Protocol) server that provides chat and image analysis capabilities through OpenRouter.ai's diverse model ecosystem, combining direct text chat with powerful image analysis in a single server.
Text Chat:
- Direct access to all OpenRouter.ai chat models
- Support for simple text and multimodal conversations
- Configurable temperature and other parameters
Image Analysis:
- Analyze single images with custom questions
- Process multiple images simultaneously
- Automatic image resizing and optimization
- Support for various image sources (local files, URLs, data URLs); see the example below
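For example, a single local image can be analyzed through the chat completion tool documented later in this README. This is a hedged sketch: the tool name comes from the usage section below, but the assumption that a plain local path (rather than a data URL) is accepted in `image_url` may not hold for every version.

```
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "mcp_openrouter_chat_completion",
  arguments: {
    model: "qwen/qwen2.5-vl-32b-instruct:free",
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "Describe the objects in this photo." },
          // Assumption: a local absolute path is accepted here and converted
          // to a base64 data URL by the server before the API request.
          { type: "image_url", image_url: { url: "/path/to/photo.jpg" } }
        ]
      }
    ]
  }
});
```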
Model Selection:
- Search and filter available models (see the sketch after this list)
- Validate model IDs
- Get detailed model information
- Support for default model configuration
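The exact tool names for model discovery vary by server version, so the sketch below is illustrative only: `search_models` and `get_model_info` are assumed names and the argument shapes are guesses; list the server's tools from your MCP client to see what is actually exposed.

```
// Illustrative only: "search_models" and "get_model_info" are assumed tool
// names; check your server's tool list for the real ones.
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "search_models",
  arguments: {
    query: "claude" // free-text filter over available model IDs
  }
});

use_mcp_tool({
  server_name: "openrouter",
  tool_name: "get_model_info",
  arguments: {
    model: "anthropic/claude-3.5-sonnet"
  }
});
```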
Performance Optimization:
- Smart model information caching
- Exponential backoff for retries (illustrated below)
- Automatic rate limit handling
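As an illustration of the retry behavior described above, here is a minimal, assumed sketch of exponential backoff with rate-limit awareness (not the server's actual implementation):

```
// Sketch only: retry with exponential backoff, giving up immediately on
// non-retryable client errors (e.g. 400, 401) but retrying 429s and 5xx.
async function withRetries(fn, maxAttempts = 5) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      const retryable =
        err?.status === 429 || err?.status >= 500 || err?.status === undefined;
      if (!retryable) throw err;
      if (attempt < maxAttempts - 1) {
        const delayMs = 1000 * 2 ** attempt; // 1s, 2s, 4s, 8s, ...
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}

// Usage: wrap any OpenRouter call, e.g.
// await withRetries(() => openai.chat.completions.create({ /* ... */ }));
```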
Improved OS Compatibility:
- Enhanced path handling for Windows, macOS, and Linux
- Better support for Windows-style paths with drive letters
- Normalized path processing for consistent behavior across platforms (see the sketch below)
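Roughly, this normalization boils down to resolving whatever path the client sends into a consistent absolute form before the file is read. The sketch below is an assumption about the approach, not the server's actual code:

```
import path from 'node:path';
import { pathToFileURL } from 'node:url';

// Sketch only: resolve user-supplied paths so "C:\\Users\\me\\cat.png" and
// "/home/me/cat.png" are handled alike on their respective platforms.
function normalizeImagePath(input) {
  // path.resolve applies the platform's rules (drive letters on Windows,
  // forward slashes on POSIX) and collapses "." and ".." segments.
  const absolute = path.resolve(input.trim());
  return { absolute, fileUrl: pathToFileURL(absolute).href };
}

console.log(normalizeImagePath('./images/../images/cat.png'));
```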
MCP Configuration Support:
- Cursor MCP integration without requiring environment variables
- Direct configuration via MCP parameters
- Flexible API key and model specification options
Robust Error Handling:
- Improved fallback mechanisms for image processing
- Better error reporting with specific diagnostics
- Multiple backup strategies for file reading (see the sketch below)
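As a rough, assumed sketch (not the server's actual code), "multiple backup strategies" means trying one image-source strategy after another and surfacing a specific error only when all of them fail:

```
import { readFile } from 'node:fs/promises';

// Sketch only: load image bytes from a data URL, a remote URL, or a local
// file, reporting a specific error when the chosen strategy fails.
async function loadImageBytes(source) {
  if (source.startsWith('data:')) {
    return Buffer.from(source.split(',')[1], 'base64');
  }
  if (/^https?:\/\//.test(source)) {
    const res = await fetch(source); // global fetch is available on Node 18+
    if (!res.ok) throw new Error(`HTTP ${res.status} while fetching ${source}`);
    return Buffer.from(await res.arrayBuffer());
  }
  try {
    return await readFile(source); // local file path
  } catch (err) {
    throw new Error(`Could not read image from "${source}": ${err.message}`);
  }
}
```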
Image Processing Enhancements:
- More reliable base64 encoding for all image types
- Fallback options when the Sharp module is unavailable (see the sketch after this list)
- Better handling of large images with automatic optimization
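The optimization-with-fallback idea can be sketched as follows; this is an assumption about the approach, not the server's actual code. If Sharp is missing or fails, the original bytes are encoded unchanged:

```
// Sketch only: resize large images with Sharp when available, otherwise fall
// back to the original bytes so the image still reaches the model as base64.
async function toBase64Optimized(buffer, maxWidth = 1024) {
  try {
    const sharp = (await import('sharp')).default;
    const resized = await sharp(buffer)
      .resize({ width: maxWidth, withoutEnlargement: true })
      .jpeg({ quality: 80 })
      .toBuffer();
    return resized.toString('base64');
  } catch {
    // Sharp missing or processing failed: fall back to the unmodified image.
    return buffer.toString('base64');
  }
}
```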
Install globally with npm:

```
npm install -g @stabgan/openrouter-mcp-multimodal
```

Or run directly with Docker:

```
docker run -i -e OPENROUTER_API_KEY=your-api-key-here stabgandocker/openrouter-mcp-multimodal:latest
```

- Get your OpenRouter API key from OpenRouter Keys
- Choose a default model (optional)
Add one of the following configurations to your MCP settings file (e.g., cline_mcp_settings.json or claude_desktop_config.json):
{ "mcpServers": { "openrouter": { "command": "npx", "args": [ "-y", "@stabgan/openrouter-mcp-multimodal" ], "env": { "OPENROUTER_API_KEY": "your-api-key-here", "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free" } } } }{ "mcpServers": { "openrouter": { "command": "uv", "args": [ "run", "-m", "openrouter_mcp_multimodal" ], "env": { "OPENROUTER_API_KEY": "your-api-key-here", "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free" } } } }{ "mcpServers": { "openrouter": { "command": "docker", "args": [ "run", "--rm", "-i", "-e", "OPENROUTER_API_KEY=your-api-key-here", "-e", "DEFAULT_MODEL=qwen/qwen2.5-vl-32b-instruct:free", "stabgandocker/openrouter-mcp-multimodal:latest" ] } } }{ "mcpServers": { "openrouter": { "command": "smithery", "args": [ "run", "stabgan/openrouter-mcp-multimodal" ], "env": { "OPENROUTER_API_KEY": "your-api-key-here", "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free" } } } }For comprehensive examples of how to use this MCP server, check out the examples directory. We provide:
- JavaScript examples for Node.js applications
- Python examples with interactive chat capabilities
- Code snippets for integrating with various applications
Each example comes with clear documentation and step-by-step instructions.
This project uses the following key dependencies:
- @modelcontextprotocol/sdk: ^1.8.0 - Latest MCP SDK for tool implementation
- openai: ^4.89.1 - OpenAI-compatible API client for OpenRouter
- sharp: ^0.33.5 - Fast image processing library
- axios: ^1.8.4 - HTTP client for API requests
- node-fetch: ^3.3.2 - Modern fetch implementation
Node.js 18 or later is required. All dependencies are regularly updated to ensure compatibility and security.
Send text or multimodal messages to OpenRouter models:
```
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "mcp_openrouter_chat_completion",
  arguments: {
    model: "google/gemini-2.5-pro-exp-03-25:free", // Optional if default is set
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the capital of France?"
      }
    ],
    temperature: 0.7 // Optional, defaults to 1.0
  }
});
```

For multimodal messages with images:
```
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "mcp_openrouter_chat_completion",
  arguments: {
    model: "anthropic/claude-3.5-sonnet",
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text: "What's in this image?"
          },
          {
            type: "image_url",
            image_url: {
              url: "https://example.com/image.jpg"
            }
          }
        ]
      }
    ]
  }
});
```