# Graphiti MCP Server

A powerful knowledge graph server for AI agents, built with Neo4j and integrated with the Model Context Protocol (MCP).
## Features

- Dynamic knowledge graph management with Neo4j
- Seamless integration with OpenAI models
- MCP (Model Context Protocol) support
- Docker-ready deployment
- Custom entity extraction capabilities
- Advanced semantic search functionality
## Prerequisites

- Docker and Docker Compose
- Python 3.10 or higher
- OpenAI API key
- 4 GB RAM minimum (8 GB recommended)
- 2 GB free disk space
## Quick Start

- Clone the repository:

  ```bash
  git clone https://github.com/gifflet/graphiti-mcp-server.git
  cd graphiti-mcp-server
  ```

- Set up environment variables:

  ```bash
  cp .env.sample .env
  ```

- Edit `.env` with your configuration:

  ```bash
  # Required for LLM operations
  OPENAI_API_KEY=your_openai_api_key_here
  MODEL_NAME=gpt-4.1-mini

  # Optional: Custom OpenAI endpoint (e.g., for proxies)
  # OPENAI_BASE_URL=https://api.openai.com/v1

  # Neo4j Configuration (defaults work with Docker)
  NEO4J_URI=bolt://neo4j:7687
  NEO4J_USER=neo4j
  NEO4J_PASSWORD=demodemo
  ```

- Start the services:

  ```bash
  docker compose up -d
  ```

- Verify the installation:

  ```bash
  # Check if services are running
  docker compose ps

  # Check logs
  docker compose logs graphiti-mcp
  ```
## Services

| Service | Port | Purpose |
|---|---|---|
| Neo4j Browser | 7474 | Web interface for graph visualization |
| Neo4j Bolt | 7687 | Database connection |
| Graphiti MCP | 8000 | MCP server endpoint |
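If you'd rather script the check than eyeball `docker compose ps`, here is a minimal Python sketch that probes each port, assuming the default localhost bindings from the table above:

```python
import socket

# Default host ports from the services table; adjust if you remapped them.
SERVICES = {"Neo4j Browser": 7474, "Neo4j Bolt": 7687, "Graphiti MCP": 8000}

for name, port in SERVICES.items():
    with socket.socket() as s:
        s.settimeout(2)
        status = "open" if s.connect_ex(("localhost", port)) == 0 else "closed"
    print(f"{name} (port {port}): {status}")
```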
## Environment Variables

### LLM Configuration

| Variable | Required | Default | Description |
|---|---|---|---|
| `OPENAI_API_KEY` | Yes | - | Your OpenAI API key |
| `OPENAI_BASE_URL` | No | - | Custom OpenAI API endpoint (consumed by the OpenAI SDK) |
| `MODEL_NAME` | No | `gpt-4.1-mini` | Main LLM model to use |
| `SMALL_MODEL_NAME` | No | `gpt-4.1-nano` | Small LLM model for lighter tasks |
| `LLM_TEMPERATURE` | No | `0.0` | LLM temperature (0.0-2.0) |
| `EMBEDDER_MODEL_NAME` | No | `text-embedding-3-small` | Embedding model |
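To make the required/optional split concrete, here is an illustrative Python sketch of how these variables and their documented defaults could be resolved; the `llm_settings` helper is hypothetical, not part of the server's API:

```python
import os

def llm_settings() -> dict:
    """Resolve the documented LLM variables with their defaults (illustrative)."""
    return {
        "api_key": os.environ["OPENAI_API_KEY"],   # required: raises KeyError if unset
        "base_url": os.getenv("OPENAI_BASE_URL"),  # optional, may be None
        "model": os.getenv("MODEL_NAME", "gpt-4.1-mini"),
        "small_model": os.getenv("SMALL_MODEL_NAME", "gpt-4.1-nano"),
        "temperature": float(os.getenv("LLM_TEMPERATURE", "0.0")),
        "embedder": os.getenv("EMBEDDER_MODEL_NAME", "text-embedding-3-small"),
    }

print(llm_settings())
```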
### Neo4j Connection

| Variable | Required | Default | Description |
|---|---|---|---|
| `NEO4J_URI` | No | `bolt://neo4j:7687` | Neo4j connection URI |
| `NEO4J_USER` | No | `neo4j` | Neo4j username |
| `NEO4J_PASSWORD` | No | `demodemo` | Neo4j password |
### MCP Server

| Variable | Required | Default | Description |
|---|---|---|---|
| `MCP_SERVER_HOST` | No | - | MCP server host binding |
| `SEMAPHORE_LIMIT` | No | `10` | Concurrent operation limit for LLM calls |
### Azure OpenAI

For Azure OpenAI deployments, use these environment variables instead of the standard OpenAI configuration:
| Variable | Required | Default | Description |
|---|---|---|---|
| `AZURE_OPENAI_ENDPOINT` | Yes* | - | Azure OpenAI endpoint URL |
| `AZURE_OPENAI_API_VERSION` | Yes* | - | Azure OpenAI API version |
| `AZURE_OPENAI_DEPLOYMENT_NAME` | Yes* | - | Azure OpenAI deployment name |
| `AZURE_OPENAI_USE_MANAGED_IDENTITY` | No | `false` | Use Azure managed identity for auth |
| `AZURE_OPENAI_EMBEDDING_ENDPOINT` | No | - | Separate endpoint for embeddings |
| `AZURE_OPENAI_EMBEDDING_API_VERSION` | No | - | API version for embeddings |
| `AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME` | No | - | Deployment name for embeddings |
| `AZURE_OPENAI_EMBEDDING_API_KEY` | No | - | Separate API key for embeddings |

\* Required when using Azure OpenAI.
Notes:

- `OPENAI_BASE_URL` is consumed directly by the OpenAI Python SDK, which makes it useful for proxy configurations or custom endpoints.
- `SEMAPHORE_LIMIT` controls the number of concurrent LLM API calls: decrease it if you encounter rate limits, increase it for higher throughput.
- The Azure configuration is an alternative to standard OpenAI; don't mix the two.
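To illustrate what `SEMAPHORE_LIMIT` is bounding, here is a minimal sketch of the usual asyncio pattern for capping concurrent LLM calls; the `call_llm` function is a stand-in, not the server's actual code:

```python
import asyncio
import os

SEMAPHORE_LIMIT = int(os.getenv("SEMAPHORE_LIMIT", "10"))
semaphore = asyncio.Semaphore(SEMAPHORE_LIMIT)

async def call_llm(prompt: str) -> str:
    async with semaphore:         # at most SEMAPHORE_LIMIT calls in flight
        await asyncio.sleep(0.1)  # stand-in for a real LLM API request
        return f"response to {prompt!r}"

async def main() -> None:
    results = await asyncio.gather(*(call_llm(f"episode {i}") for i in range(25)))
    print(f"{len(results)} calls completed")

asyncio.run(main())
```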
## Neo4j Defaults

Default configuration for Neo4j:

- Username: `neo4j`
- Password: `demodemo`
- URI: `bolt://neo4j:7687` (within the Docker network)
- Memory settings optimized for development
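As a quick sanity check of these defaults, here is a minimal sketch using the official `neo4j` Python driver. Note that `bolt://neo4j:7687` only resolves inside the Docker network; from the host machine, use `bolt://localhost:7687` as below:

```python
from neo4j import GraphDatabase  # pip install neo4j

# Defaults from the list above; swap in bolt://neo4j:7687 inside the Docker network.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "demodemo"))
with driver.session() as session:
    print(session.run("RETURN 1 AS ok").single()["ok"])  # prints 1 if reachable
driver.close()
```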
## Running with Environment Variables

You can run with environment variables directly:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
```

For Azure OpenAI:

```bash
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com \
AZURE_OPENAI_API_VERSION=2024-02-01 \
AZURE_OPENAI_DEPLOYMENT_NAME=your-deployment \
OPENAI_API_KEY=your_key \
docker compose up
```

## Integration with Cursor

- Configure Cursor's MCP settings:
{ "mcpServers": { "Graphiti": { "command": "uv", "args": ["run", "graphiti_mcp_server.py"], "env": { "OPENAI_API_KEY": "your_key_here" } } } }- For Docker-based setup:
{ "mcpServers": { "Graphiti": { "url": "http://localhost:8000/sse" } } }- Add Graphiti rules to Cursor's User Rules (see
graphiti_cursor_rules.mdc) - Start an agent session in Cursor
## MCP Transports

The server supports the standard MCP transports:

- SSE (Server-Sent Events): `http://localhost:8000/sse`
- WebSocket: `ws://localhost:8000/ws`
- Stdio: direct process communication
## Development

- Install dependencies:

  ```bash
  # Using uv (recommended)
  curl -LsSf https://astral.sh/uv/install.sh | sh
  uv sync

  # Or using pip
  pip install -r requirements.txt
  ```

- Start Neo4j locally:

  ```bash
  docker run -d \
    --name neo4j-dev \
    -p 7474:7474 -p 7687:7687 \
    -e NEO4J_AUTH=neo4j/demodemo \
    neo4j:5.26.0
  ```

- Run the server:

  ```bash
  # Set environment variables
  export OPENAI_API_KEY=your_key
  export NEO4J_URI=bolt://localhost:7687

  # Run with stdio transport
  uv run graphiti_mcp_server.py

  # Or with SSE transport
  uv run graphiti_mcp_server.py --transport sse --use-custom-entities
  ```

## Testing

```bash
# Run basic connectivity test
curl http://localhost:8000/health

# Test MCP endpoint
curl http://localhost:8000/sse
```

A Python version of the connectivity check appears after the troubleshooting notes below.

## Troubleshooting

```bash
# Clean up and restart
docker compose down -v
docker compose up --build

# Check disk space
docker system df
```

### Logs

```bash
# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f graphiti-mcp
docker compose logs -f neo4j

# Enable debug logging (set the variable in the environment;
# `docker compose up` has no -e flag)
LOG_LEVEL=DEBUG docker compose up
```

Resource checklist:

- Memory: increase the Neo4j heap size in `docker-compose.yml`
- Storage: monitor Neo4j data volume usage
- Network: check for firewalls blocking ports 7474, 7687, and 8000
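The promised Python version of the connectivity test, a minimal sketch assuming the default port and the `/health` route used by the curl example above:

```python
import urllib.request

# Mirrors `curl http://localhost:8000/health` from the testing section.
with urllib.request.urlopen("http://localhost:8000/health", timeout=5) as resp:
    print("health:", resp.status, resp.read().decode())
```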
## Architecture

```
┌───────────────┐      ┌───────────────────┐      ┌───────────────┐
│  MCP Client   │      │   Graphiti MCP    │      │     Neo4j     │
│   (Cursor)    │─────▶│      Server       │─────▶│   Database    │
│               │      │    (Port 8000)    │      │  (Port 7687)  │
└───────────────┘      └───────────────────┘      └───────────────┘
                                 │
                                 ▼
                       ┌───────────────────┐
                       │    OpenAI API     │
                       │   (LLM Client)    │
                       └───────────────────┘
```

Core components:

- Neo4j Database: graph storage and querying
- Graphiti MCP Server: API layer and LLM operations
- OpenAI Integration: entity extraction and semantic processing
- MCP Protocol: standardized AI agent communication
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Acknowledgments

- The Neo4j team for the amazing graph database
- OpenAI for their powerful LLM models
- The MCP community for the protocol specification
- Graphiti Core for the knowledge graph framework
Need help? Open an issue or check our troubleshooting guide above.