This directory contains the comprehensive test suite for the Code-Index-MCP project.
- `conftest.py` - Pytest configuration and shared fixtures
- `test_gateway.py` - Tests for FastAPI endpoints and API gateway
- `test_dispatcher.py` - Tests for plugin routing and caching logic
- `test_sqlite_store.py` - Tests for SQLite persistence layer
- `test_watcher.py` - Tests for file system monitoring
- `test_python_plugin.py` - Tests for Python language plugin
```bash
# Install dependencies
pip install -r requirements.txt
pip install -e .

# Run all tests
pytest

# Run with coverage
pytest --cov=mcp_server --cov-report=html
```

```bash
# Run unit tests only
make test

# Run all tests with coverage
make test-all

# Run specific test categories
make test-unit
make test-integration
make benchmark
```

Tests are marked with the following categories:
- `unit` - Fast, isolated unit tests
- `integration` - Tests requiring external resources
- `slow` - Tests that take >1s to run
- `benchmark` - Performance benchmarks
- `e2e` - End-to-end tests
- `requires_db` - Tests requiring a database
- `requires_network` - Tests requiring network access
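As an illustration, a test can carry several of the markers listed above at once. This is a hypothetical sketch (the marker names come from the list; the test body and function name are invented for the example):

```python
import pytest

# Hypothetical example: combines two of the project's markers.
@pytest.mark.integration
@pytest.mark.requires_db
def test_store_roundtrip():
    """Integration-style test that would touch the database."""
    stored = {"symbol": "main"}
    assert stored["symbol"] == "main"
```

Selecting `pytest -m "integration and requires_db"` would pick this test up; `-m "not requires_db"` would skip it.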
Run specific categories:
```bash
# Only unit tests
pytest -m "unit"

# Integration tests only
pytest -m "integration"

# Exclude slow tests
pytest -m "not slow"
```

The project maintains >80% code coverage. Coverage reports are generated in:
- Terminal: use `--cov-report=term-missing`
- HTML: `htmlcov/index.html`
- XML: `coverage.xml` (for CI integration)
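CI scripts sometimes read the overall rate straight out of `coverage.xml`. A minimal sketch, assuming the Cobertura-style `line-rate` attribute that coverage.py writes on the root element (the threshold check is illustrative, not part of this project's pipeline):

```python
import xml.etree.ElementTree as ET

def coverage_rate(path="coverage.xml"):
    """Return total line coverage as a fraction from a coverage.py XML report."""
    root = ET.parse(path).getroot()
    return float(root.attrib["line-rate"])

# Example gate for the >80% target:
# if coverage_rate() < 0.80:
#     raise SystemExit("coverage below 80%")
```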
Tests run automatically on:
- Every push to `main` and `develop` branches
- All pull requests
- Daily at 2 AM UTC
The CI pipeline includes:
- Linting - Black, isort, flake8, mypy
- Testing - Multi-OS, multi-Python version matrix
- Coverage - Upload to Codecov
- Security - Safety and Bandit scans
- Performance - Benchmark tests
- Docker - Build verification
```python
class TestComponentName:
    """Test suite for ComponentName."""

    def test_specific_behavior(self):
        """Test that component does X when Y."""
        # Arrange
        # Act
        # Assert
```

Common fixtures from `conftest.py`:
```python
def test_with_sqlite(sqlite_store):
    """Test using SQLite store fixture."""
    # sqlite_store is a fresh SQLite instance

def test_with_mock_plugin(mock_plugin):
    """Test using mock plugin."""
    # mock_plugin is a Mock(spec=IPlugin)

def test_with_temp_files(temp_code_directory):
    """Test with temporary code files."""
    # temp_code_directory contains sample code files
```

Benchmark example:

```python
@pytest.mark.benchmark
def test_performance(benchmark_results):
    """Benchmark critical operations."""
    from conftest import measure_time

    with measure_time("operation_name", benchmark_results):
        # Code to benchmark
        pass
```

- Import errors: Ensure the package is installed with `pip install -e .`
- Fixture not found: Check that the test file can see the fixtures defined in `conftest.py`
- Slow tests: Use `-m "not slow"` to skip slow tests
- Database errors: Tests use in-memory SQLite by default
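For reference, fixtures like `sqlite_store` follow pytest's yield-fixture pattern. A minimal sketch using plain `sqlite3` (the real fixture lives in `conftest.py` and yields the project's store object, not a bare connection):

```python
import sqlite3
import pytest

@pytest.fixture
def sqlite_store():
    """Hypothetical sketch of a yield fixture: setup, hand off, teardown."""
    conn = sqlite3.connect(":memory:")  # fresh in-memory database per test
    yield conn                          # the test runs here
    conn.close()                        # teardown after the test finishes
```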
Run tests with verbose output and stop on first failure:
```bash
pytest -vvs -x
```

Use pytest's built-in debugger:
```bash
pytest --pdb
```

When adding new tests:
- Follow existing naming conventions
- Add appropriate markers (`@pytest.mark.unit`, etc.)
- Include docstrings explaining what is tested
- Aim for >80% coverage of new code
- Run `make lint` before committing
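Putting the checklist together, a new test might look like this (an illustrative sketch only; the class, test name, and body are invented for the example):

```python
import pytest

@pytest.mark.unit
class TestExample:
    """Test suite for an example component, following the conventions above."""

    def test_addition_returns_sum(self):
        """Test that adding two ints returns their sum."""
        # Arrange
        a, b = 2, 3
        # Act
        result = a + b
        # Assert
        assert result == 5
```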