---
title: FixMyEnv Agent
emoji: 🐍
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 6.0.1
app_file: app.py
pinned: false
license: mit
short_description: MCP for Agents that plan your python package upgrade
hf_oauth: true
---

# FixMyEnv Agent
An AI-powered Gradio app (and MCP server) that analyzes your Python project, finds outdated or vulnerable dependencies, and recommends upgrades. Attach a `pyproject.toml` or `requirements.txt`, chat with the agent, and it will pull package data via GitHub MCP and run `uv` resolution to suggest safe versions.
- App: https://huggingface.co/spaces/MCP-1st-Birthday/FixMyEnv
- Demo Video: https://www.youtube.com/watch?v=u1-gZqPu0R0
- Social Post: LinkedIn
## Features

- Gradio chat UI with file uploads for dependency manifests.
- Smolagents-based reasoning backed by the Hugging Face Inference API.
- GitHub MCP client for package metadata; `uv` for dependency resolution (see the sketch after this list).
- Runs locally with your own tokens; can also be served from Hugging Face Spaces.
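The real tool definitions live in `src/upgrade_advisor/agents/`; purely as an illustration of the `uv` resolution step, a standalone smolagents tool could look roughly like this (the function name `resolve_with_uv` and the wiring are hypothetical, assuming a recent smolagents release):

```python
import subprocess

from smolagents import tool


@tool
def resolve_with_uv(manifest_path: str) -> str:
    """Resolve a dependency manifest to pinned versions with uv.

    Args:
        manifest_path: Path to a requirements.txt or pyproject.toml.
    """
    # `uv pip compile` prints the fully resolved dependency set to stdout.
    result = subprocess.run(
        ["uv", "pip", "compile", manifest_path],
        capture_output=True,
        text=True,
    )
    return result.stdout if result.returncode == 0 else result.stderr
```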
## Prerequisites

- Python 3.10+
- `git` and a virtual environment tool (`python -m venv` works fine)
- Hugging Face access token with Inference API rights (`HF_TOKEN`)
- GitHub Personal Access Token with public repo read scope (`GITHUB_PAT`)
- Optional: Podman or Docker if you want to run the GitHub MCP server locally instead of using the hosted Copilot MCP endpoint.
## Setup

- Clone and enter the repo:

  ```bash
  git clone <your-fork-url> upgrade-advisor
  cd upgrade-advisor
  ```

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate
  ```

- Install dependencies (editable mode so local changes are picked up): `pip install -e .`. Alternatively: `pip install -r requirements.txt`.

- Create a `.env` in the project root (a loading sketch follows this list):

  ```bash
  GITHUB_PAT=ghp_********************************
  HF_TOKEN=hf_***********************************
  # Optional tweaks
  GITHUB_TOOLSETS="repos"  # or "default,discussions,experiments"
  GITHUB_READ_ONLY=1
  AGENT_MODEL=Qwen/Qwen3-Next-80B-A3B-Thinking
  HF_INFERENCE_PROVIDER=together
  GRADIO_SERVER_NAME=0.0.0.0
  GRADIO_SERVER_PORT=7860
  ```

  The app will warn on missing tokens but will not function fully without them.
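For orientation, a minimal sketch of how these values might be read at startup, assuming `python-dotenv` (the actual loading logic lives in `app.py`):

```python
import os

from dotenv import load_dotenv

load_dotenv()  # pull GITHUB_PAT, HF_TOKEN, etc. from .env in the project root

HF_TOKEN = os.getenv("HF_TOKEN")
GITHUB_PAT = os.getenv("GITHUB_PAT")
AGENT_MODEL = os.getenv("AGENT_MODEL", "Qwen/Qwen3-Next-80B-A3B-Thinking")

# Mirrors the behavior described above: warn, but keep running.
if not (HF_TOKEN and GITHUB_PAT):
    print("Warning: HF_TOKEN and/or GITHUB_PAT missing; agent features will be limited.")
```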
## Run

```bash
python app.py
```

- Gradio starts at `http://127.0.0.1:7860` by default.
- Sign in with your Hugging Face account when prompted (or rely on `HF_TOKEN`).
- Ask upgrade questions and optionally upload `pyproject.toml` or `requirements.txt`.
- Uploaded files are placed in `uploads/` for the session and cleaned up on exit (sketched below).
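A rough sketch of that upload lifecycle (names like `stash_manifest` are hypothetical; the real handler lives in `app.py`):

```python
import atexit
import shutil
from pathlib import Path

UPLOADS = Path("uploads")
UPLOADS.mkdir(exist_ok=True)


def stash_manifest(tmp_path: str) -> Path:
    """Copy a Gradio-uploaded temp file into the session uploads/ dir."""
    dest = UPLOADS / Path(tmp_path).name
    shutil.copy(tmp_path, dest)
    return dest


# Remove the session directory when the process exits.
atexit.register(lambda: shutil.rmtree(UPLOADS, ignore_errors=True))
```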
## Local GitHub MCP server

The app defaults to the hosted Copilot MCP endpoint. To use a local MCP server instead:

```bash
podman run -i --rm \
  -e GITHUB_PERSONAL_ACCESS_TOKEN=$GITHUB_PAT \
  -e GITHUB_READ_ONLY=1 \
  -e GITHUB_TOOLSETS="default" \
  ghcr.io/github/github-mcp-server
```

Update `app.py` to point to your local MCP server URL/transport if you take this route; a connection sketch follows. Read more about GitHub MCP server setup in the [github-mcp-server repo](https://github.com/github/github-mcp-server).
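As a sketch of that rewiring, assuming smolagents' `MCPClient` over the stdio transport (the exact integration point inside `app.py` will differ):

```python
import os

from mcp import StdioServerParameters
from smolagents import InferenceClientModel, MCPClient, ToolCallingAgent

# Spawn the local GitHub MCP server via podman and talk to it over stdio.
server = StdioServerParameters(
    command="podman",
    args=[
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "-e", "GITHUB_READ_ONLY=1",
        "-e", "GITHUB_TOOLSETS=default",
        "ghcr.io/github/github-mcp-server",
    ],
    env={
        "GITHUB_PERSONAL_ACCESS_TOKEN": os.environ["GITHUB_PAT"],
        "PATH": os.environ["PATH"],  # keep podman resolvable in the child env
    },
)

with MCPClient(server) as tools:
    model = InferenceClientModel(
        model_id=os.getenv("AGENT_MODEL", "Qwen/Qwen3-Next-80B-A3B-Thinking")
    )
    agent = ToolCallingAgent(tools=tools, model=model)
    print(agent.run("Summarize the latest release of gradio-app/gradio."))
```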
## Development

- Code lives in `src/upgrade_advisor/`; the Gradio entry point is `app.py`.
- Tooling and prompts for the agent are under `src/upgrade_advisor/agents/`.
- Sample dependency files for testing are in `tests/`.
- Run checks with `pytest` (none ship yet by default; a starter sketch follows this list).
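Since the suite is currently empty, a hypothetical first test against those samples might look like this (the file name and glob patterns are assumptions):

```python
# tests/test_samples.py
from pathlib import Path


def test_sample_manifests_are_present_and_non_empty():
    samples = list(Path("tests").glob("*.txt")) + list(Path("tests").glob("*.toml"))
    assert samples, "expected sample dependency files in tests/"
    for sample in samples:
        assert sample.read_text().strip(), f"{sample} is empty"
```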
## Troubleshooting

- Missing tokens: ensure `GITHUB_PAT` and `HF_TOKEN` are in `.env` or your shell.
- Model choice: set `AGENT_MODEL`/`CHAT_MODEL` if you want to swap the default Qwen model.
- Port conflicts: override `GRADIO_SERVER_PORT` in `.env`.