Commit a7ced4f · Parent(s): 0f92db1 · Geevarghese George · update readme

README.md CHANGED

@@ -9,71 +9,95 @@ app_file: app.py

Removed from the old README (most deleted lines lost their content in the diff rendering; the unchanged YAML front matter is repeated in the new version below):

- `# Instructions`
- `2. Store as GITHUB_PAT in a .env file in the root directory of the project.`
- two empty `##` headings
- a fenced MCP client config for the GitHub MCP server, of which only the closing portion survives:

```json
      "id": "github_token",
      "description": "GitHub Personal Access Token",
      "password": true
    }
  ],
  "servers": {
    "github": {
      "command": "podman",
      "args": [
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "-e", "GITHUB_READ_ONLY=1",
        "-e", "GITHUB_TOOLSETS=default",
        "ghcr.io/github/github-mcp-server"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "${input:github_token}",
        "GITHUB_READ_ONLY": "1",
        "GITHUB_TOOLSETS": "default"
      }
    }
  }
}
```

New README:

pinned: false
license: mit
short_description: MCP for Agents that plan your python package upgrade
hf_oauth: true
hf_oauth_scopes:
- read-repos
tags:
- building-mcp-track-enterprise
- building-mcp-track-customer
- mcp-in-action-track-customer
- mcp-in-action-track-enterprise
---

# FixMyEnv: Package Upgrade Advisor

An AI-powered Gradio app (and MCP server) that analyzes your Python project, finds outdated or vulnerable dependencies, and recommends upgrades. Attach a `pyproject.toml` or `requirements.txt`, chat with the agent, and it will pull package data via GitHub MCP and run `uv` resolution to suggest safe versions.
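
The `uv` resolution step amounts to re-resolving your manifest with candidate upgrades and keeping only version sets that still resolve cleanly. The sketch below illustrates the idea with a hypothetical helper; it is not the app's actual code and assumes `uv` is on your `PATH`:

```python
# Hypothetical sketch of the resolution step: ask uv to re-resolve a manifest,
# optionally forcing specific packages to newer versions. Not the app's code.
import subprocess
from pathlib import Path

def resolve_with_uv(manifest: Path, upgrade: list[str] | None = None) -> str:
    """Return fully pinned versions that uv considers mutually compatible."""
    cmd = ["uv", "pip", "compile", str(manifest), "--no-header"]
    for pkg in upgrade or []:
        cmd += ["--upgrade-package", pkg]  # e.g. "requests" or "requests==2.32.3"
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    print(resolve_with_uv(Path("requirements.txt"), upgrade=["requests"]))
```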

## What you get

- Gradio chat UI with file uploads for dependency manifests.
- Smolagents-based reasoning backed by the Hugging Face Inference API.
- GitHub MCP client for package metadata; `uv` for dependency resolution.
- Runs locally with your own tokens; can also be served from Hugging Face Spaces.

## Prerequisites

- Python 3.10+
- `git` and a virtual environment tool (`python -m venv` works fine)
- Hugging Face access token with Inference API rights (`HF_TOKEN`)
- GitHub Personal Access Token with public repo read scope (`GITHUB_PAT`)
- Optional: Podman or Docker if you want to run the GitHub MCP server locally instead of using the hosted Copilot MCP endpoint.
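
If you want to sanity-check these prerequisites before installing anything, a throwaway script along these lines works (illustrative only, not part of the repo):

```python
# check_env.py -- optional pre-flight check mirroring the prerequisites above.
import os
import shutil
import sys

assert sys.version_info >= (3, 10), "Python 3.10+ is required"
print("git:", "found" if shutil.which("git") else "MISSING")
print("container runtime:",
      "found" if (shutil.which("podman") or shutil.which("docker"))
      else "not found (only needed for a local MCP server)")
for var in ("HF_TOKEN", "GITHUB_PAT"):
    print(f"{var}:", "set" if os.getenv(var) else "not set")
```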

## Setup

1. Clone and enter the repo:
   ```bash
   git clone <your-fork-url> upgrade-advisor
   cd upgrade-advisor
   ```
2. Create and activate a virtual environment:
   ```bash
   python -m venv .venv
   source .venv/bin/activate
   ```
3. Install dependencies (editable mode so local changes are picked up):
   ```bash
   pip install -e .
   ```
   Alternatively: `pip install -r requirements.txt`.
4. Create a `.env` in the project root:
   ```dotenv
   GITHUB_PAT=ghp_********************************
   HF_TOKEN=hf_***********************************
   # Optional tweaks
   GITHUB_TOOLSETS="repos"  # or "default,discussions,experiments"
   GITHUB_READ_ONLY=1
   AGENT_MODEL=Qwen/Qwen3-Next-80B-A3B-Thinking
   HF_INFERENCE_PROVIDER=together
   GRADIO_SERVER_NAME=0.0.0.0
   GRADIO_SERVER_PORT=7860
   ```
   The app will warn on missing tokens but will not function fully without them.
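
For context on that warning, token loading can be as simple as the following sketch, which assumes `python-dotenv`; the real `app.py` may structure this differently:

```python
# Minimal sketch of .env loading and the missing-token warning, assuming
# python-dotenv; the actual app.py may differ.
import os
import warnings
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root into the process environment

for var in ("GITHUB_PAT", "HF_TOKEN"):
    if not os.getenv(var):
        warnings.warn(f"{var} is not set; some features will not work.")

# Optional settings fall back to defaults when absent.
AGENT_MODEL = os.getenv("AGENT_MODEL", "Qwen/Qwen3-Next-80B-A3B-Thinking")
GRADIO_SERVER_PORT = int(os.getenv("GRADIO_SERVER_PORT", "7860"))
```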

## Run the app

```bash
python app.py
```

- Gradio starts at `http://127.0.0.1:7860` by default.
- Sign in with your Hugging Face account when prompted (or rely on `HF_TOKEN`).
- Ask upgrade questions and optionally upload `pyproject.toml` or `requirements.txt`.
- Uploaded files are placed in `uploads/` for the session and cleaned up on exit.
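
To drive the running app from a script instead of the browser, `gradio_client` can introspect its endpoints; this is a usage sketch, and the actual endpoint names depend on how `app.py` defines the interface:

```python
# Usage sketch: connect to the locally running app and list its callable API.
# Inspect the output before calling predict(), since endpoint names depend on app.py.
from gradio_client import Client

client = Client("http://127.0.0.1:7860")
client.view_api()  # prints the named endpoints and their parameters
```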

## Optional: run the GitHub MCP server locally

The app defaults to the hosted Copilot MCP endpoint. To use a local MCP server instead:

```bash
podman run -i --rm \
  -e GITHUB_PERSONAL_ACCESS_TOKEN=$GITHUB_PAT \
  -e GITHUB_READ_ONLY=1 \
  -e GITHUB_TOOLSETS="default" \
  ghcr.io/github/github-mcp-server
```

Update `app.py` to point to your local MCP server URL/transport if you take this route. Read more about GitHub MCP server setup [here](https://github.com/github/github-mcp-server).
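
For the app-side change, one plausible wiring is to hand smolagents a stdio MCP connection that spawns the same container; the snippet below is a sketch under that assumption (it mirrors the podman command above) and is not a drop-in patch for `app.py`:

```python
# Sketch: point a smolagents tool collection at a locally spawned GitHub MCP
# server over stdio instead of the hosted endpoint. Assumes the mcp package and
# smolagents' ToolCollection.from_mcp; app.py's real wiring may differ.
import os
from mcp import StdioServerParameters
from smolagents import ToolCollection

server_params = StdioServerParameters(
    command="podman",
    args=[
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "-e", "GITHUB_READ_ONLY=1",
        "-e", "GITHUB_TOOLSETS=default",
        "ghcr.io/github/github-mcp-server",
    ],
    env={"GITHUB_PERSONAL_ACCESS_TOKEN": os.environ["GITHUB_PAT"]},
)

with ToolCollection.from_mcp(server_params, trust_remote_code=True) as tool_collection:
    print([tool.name for tool in tool_collection.tools])  # GitHub tools exposed by the server
```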

## Developing and testing

- Code lives in `src/upgrade_advisor/`; the Gradio entry point is `app.py`.
- Tooling and prompts for the agent are under `src/upgrade_advisor/agents/`.
- Basic samples for dependency files are in `tests/`.
- Run checks (none yet by default) with `pytest`; a starter smoke test is sketched below.
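
Since there are no checks yet, a couple of smoke tests make a reasonable starting point; the file name and glob patterns below are assumptions about the samples in `tests/`:

```python
# tests/test_smoke.py -- minimal starter checks; the glob patterns assume the
# sample dependency files in tests/ are *.txt / *.toml, adjust as needed.
from pathlib import Path


def test_package_imports():
    import upgrade_advisor  # noqa: F401  (fails if the editable install is broken)


def test_sample_dependency_files_exist():
    here = Path(__file__).parent
    samples = list(here.glob("*.txt")) + list(here.glob("*.toml"))
    assert samples, "expected sample dependency files in tests/"
```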

## Troubleshooting

- **Missing tokens**: ensure `GITHUB_PAT` and `HF_TOKEN` are in `.env` or your shell.
- **Model choice**: set `AGENT_MODEL`/`CHAT_MODEL` if you want to swap the default Qwen model.
- **Port conflicts**: override `GRADIO_SERVER_PORT` in `.env`.