---
title: LLM Code Deployment API
emoji: 🚀
colorFrom: indigo
colorTo: red
sdk: docker
sdk_version: 0.0.1
app_file: app/main.py
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
This Space provides a production-grade FastAPI backend for automated application generation and deployment, as described in the LLM Code Deployment project. It supports secure secret-based access, LLM-driven app generation, automated deployment to GitHub Pages, and evaluation callbacks—built with best practices for industry and Hugging Face Spaces Docker deployments.
- `POST /handle-task`: Receives and processes app brief requests, verifies secrets, and triggers app generation workflows.
- Dockerized Deployment: Secure, reproducible builds using a custom Dockerfile that follows Hugging Face recommendations.
- Secrets Management: Reads secrets via Hugging Face Spaces environment for maximum security (never hardcoded).
- GitHub Automation: Automatically creates public repos, populates README and LICENSE, and enables GitHub Pages.
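The secret check behind `/handle-task` can be sketched as a constant-time comparison against a Space secret. This is a minimal sketch, not the Space's actual code: the env var name `TASK_SECRET` is an assumption, standing in for whatever secret you configure in the Spaces UI.

```python
import hmac
import os


def verify_secret(provided: str) -> bool:
    """Check a client-supplied secret against the Space secret.

    TASK_SECRET is an assumed env var name; set it via the Spaces
    secrets UI so it is never hardcoded in the repository.
    """
    expected = os.environ.get("TASK_SECRET", "")
    # hmac.compare_digest runs in constant time, avoiding timing leaks
    # that a plain `==` comparison could expose.
    return bool(expected) and hmac.compare_digest(provided, expected)
```

Rejecting requests when the secret is unset (the `bool(expected)` guard) avoids accidentally accepting empty secrets on a misconfigured Space.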
- Set up the required Space secrets and tokens via the Spaces UI.
- Deploy or push your code to this Space.
- POST a task request (see the `/handle-task` endpoint documentation).
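A task request can be assembled with the standard library alone. The payload fields (`secret`, `brief`) and the Space URL below are illustrative assumptions, not the documented schema; consult the `/handle-task` endpoint documentation for the actual fields.

```python
import json
import urllib.request

# Hypothetical payload shape; replace with the fields the endpoint
# actually documents.
payload = {
    "secret": "your-task-secret",  # must match the configured Space secret
    "brief": "Build a static landing page for a bakery",
}

req = urllib.request.Request(
    url="https://example-space.hf.space/handle-task",  # placeholder URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# To actually send it (requires the Space to be running):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```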
See `requirements.txt` for dependencies.
Your FastAPI entry point should be specified in `app/main.py`, matching `app_file` in the Space configuration.
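A minimal Dockerfile for this layout might look like the following. This is a sketch, assuming uvicorn as the ASGI server and an app object named `app` in `app/main.py`; Hugging Face Spaces routes traffic to port 7860.

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first to leverage Docker layer caching.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (entry point: app/main.py).
COPY app/ ./app/

# Hugging Face Spaces expects the server to listen on port 7860.
EXPOSE 7860
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "7860"]
```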
For further configuration options, please visit the Spaces Configuration Reference.