---
title: LLM Code Deployment API
emoji: 🚀
colorFrom: indigo
colorTo: red
sdk: docker
sdk_version: 0.0.1
app_file: app/main.py
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# LLM Code Deployment API

This Space provides a production-grade FastAPI backend for automated application generation and deployment, as described in the LLM Code Deployment project. It supports secure secret-based access, LLM-driven app generation, automated deployment to GitHub Pages, and evaluation callbacks, and it follows best practices for production use and Hugging Face Spaces Docker deployments.

## Key Features

- **POST /handle-task**: Receives and processes app brief requests, verifies secrets, and triggers the app generation workflow.
- **Dockerized Deployment**: Secure, reproducible builds using a custom Dockerfile that follows Hugging Face recommendations.
- **Secrets Management**: Reads secrets from the Hugging Face Spaces environment for maximum security (never hardcoded).
- **GitHub Automation**: Automatically creates public repos, populates README and LICENSE, and enables GitHub Pages (see the sketch after this list).
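
As a rough illustration of the GitHub automation step, the sketch below creates a public repository and enables GitHub Pages through the GitHub REST API using `requests`. The token variable name (`GITHUB_TOKEN`), the repository name, and the Pages branch are assumptions made for this example, not values taken from this project's code.

```python
import os

import requests

API = "https://api.github.com"
HEADERS = {
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",  # assumed secret name
    "Accept": "application/vnd.github+json",
}


def create_repo_with_pages(owner: str, name: str) -> None:
    """Create a public, README-initialised repo and serve it via GitHub Pages."""
    # Create the repository under the authenticated user.
    resp = requests.post(
        f"{API}/user/repos",
        headers=HEADERS,
        json={"name": name, "private": False, "auto_init": True},
        timeout=30,
    )
    resp.raise_for_status()

    # Enable GitHub Pages from the root of the default branch.
    resp = requests.post(
        f"{API}/repos/{owner}/{name}/pages",
        headers=HEADERS,
        json={"source": {"branch": "main", "path": "/"}},
        timeout=30,
    )
    resp.raise_for_status()
```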

## Usage

1. Set up the required Space secrets and tokens via the Spaces UI.
2. Deploy or push your code to this Space.
3. POST a task request to the /handle-task endpoint (see its documentation and the example request below).
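
For example, a task could be submitted from Python as sketched below. The Space URL and the JSON field names (`secret`, `brief`) are illustrative assumptions; adjust them to the payload your deployment actually expects.

```python
import requests

# Hypothetical payload shape; the field names are assumptions, not taken from the source.
payload = {
    "secret": "<your-shared-secret>",
    "brief": "Build a single-page to-do app and publish it via GitHub Pages",
}

resp = requests.post(
    "https://<your-space>.hf.space/handle-task",  # replace with your Space URL
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```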

## Development

See requirements.txt for dependencies.
The FastAPI entry point is app/main.py, as declared by the app_file setting above; a minimal sketch of such an entry point follows.
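
This is only a sketch of what app/main.py could look like, assuming a JSON body with `secret` and `brief` fields and a Space secret named `TASK_SECRET`; these names are assumptions for illustration, not the project's actual identifiers.

```python
import os

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="LLM Code Deployment API")


class TaskRequest(BaseModel):
    # Field names are illustrative assumptions, matching the example request above.
    secret: str
    brief: str


@app.post("/handle-task")
async def handle_task(task: TaskRequest):
    # Verify the caller's secret against the value injected via the Spaces secrets UI.
    if task.secret != os.getenv("TASK_SECRET"):
        raise HTTPException(status_code=403, detail="Invalid secret")
    # From here, trigger the generation workflow: LLM call, repo creation, Pages deploy.
    return {"status": "accepted", "brief": task.brief}
```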


For further configuration options, please visit the Spaces Configuration Reference.
