A TypeScript library for calling LLM providers' APIs in a unified way.
Updated Nov 29, 2025 - TypeScript
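A unified-provider library of this kind typically hides vendor differences behind one chat interface. The sketch below is illustrative only, assuming hypothetical names (`ChatProvider`, `complete`, `EchoProvider`) rather than this library's actual API:

```typescript
// Hypothetical unified chat interface; names are illustrative,
// not this library's real API surface.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatProvider {
  name: string;
  complete(messages: ChatMessage[]): Promise<string>;
}

// A mock adapter standing in for a real OpenAI/Anthropic/DeepSeek client.
class EchoProvider implements ChatProvider {
  name = "echo";
  async complete(messages: ChatMessage[]): Promise<string> {
    // Echo the last user message back, upper-cased, as a stand-in reply.
    return messages[messages.length - 1].content.toUpperCase();
  }
}

// Call sites depend only on ChatProvider, so swapping vendors means
// swapping the adapter, not rewriting the caller.
async function ask(provider: ChatProvider, prompt: string): Promise<string> {
  return provider.complete([{ role: "user", content: prompt }]);
}
```

The design point is that each vendor gets its own adapter implementing the shared interface, so application code never branches on the provider.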
FrontLLM is your safe front-end gateway to LLMs: query LLMs directly from your front-end code. https://frontllm.com/
🚀 Production-ready Express.js API server for OpenRouter's LLM API. Features security headers (Helmet), CORS restrictions, rate limiting, Docker support, and easy VPS deployment. Perfect starter template for building AI-powered chat applications.
Cloud-hosted MCP server giving AI agents access to DeepSeek's cost-effective LLMs (R1 & V3), with 60-90% cost savings vs. OpenAI/Claude.