OSW Studio is an AI-powered development platform where you build and maintain websites through natural language conversations.
Static sites have always been fast, cheap to host, and secure. The tradeoff was that maintaining them required technical skill. OSW Studio removes that tradeoff - describe what you want, and AI handles the implementation.
For developers: Skip the boilerplate. Rapid prototyping, full code access when you need it, and an AI that understands your project's context.
For everyone else: Finally maintain the site that was built for you. Add blog posts, update business hours, swap team photos - without filing a support ticket or hiring an agency.
What you get:
- Sandboxed agent - AI operates in a virtual file system with automatic checkpoints - explore freely, roll back anytime
- Dual AI modes - Chat (exploration, planning) + Code (full implementation)
- Multi-provider AI - OpenRouter (200+ models), OpenAI, Anthropic, Google Gemini, Groq, SambaNova, Ollama, LM Studio
- Full IDE - Monaco editor, live preview, file explorer, multi-tab support
- Templates & Skills - Reusable project templates and AI workflow guides
- Export anywhere - Download as ZIP, deploy to Vercel/Netlify/GitHub Pages
- Optional Server Mode - Self-host a multi-site publishing platform with built-in SEO, analytics, and admin dashboard
Perfect for: Business websites, landing pages, portfolios, documentation sites, blogs, marketing pages
Two modes: Browser Mode (static sites, export as ZIP) or Server Mode (self-hosted platform with databases, APIs, and publishing)
Get started in 3 steps:
```bash
# 1. Clone and install
git clone https://github.com/o-stahl/osw-studio.git
cd osw-studio
npm install

# 2. Start the development server
npm run dev
```

3. Open your browser and start building:
- ✅ Get an API key from OpenRouter, OpenAI, or run Ollama locally
- ✅ Open http://localhost:3000
- ✅ Click Settings → select a provider → enter your API key
- ✅ Create a project → describe your website
- ✅ Export as ZIP → deploy anywhere
Try the hosted version: Live Demo (no installation required)
- Monaco Editor - Full-featured code editor with syntax highlighting, IntelliSense
- Live Preview - Hot reload, instant updates as AI builds
- File Explorer - Tree view with right-click context menus
- Multi-tab Support - Work on multiple files simultaneously
- Handlebars Templates - Build reusable components with partials
- Dual Modes:
- 💬 Chat Mode - Exploration, planning, Q&A
- 🔧 Code Mode - Full implementation with file operations
- 8 LLM Providers - OpenRouter, OpenAI, Anthropic Claude, Google Gemini, Groq, SambaNova, Ollama, LM Studio
- 200+ Models - From tiny 4B tool models to SOTA frontier models
- Smart Agent - Uses shell commands, JSON patch edits, self-evaluation
- Skills System - Teach AI your workflow preferences with Anthropic-style skills
- Templates - Export/import reusable project templates (.oswt files)
- Checkpoints - Roll back to any point in the conversation with per-message restore
- Export Options - ZIP deployment packages or .osws backups (full history)
- Project Gallery - Grid/list views with screenshots, search, sorting
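The Handlebars partials mentioned above let you share markup across pages. A minimal sketch (file names and variables are illustrative, not from an actual OSW Studio project):

```handlebars
{{!-- partials/header.hbs: a reusable component --}}
<header>
  <h1>{{siteTitle}}</h1>
</header>

{{!-- index.hbs: include the partial and loop over data --}}
{{> header}}
<ul>
  {{#each posts}}
    <li><a href="{{this.url}}">{{this.title}}</a></li>
  {{/each}}
</ul>
```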
| ✅ Browser Mode | Details |
|---|---|
| Landing Pages | Marketing sites, product pages, SaaS homepages |
| Portfolios | Personal websites, photography, design portfolios |
| Documentation | Project docs, help centers, knowledge bases |
| Blogs | Static blogs with templates and navigation |
| Client-side Apps | Calculators, tools, games, interactive demos |
See Server Mode for REST APIs, databases, analytics, and more.
OSW Studio uses an agentic AI system with 3 core tools:
- Shell Tool - File system operations (`ls`, `cat`, `grep`, `find`, `mkdir`, `rm`, `mv`, `cp`, `rg`, `head`, `tail`, `tree`, `touch`, `echo >`)
- JSON Patch Tool - Precise file edits with string-based operations
- Evaluation Tool - AI self-assesses progress and decides next steps
Command validation → Execution → Checkpoint → Continue
The agent runs entirely in your browser, operating on a virtual file system (IndexedDB). You describe what you want, AI handles the implementation.
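The validate → execute → checkpoint loop can be sketched roughly as follows. All class and function names here are illustrative, not OSW Studio's actual internals:

```javascript
// Illustrative sketch of the validate -> execute -> checkpoint loop.
// Names (VirtualFS, step, ALLOWED) are hypothetical, not the real API.
const ALLOWED = new Set(["ls", "cat", "mkdir", "touch", "echo"]);

class VirtualFS {
  constructor() {
    this.files = new Map();   // path -> content
    this.checkpoints = [];    // snapshots of the file map
  }
  write(path, content) { this.files.set(path, content); }
  read(path) { return this.files.get(path); }
  checkpoint() {              // snapshot so any step can be rolled back
    this.checkpoints.push(new Map(this.files));
    return this.checkpoints.length - 1;
  }
  rollback(id) { this.files = new Map(this.checkpoints[id]); }
}

// One agent step: validate the command, apply it, then checkpoint.
function step(vfs, cmd, path, data = "") {
  if (!ALLOWED.has(cmd)) throw new Error(`blocked command: ${cmd}`);
  if (cmd === "touch" || cmd === "echo") vfs.write(path, data);
  return vfs.checkpoint();
}

const vfs = new VirtualFS();
const cp = step(vfs, "touch", "index.html", "<h1>Hello</h1>");
step(vfs, "echo", "index.html", "<h1>Oops</h1>");
vfs.rollback(cp);             // per-message restore
console.log(vfs.read("index.html")); // "<h1>Hello</h1>"
```

The key property is that every executed command produces a snapshot, so rollback is always one step away.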
- Gemini 3 - Good pricing, speed and quality, best value currently
- Haiku 4.5 - Reasonable pricing, speed and quality
- GLM 4.7, 4.6, 4.5 & 4.5 Air - Fast, reliable, and cheap; among SOTA for webdev
- Grok Code Fast 1 - Good balance of speed, quality and price
- Kimi K2 - Good balance of speed, quality and price
- gpt-oss-120b & 20b - Strong agentic capabilities
- Qwen3 series - Some models perform better than others, but functional across the board
- DeepSeek v3.2, v3.1 and R1 - Can handle most tasks, but not optimized for this use case
- Claude Sonnet 4.5 & Opus 4.5 - Good, but can rack up a large bill quickly (Gemini 3 is much better value)
- SOTA models - SOTA models generally perform well, but come at a higher price
- DeepSeek V3, Qwen2.5, Gemma3, Mistral-small, Granite 3.x, Llama4 Maverick/Scout
Rule of thumb: A 4B tool-calling model typically outperforms a 70B non-tool model for this use case. Models released after summer 2025 should work well.
Local (Free, Private):
- Ollama - Run open models locally
- LM Studio - Local models with a desktop GUI
Cloud:
- OpenRouter - 200+ models, pay-per-use
- OpenAI - GPT-4, GPT-5 series
- Anthropic - Claude 3/4 series
- Google - Gemini models
- Groq - Fast inference
- SambaNova - High-performance models
| Type | Formats | Limits |
|---|---|---|
| Code | HTML, CSS, JS/JSX, JSON, HBS/Handlebars | 5MB per file |
| Docs | TXT, MD, XML, SVG | 5MB per file |
| Media | PNG, JPG, GIF, WebP, MP4, WebM | 10MB images, 50MB video |
⚠️ Experimental: Server Mode is under active development. Some features may be unstable or change in future releases.
OSW Studio runs client-side by default (Browser Mode). For advanced use cases, enable Server Mode:
- ✅ Client-side only, no backend required
- ✅ IndexedDB storage (stays in browser)
- ✅ Deploy to Vercel, Netlify, HuggingFace
- ✅ Complete privacy
- ✅ Zero configuration
- ✅ SQLite persistence (no external database setup)
- ✅ Admin authentication (JWT + bcrypt)
- ✅ Static site publishing to `/sites/{siteId}/`
- ✅ Edge Functions - JavaScript API endpoints with database access
- ✅ Per-site SQLite databases (WAL mode) with SQL editor
- ✅ Secrets management (AES-256-GCM encrypted)
- ✅ SEO controls - Meta tags, Open Graph, Twitter Cards, auto-sitemap
- ✅ Built-in analytics (privacy-focused) or external (GA4, Plausible)
- ✅ Compliance - Cookie consent banners with GDPR/CCPA support
- ✅ Custom scripts - Inject head/body scripts, CDN resources
- ✅ Project sync (IndexedDB → SQLite)
- ✅ Custom domains via reverse proxy
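To give a feel for Edge Functions, here is a hedged sketch. The `(request, { db })` handler signature and the `db.get()` helper are assumptions for illustration, not OSW Studio's documented contract - see the Server Features guide for the real API:

```javascript
// Hypothetical Edge Function: handler signature and db helper are
// illustrative assumptions, not the platform's actual contract.
async function handler(request, { db }) {
  const url = new URL(request.url);
  if (url.pathname === "/api/visits") {
    const row = await db.get("SELECT COUNT(*) AS n FROM visits");
    return new Response(JSON.stringify({ visits: row.n }), {
      headers: { "Content-Type": "application/json" },
    });
  }
  return new Response("Not found", { status: 404 });
}

// Smoke test with a stubbed database (no server needed).
const fakeDb = { get: async () => ({ n: 42 }) };
handler(new Request("http://localhost/api/visits"), { db: fakeDb })
  .then((res) => res.json())
  .then((body) => console.log(body.visits)); // 42
```

The stubbed `db` makes the handler testable in plain Node (18+, for the built-in `Request`/`Response` globals) before deploying it to a server.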
Quick Start (Server Mode):
```bash
# 1. Configure .env
NEXT_PUBLIC_SERVER_MODE=true
SESSION_SECRET=$(openssl rand -base64 32)
ADMIN_PASSWORD=your_secure_password
SECRETS_ENCRYPTION_KEY=$(openssl rand -base64 32)

# 2. Start server (SQLite databases created automatically)
npm install && npm run dev

# 3. Access at http://localhost:3000/admin/login
```

Documentation:
- Server Mode Guide - Full setup and features
- Server Features - Edge Functions, Secrets, Database
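Custom domains work by pointing a reverse proxy at the app. A minimal nginx sketch, where the domain and port are placeholders for your own setup:

```nginx
server {
    listen 80;
    server_name example.com;   # your custom domain

    location / {
        proxy_pass http://127.0.0.1:3000;   # the OSW Studio server
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

In production you would typically add TLS (e.g. via certbot) in front of this.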
- Framework: Next.js 15.3.3, React 19, TypeScript
- UI: TailwindCSS v4, Radix UI primitives
- Editor: Monaco Editor (VS Code engine)
- Storage: IndexedDB (browser), SQLite (server mode)
- AI: 8 LLM provider integrations
- Templating: Handlebars.js for components
- Export: JSZip for deployment packages
```
/components/   # React UI components (workspace, editor, preview)
/lib/vfs/      # Virtual file system with checkpoints
/lib/llm/      # AI orchestration, tool execution, providers
/app/api/      # API routes (generation, models, validation)
/docs/         # Comprehensive documentation
```

Create `.env`:
```bash
# Log level: error, warn, info, debug (default: warn)
NEXT_PUBLIC_LOG_LEVEL=warn

# Tool streaming debug (default: 0)
NEXT_PUBLIC_DEBUG_TOOL_STREAM=0
```

- Generation fails → Check the DevTools console (F12)
- Model compatibility → Test at `/test-generation`
- Tool issues → Enable `NEXT_PUBLIC_DEBUG_TOOL_STREAM=1`
- Rate limits → Watch for toast notifications
- Local providers → Ensure Ollama/LM Studio is running
- API keys - Stored in browser `localStorage` (never sent to OSW Studio servers)
- Network calls - Direct to AI providers or via optional proxy endpoints
- Data storage - Projects stay in IndexedDB (browser mode) or SQLite (server mode)
- Complete privacy - Use Ollama/LM Studio for 100% local operation
Note: Remote LLM providers (OpenAI, Anthropic, etc.) will receive your code during generation. For complete privacy, use local models.
- No package managers - Use CDN links for libraries (unpkg, jsdelivr, cdnjs)
- Browser Mode - Static sites only, no backend (use Server Mode for APIs/databases)
OSW Studio is a solo-maintained, community-driven project. Contributions welcome!
Ways to help:
- 🐛 Report bugs
- 💡 Request features
- 🔀 Submit pull requests
- 📣 Share what you've built (open an issue or discussion!)
Built something cool? I'd love to see it! Share your creations in GitHub Discussions or open an issue with screenshots.
If OSW Studio saved you time or helped with a project, consider supporting development:
MIT License - See LICENSE file for details
Original Inspiration:
- @enzostvs & @victor - DeepSite v2 (original fork source)
- Hugging Face - Hosting platform
Technical Inspiration:
- Google AI Studio - App Builder workflow
- OpenAI Codex CLI - Agentic patterns
- Anthropic Claude - Artifact/string patch editing
Special Thanks:
- All open source contributors making projects like this possible
- The AI community for pushing boundaries
Note: OSW Studio is not affiliated with Anthropic, OpenAI, Google, Hugging Face, or other mentioned organizations. All trademarks belong to their respective owners.
