AI-generated short form content.
- T3
- AI SDK for script generation
- Remotion
- ElevenLabs
- UploadThing
- BullMQ
These are the important parts of the codebase:
```
apps
  nextjs   // front-end dashboard w/ better-auth (GitHub)
  worker   // the code responsible for video generation
packages
  db       // has the schema and BullMQ types
```

Keep in mind that you do not need to run the Next.js app to generate videos. apps/worker uses BullMQ as a Redis-backed queue for jobs, so you can create a Queue instance and push jobs to it from anywhere you want.
This project uses Infisical as a secrets manager. You can either install the CLI and set up your project on their dashboard, or use dotenv. Either way, see .env.example for the variables you need to provide.
If you have pnpm installed, install the dependencies with:
pnpm install
Then start the dev server with:
pnpm dev
A Preset is a combination of Characters with a background video and other parameters. When combined with a string prompt, the Worker is able to generate a short-form video autonomously.
You can run the following to get a quick UI for Postgres to add Characters and create your first Preset:
pnpm db:push (propagate the schema to Postgres)
pnpm db:studio (run the UI at local.drizzle.studio)
Note that a Preset must have at least two Characters, exactly one of which has the role of teacher.
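As a sketch, that constraint could be expressed like this. The `Character` and `Preset` shapes below (including the "student" role) are assumptions based on this README, not the actual schema in packages/db:

```typescript
// Hypothetical shapes -- the real ones live in packages/db.
interface Character {
  name: string;
  role: "teacher" | "student"; // role names are an assumption
}

interface Preset {
  characters: Character[];
  backgroundVideo: string;
}

// Enforces the rule: at least 2 Characters, exactly 1 teacher.
function isValidPreset(preset: Preset): boolean {
  const teachers = preset.characters.filter((c) => c.role === "teacher");
  return preset.characters.length >= 2 && teachers.length === 1;
}
```

So a preset with one teacher and one student passes, while one with two teachers (or a single Character) does not.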
Local generation is strongly recommended for performance unless you can deploy to a VPS with GPU access.
There is a Dockerfile in the root that builds an optimized image of the worker for maximum portability across environments, which is important because Remotion requires many OS-level dependencies to work.
The website, on the other hand, is a standard Next.js app and can be deployed without much additional effort. Vercel is a good place to start.
This is the product of a sprint by smrth. Contributions are fine, but it isn't my focus going forward!