Teams have hundreds of prompts. They live in Notion, Slack, and people's heads.
For most SaaS teams using LLMs, prompt management is chaos. Engineering hides prompts in code. Customer support copy-pastes from a Notion doc. Marketing iterates in a Google Sheet. Nobody knows which version is canonical, and nobody can search across them.
I built PromptVault as my own solution — a centralized prompt library with semantic search, version history, and an editor that doesn't feel like a CMS. The goal: three weeks from idea to paying customers.
17 days, end to end.
The hardest part wasn't the AI — it was making semantic search feel instant on cheap infra. I went with pgvector instead of a dedicated vector DB. Faster to wire, cheaper to run, and good enough at 12K prompts. Stripe + Supabase auth handled the boring parts so I could focus on the editor.
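Under the hood, pgvector ranks rows by cosine distance between each stored embedding and the query embedding. A minimal TypeScript sketch of that ranking step — standalone, with toy 3-dimensional vectors standing in for OpenAI's 1536-dimensional embeddings, and names (`Prompt`, `searchPrompts`) that are illustrative rather than PromptVault's actual code:

```typescript
type Prompt = { id: string; title: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Top-k prompts ranked by similarity to the query embedding — the same
// ordering pgvector's cosine-distance operator (`<=>`) produces in SQL.
function searchPrompts(prompts: Prompt[], queryEmbedding: number[], k: number): Prompt[] {
  return [...prompts]
    .sort((x, y) =>
      cosineSimilarity(y.embedding, queryEmbedding) -
      cosineSimilarity(x.embedding, queryEmbedding))
    .slice(0, k);
}

// Toy vault: 3-dim embeddings instead of 1536-dim.
const vault: Prompt[] = [
  { id: "1", title: "Summarize a ticket", embedding: [1, 0, 0] },
  { id: "2", title: "Draft a release note", embedding: [0, 1, 0] },
  { id: "3", title: "Triage a bug report", embedding: [0.9, 0.1, 0] },
];

// Prompts "1" and "3" sit closest to the query vector, so they rank first.
const results = searchPrompts(vault, [1, 0, 0], 2);
console.log(results.map(p => p.id));
```

In production the sort happens inside Postgres with an index, which is what keeps latency low without a dedicated vector database.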
Everything I built — yours to deploy.
If you grab PromptVault as a template, you get the complete codebase:

- Next.js 14 with TypeScript strict mode
- Supabase schema + RLS policies (multi-tenant ready)
- pgvector + OpenAI embeddings pipeline
- Semantic search API
- Vault editor with version history
- Stripe checkout + customer portal + webhooks
- Auth flows
- Admin dashboard
- Vercel + Supabase + Stripe deployment guide that gets you live in 30 minutes
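To make the "multi-tenant ready" part concrete: Supabase row-level security boils down to policies attached to each table. A hedged sketch of what such a setup looks like — the table and column names here (`prompts`, `team_members`, `team_id`) are illustrative, not PromptVault's actual schema:

```sql
-- Illustrative multi-tenant layout: each prompt row carries a team_id,
-- and RLS restricts reads to members of that team.
create table prompts (
  id uuid primary key default gen_random_uuid(),
  team_id uuid not null,
  title text not null,
  body text not null,
  embedding vector(1536)  -- pgvector column for OpenAI embeddings
);

alter table prompts enable row level security;

create policy "team members can read their prompts"
  on prompts for select
  using (
    team_id in (
      select team_id from team_members
      where user_id = auth.uid()
    )
  );
```

With policies like this enforced in the database, the API layer never has to remember a `where team_id = ...` clause — a forgotten filter becomes a no-op instead of a data leak.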
Live since day 17.
PromptVault is currently used by indie teams managing their LLM prompt libraries. P95 search latency stays under 150ms even with 12K+ prompts indexed. The whole stack costs under $50/month at this scale.
It's also the most direct proof of what I can build for you in three weeks. If your project is "like PromptVault but for X", the customize path is the fastest route.