Hey HN,
We built Meetily – an open-source, privacy-first AI meeting assistant for transcription, summarization, and note-taking. Unlike Otter.ai or Granola.ai, Meetily runs entirely on local hardware or self-hosted cloud infra, ensuring full data control (no cloud storage or SaaS lock-in).
Why We Built This:
Most AI meeting assistants store meeting data in the cloud, raising privacy and compliance concerns. Additionally, they come with expensive SaaS pricing ($8–$20 per user/month). We wanted a free, local-first alternative that:
- Transcribes in real time (Whisper.cpp)
- Generates AI-powered summaries: local LLMs via Ollama (most accurate with models above 32B parameters) or external APIs like Claude Sonnet, Groq, and Llama 70B (see the sketch after this list)
- Stores meeting data locally (no cloud dependencies)
- Runs on local hardware (for security & compliance)
- Is fully open-source (customizable & extensible)
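For the local path, the flow is roughly: take the Whisper transcript and ask an Ollama model for structured notes, so nothing leaves the machine. A minimal sketch (the model name, prompt, and default endpoint here are illustrative assumptions, not Meetily's exact pipeline):

    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

    def summarize_transcript(transcript: str, model: str = "llama3.1") -> str:
        """Ask a local Ollama model for meeting notes; no data leaves the machine."""
        prompt = (
            "Summarize this meeting transcript into key decisions, "
            "action items, and open questions:\n\n" + transcript
        )
        resp = requests.post(
            OLLAMA_URL,
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

Swapping in Claude or Groq is just a different HTTP call behind the same interface.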
Tech Stack
- Frontend: Tauri + Next.js
- Backend: FastAPI (a minimal endpoint sketch follows this list)
- Transcription: Whisper.cpp
- Summarization AI: local LLMs (Ollama) + API support (Claude, Groq, etc.)
- Database: SQLite + VectorDB for semantic search
- Audio capture: Rust-based, for efficiency
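To give a feel for how the pieces fit together, here's a hedged sketch of a backend endpoint that keeps everything on local disk: it accepts a transcript, summarizes it (e.g. with the Ollama helper above), and stores both in SQLite. The route name and schema are illustrative, not Meetily's actual API:

    import sqlite3
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    DB_PATH = "meetings.db"  # local file, no cloud storage involved

    class Meeting(BaseModel):
        title: str
        transcript: str

    @app.post("/meetings")
    def save_meeting(meeting: Meeting):
        # summarize_transcript is the hypothetical helper sketched earlier
        summary = summarize_transcript(meeting.transcript)
        with sqlite3.connect(DB_PATH) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS meetings "
                "(title TEXT, transcript TEXT, summary TEXT)"
            )
            conn.execute(
                "INSERT INTO meetings VALUES (?, ?, ?)",
                (meeting.title, meeting.transcript, summary),
            )
        return {"title": meeting.title, "summary": summary}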
What's next for us?
- Optimizing AI summarization for small LLMs
- Hybrid cloud support (self-hosted models for extra compute)
- Conversational search on past meetings