Voxel: A local-first AI assistant with GGUF models, voice, tools, personality, and memories

I built Voxel, a local-first AI assistant that runs from a local FastAPI backend with a browser dashboard.

It supports:

- local GGUF models
- optional API keys, stored in the OS credential vault
- Piper TTS and Whisper push-to-talk
- custom voice packs
- local tools (calculator, time, small talk)
- a local memory system
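I haven't shown Voxel's actual routing code here, but to give a feel for how local tools like calculator and time can short-circuit the model, here is a minimal keyword-based router. All names (`route`, `TOOLS`, `calculator`, `current_time`) are hypothetical, not taken from the repo:

```python
from datetime import datetime

def calculator(expr: str) -> str:
    # Evaluate simple arithmetic; a real tool would use a proper parser.
    allowed = set("0123456789+-*/(). ")
    if not set(expr) <= allowed:
        raise ValueError("unsupported expression")
    return str(eval(expr))  # eval is gated by the character whitelist above

def current_time(_: str) -> str:
    return datetime.now().strftime("%H:%M")

TOOLS = {"calc": calculator, "time": current_time}

def route(query: str) -> str:
    # Naive keyword routing; anything unmatched falls through to the LLM.
    if any(ch.isdigit() for ch in query) and any(op in query for op in "+-*/"):
        return TOOLS["calc"](query)
    if "time" in query.lower():
        return TOOLS["time"](query)
    return "(fall through to the language model)"

print(route("2 + 3 * 4"))  # → 14
```

The point of routing like this is latency: a calculator or clock answer returns in microseconds instead of a full model generation.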

v0.02 focuses on usability:

- smart setup script
- random local port selection
- low-resource mode
- debug latency panel
- TTS barge-in
- memory search
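For the random local port selection, the standard trick (and an assumption about how Voxel might do it, not a claim about its code) is to bind to port 0 and let the OS hand back a free ephemeral port:

```python
import socket

def pick_free_port() -> int:
    """Ask the OS for a free ephemeral port by binding to port 0."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))  # port 0 means "OS, pick any free port"
        return s.getsockname()[1]

port = pick_free_port()
print(f"Serving dashboard on http://127.0.0.1:{port}")
```

This avoids collisions with other local services without maintaining a hard-coded port list, at the cost of the dashboard URL changing between runs.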

Repo: https://github.com/joshuatic/voxel

I’m looking for feedback on local model routing, onboarding, low-resource hardware support, and what should come next.

Thanks, and have a good day 🙂

Note: Feedback is welcome through GitHub Issues, and accepted changes will be tracked in the changelog.