About Astra
The name Astra comes from the Latin word for stars. We believe intelligence shouldn't be locked away in cloud datacenters — it should be like starlight: within reach, for everyone.
The past few years have seen an explosion of open-source LLMs: Llama, Qwen, Gemma, DeepSeek, Phi. They're powerful, yet small enough to fit in a pocket. And llama.cpp, that open-source miracle of a C++ inference engine, runs them smoothly even on a phone CPU.
Astra wraps all of this into a simple app: install it, chat, pull more models, with no need to understand GGUF, quantization, or the command line. Most importantly, everything stays local. Your conversations belong only to you.
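Astra hides quantization from users, but the idea behind it is simple: store each weight as a small integer plus a shared scale. A toy Python sketch of symmetric low-bit quantization (illustrative only; real GGUF formats such as Q4_K use packed 4-bit integers with per-block scales, which this does not reproduce):

```python
def quantize_q4(block, levels=7):
    """Map floats onto small integers in [-levels, levels] with one shared scale.

    Toy illustration of symmetric 4-bit quantization; not llama.cpp's format.
    """
    scale = max(abs(x) for x in block) / levels or 1.0  # avoid 0 for all-zero blocks
    return [round(x / scale) for x in block], scale

def dequantize(q, scale):
    """Recover approximate floats from the integers and the shared scale."""
    return [v * scale for v in q]

weights = [0.12, -0.53, 0.88, -0.07]
q, scale = quantize_q4(weights)
restored = dequantize(q, scale)
# Every restored value is within half a quantization step of the original.
```

Multiply that saving across billions of weights and a 16-bit model shrinks to roughly a quarter of its size, which is what makes pocket-sized LLMs practical in the first place.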
What we stand for
Privacy first
Local inference is the default — not a premium feature.
Fully open-source
Standing on llama.cpp and the open model ecosystem.
Users first
No ads, no tracking, no forced accounts.
Always evolving
New models, new platforms, new capabilities — always shipping.
Tech stack
llama.cpp
The C++ inference engine that powers on-device chat.
Open models
GGUF builds of open-source LLMs such as Llama, Qwen, Gemma, DeepSeek, and Phi.
Onward, to the stars
Astra is a long-term project. If you believe in local-first and open-source, walk with us.