
gallery

GitHub Repo · Pretty sure · Google shipping it on both app stores...
https://github.com/google-ai-edge/gallery

Google's on-device LLM gallery app that actually ships working inference on phones instead of just demoing it—genuine edge ML infrastructure with a UI that doesn't feel like a wrapper.

Slop 15% · Signal 60% · Science 25%

This is a reference implementation, not original research, but it tackles a genuinely hard problem: getting LLMs fast enough on consumer hardware that latency isn't a dealbreaker. LiteRT runtime + Gemma 4 + actual privacy (fully offline inference) = legitimate utility. The feature set (thinking mode, multimodal input, audio, agent skills) shows thought beyond 'run model, get output.' The slop score isn't higher because Google is actually shipping this in production (Play Store, App Store) with benchmarking...

18,717 stars · Kotlin · 2026-04-04 · 372 days old
