
LiteLLM

GitHub Repo · Pretty sure · YC stamp + active maintenance
https://github.com/BerriAI/litellm

Unified LLM proxy that actually works — routing layer between your app and 100+ model endpoints. Solves the real problem of 'I have 5 API keys and 0 desire to write adapter code.' Not novel, but not vaporware either.
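The "routing layer" claim is easy to picture with a toy sketch: one client, a model string prefixed by provider, and a lookup that picks the endpoint. Everything here (the `PROVIDERS` table, `route`, the base URLs as routing targets) is illustrative, not LiteLLM's actual internals, which handle far more (fallbacks, load balancing, auth).

```python
# Toy model-prefix router, sketching the "one client, many providers" idea.
# Provider names and the route() helper are hypothetical examples,
# not LiteLLM's real routing code.

PROVIDERS = {
    "anthropic": "https://api.anthropic.com",
    "gemini": "https://generativelanguage.googleapis.com",
    "openai": "https://api.openai.com",
}

def route(model: str) -> tuple[str, str]:
    """Split 'provider/model' into (base_url, bare model name).
    Unprefixed model names fall through to openai."""
    provider, _, name = model.partition("/")
    if name and provider in PROVIDERS:
        return PROVIDERS[provider], name
    return PROVIDERS["openai"], model

print(route("anthropic/claude-3-haiku"))
print(route("gpt-4o"))
```

The win is that swapping models becomes a string change instead of a new SDK integration, which is exactly the "5 API keys, 0 adapter code" pitch.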

Slop 25% · Signal 70% · Science 5%

LiteLLM is a straightforward API router: normalize requests to OpenAI format, forward to Claude/Gemini/Bedrock/etc., handle auth and retries. Zero novel ML here—it's glue code elevated to a product. But glue code *solves a real problem*. Teams shipping multi-model apps genuinely need this instead of hand-rolling adapters. The proxy server + virtual keys + usage tracking is table-stakes infrastructure, not hype. README has trademark badge bloat (Y Combinator flex, CodSpeed metrics) but the Pyt...
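"Normalize requests to OpenAI format, forward to Claude/Gemini/Bedrock" is the whole product, and it's worth seeing why it isn't trivial: providers disagree on request shape. A minimal sketch of one such translation, assuming the Anthropic Messages API convention that the system prompt is a top-level field rather than a message role; the function name `to_anthropic` is made up for illustration:

```python
# Hypothetical adapter: OpenAI-style message list -> Anthropic-style payload.
# Sketches the kind of glue code LiteLLM packages up; not its real internals.

def to_anthropic(messages, model, max_tokens=1024):
    """Translate an OpenAI-style chat (system as a message role) into an
    Anthropic Messages payload (system as a top-level field)."""
    system = "".join(m["content"] for m in messages if m["role"] == "system")
    chat = [m for m in messages if m["role"] != "system"]
    payload = {"model": model, "max_tokens": max_tokens, "messages": chat}
    if system:
        payload["system"] = system
    return payload

msgs = [
    {"role": "system", "content": "Be terse."},
    {"role": "user", "content": "Hi"},
]
print(to_anthropic(msgs, "claude-3-haiku"))
```

Multiply that by 100+ endpoints, plus retries, auth, and streaming quirks, and "glue code elevated to a product" starts to look like a fair trade.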

39,136 stars · Python · 2026-03-15 · 962 days old
