What is LiteLLM?
LiteLLM is your one-stop solution for managing 100+ LLMs. Whether you're working with Azure, Gemini, Bedrock, or OpenAI, LiteLLM gives you a single point of access, tracks your spending, and handles fallbacks automatically, all through the familiar OpenAI request format. It's like having a universal remote for all your AI models!
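The "OpenAI format" here is the standard chat-completions request shape: with LiteLLM, only the `model` string changes when you switch providers. A minimal sketch in plain Python of that idea (the model names below, and the provider prefixes like `bedrock/`, are illustrative; in practice you would pass this same shape to LiteLLM's `completion()` function):

```python
# Sketch of the OpenAI chat-completions request shape that LiteLLM
# standardizes on. Switching providers means changing only `model`;
# the messages payload stays identical.
def build_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

openai_req = build_request("gpt-4o", "Hello")
bedrock_req = build_request(
    "bedrock/anthropic.claude-3-sonnet-20240229-v1:0", "Hello"
)

# Both requests differ only in the model field.
assert openai_req["messages"] == bedrock_req["messages"]
```

This is why switching providers doesn't require rewriting application code: the request and response shapes stay constant, and LiteLLM translates them to each provider's native API behind the scenes.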
What are the features of LiteLLM?
- Load Balancing & Fallbacks: Distribute traffic across deployments and automatically retry on a backup model when one is down or rate-limited.
- Spend Tracking: Know exactly where your money’s going with detailed budget tracking.
- OpenAI-Compatible API: Access 100+ LLMs using the familiar OpenAI format.
- Self-Serve Portal: Let your team manage their own keys without hassle.
- Logging & Monitoring: Log requests, responses, and usage data to destinations like S3, Datadog, and OpenTelemetry (OTEL).
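Several of these features come together in the LiteLLM proxy's config.yaml. A hedged sketch of what that can look like (the deployment name and model choices are placeholders, and exact config keys may vary by LiteLLM version):

```yaml
model_list:
  - model_name: gpt-4o                       # name clients request
    litellm_params:
      model: azure/my-gpt4o-deployment       # placeholder Azure deployment
      api_key: os.environ/AZURE_API_KEY      # read key from env var
  - model_name: gpt-4o-mini
    litellm_params:
      model: openai/gpt-4o-mini
      api_key: os.environ/OPENAI_API_KEY

router_settings:
  fallbacks:
    - gpt-4o: ["gpt-4o-mini"]                # fall back if gpt-4o fails

litellm_settings:
  success_callback: ["s3", "datadog"]        # ship usage logs to S3/Datadog
```

One config file covers routing, fallbacks, and logging, so clients only ever see the OpenAI-compatible endpoint.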
What are the use cases of LiteLLM?
- AI Developers: Easily switch between different LLMs without rewriting code.
- Teams: Manage model access and budgets across multiple users.
- Enterprises: Deploy LiteLLM for secure, scalable AI solutions.
How to use LiteLLM?
- Install LiteLLM: Get started with the open-source version or explore enterprise options.
- Set Up Virtual Keys: Control access to models by assigning virtual keys.
- Track Spending: Use the built-in tools to monitor your AI budget.
- Enable Fallbacks: Ensure your AI stays online with automatic fallback options.
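The fallback step above can be pictured with a short generic sketch. This is the pattern, not LiteLLM's actual implementation, and `stub_call` is a stand-in for a real provider call:

```python
# Generic fallback pattern: try each model in order, return the
# first success, and re-raise the last error if all of them fail.
def complete_with_fallbacks(models, call, prompt):
    last_error = None
    for model in models:
        try:
            return call(model, prompt)
        except Exception as err:  # e.g. rate limit or provider outage
            last_error = err
    raise last_error  # every model failed

# Stand-in provider call where the primary model is rate-limited:
def stub_call(model, prompt):
    if model == "gpt-4o":
        raise RuntimeError("rate limited")
    return f"{model}: ok"

result = complete_with_fallbacks(["gpt-4o", "gpt-4o-mini"], stub_call, "hi")
assert result == "gpt-4o-mini: ok"
```

In LiteLLM itself you declare fallbacks in configuration rather than writing this loop yourself; the sketch only shows the behavior you are opting into.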