LiteLLM

Proxy Server to call 100+ LLMs in the OpenAI format

★ 19.9k · AI · LLM

Homepage · Source code

Author: Berrie AI, Inc · License: MIT

Version: 1.56.4

About LiteLLM

Call all LLM APIs using the OpenAI format (Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq, etc.)
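As a sketch of what "the OpenAI format" means here, the dicts below are illustrative stand-ins, not a real API call: requests carry a list of role/content messages, and replies from every provider are normalized into the OpenAI response shape.

```python
# Illustrative stand-ins for an OpenAI-format request and response;
# a real call would go through LiteLLM with a provider API key set.
request = {
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
}

# Whatever the underlying provider returns, it is mapped into this shape:
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello!"}}
    ]
}

# So client code reads the text the same way for every provider:
text = response["choices"][0]["message"]["content"]
print(text)
```

The point of the shared shape is that swapping providers means changing only the model string, not the parsing code.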



Features

  • Translate inputs to the provider's completion, embedding, and image_generation endpoints
  • Consistent output: text responses are always available at ['choices'][0]['message']['content']
  • Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - Router
  • Set budgets & rate limits per project, API key, and model via the LiteLLM Proxy Server (LLM Gateway)
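The retry/fallback bullet above can be illustrated with a toy loop. This is a sketch only: the deployment functions are hypothetical stand-ins, and LiteLLM's actual Router has its own configuration and API.

```python
# Toy fallback across multiple "deployments" (illustrative, not the Router API).
def call_with_fallback(deployments, prompt):
    """Try each deployment in order; return the first successful reply."""
    errors = []
    for deployment in deployments:
        try:
            return deployment(prompt)
        except Exception as exc:  # a real router would also retry with backoff
            errors.append(exc)
    raise RuntimeError(f"all deployments failed: {errors}")

# Hypothetical stand-in deployments: the first always fails, the second succeeds.
def azure_deployment(prompt):
    raise TimeoutError("azure deployment unavailable")

def openai_deployment(prompt):
    return f"echo: {prompt}"

print(call_with_fallback([azure_deployment, openai_deployment], "hi"))
```

The same idea underlies routing across Azure/OpenAI deployments: callers see one function, and failures in one deployment fall through to the next.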