v1.74.15-stable
Deploy this version

Docker:

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:v1.74.15-stable
```

Pip:

```shell
pip install litellm==1.74.15.post2
```
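If you run the proxy with a config file, a minimal sketch looks like the following (the model alias and key variable here are illustrative, not specific to this release):

```yaml
# config.yaml - minimal LiteLLM proxy config (illustrative)
model_list:
  - model_name: gpt-4o              # alias clients will request
    litellm_params:
      model: openai/gpt-4o          # provider/model LiteLLM routes to
      api_key: os.environ/OPENAI_API_KEY
```

Mount it into the container and point the proxy at it, e.g. `docker run -v $(pwd)/config.yaml:/app/config.yaml ... --config /app/config.yaml`.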
Key Highlights

- User Agent Activity Tracking - Track how much usage each coding tool gets.
- Prompt Management - Use Git-Ops style prompt management with prompt templates.
- MCP Gateway: Guardrails - Support for using Guardrails with MCP servers.
- Google AI Studio Imagen4 - Support for using Imagen4 models on Google AI Studio.

User Agent Activity Tracking

This release brings support for tracking usage and costs for AI-powered coding tools like Claude Code, Roo Code, and Gemini CLI through LiteLLM. You can now track LLM cost, total tokens used, and DAU/WAU/MAU for each coding tool.

This is great for central AI Platform teams looking to measure how they are helping developer productivity.
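Attribution is keyed off the request's `User-Agent` header. As a stdlib-only sketch (the proxy URL, API key, and agent string below are placeholders, not values from this release), a request to the proxy might set it like this:

```python
import json
import urllib.request

# Hypothetical request to a local LiteLLM proxy; usage is attributed to the
# coding tool named in the User-Agent header.
body = json.dumps({
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hello"}],
}).encode()

req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",    # your LiteLLM key
        "User-Agent": "claude-code/1.0",      # identifies the coding tool
    },
)

# urllib stores header names capitalized, e.g. "User-agent"
print(req.get_header("User-agent"))  # claude-code/1.0
```

Each coding tool then shows up as its own line item in the usage dashboards.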
Prompt Management
New Models / Updated Models

New Model Support
| Provider | Model | Context Window | Input ($/1M tokens) | Output ($/1M tokens) | Cost per Image | 
|---|---|---|---|---|---|
| OpenRouter | openrouter/x-ai/grok-4 | 256k | $3 | $15 | N/A | 
| Google AI Studio | gemini/imagen-4.0-generate-001 | N/A | N/A | N/A | $0.04 | 
| Google AI Studio | gemini/imagen-4.0-ultra-generate-001 | N/A | N/A | N/A | $0.06 | 
| Google AI Studio | gemini/imagen-4.0-fast-generate-001 | N/A | N/A | N/A | $0.02 | 
| Google AI Studio | gemini/imagen-3.0-generate-002 | N/A | N/A | N/A | $0.04 | 
| Google AI Studio | gemini/imagen-3.0-generate-001 | N/A | N/A | N/A | $0.04 | 
| Google AI Studio | gemini/imagen-3.0-fast-generate-001 | N/A | N/A | N/A | $0.02 | 
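Since the image models above are billed per image rather than per token, batch cost is a straight multiplication. A small sketch using the per-image prices from the table:

```python
# Per-image prices (USD) from the table above
IMAGE_PRICE = {
    "gemini/imagen-4.0-generate-001": 0.04,
    "gemini/imagen-4.0-ultra-generate-001": 0.06,
    "gemini/imagen-4.0-fast-generate-001": 0.02,
}

def batch_cost(model: str, n_images: int) -> float:
    """Estimated spend for generating n_images with a per-image-priced model."""
    return round(IMAGE_PRICE[model] * n_images, 6)

print(batch_cost("gemini/imagen-4.0-ultra-generate-001", 10))  # 0.6
```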
Features

- Google AI Studio
    - Added Google AI Studio Imagen4 model family support - PR #13065, Get Started
- Azure OpenAI
    - Azure `api_version="preview"` support - PR #13072, Get Started
    - Password-protected certificate files support - PR #12995, Get Started
- AWS Bedrock
- OpenRouter
    - Added Grok-4 model support - PR #13018
- Anthropic
    - Auto Cache Control Injection - improved `cache_control_injection_points` with negative index support - PR #13187, Get Started
    - Working mid-stream fallbacks with token usage tracking - PR #13149, PR #13170
- Perplexity
    - Citation annotations support - PR #13225

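The negative-index support for Anthropic's `cache_control_injection_points` mentioned above suggests configs like the following sketch (the exact field placement is an assumption; check the linked Get Started docs for the authoritative schema):

```yaml
# Illustrative: inject a cache_control breakpoint on the last message
model_list:
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
      cache_control_injection_points:
        - location: message
          index: -1   # negative index: count from the end of the message list
```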
Bugs

- Gemini
    - Fix `merge_reasoning_content_in_choices` parameter issue - PR #13066, Get Started
    - Added support for using the `GOOGLE_API_KEY` environment variable for Google AI Studio - PR #12507
- vLLM/OpenAI-like
    - Fix missing `extra_headers` support for embeddings - PR #13198
 
LLM API Endpoints

Bugs

- /generateContent
- /vertex_ai (Passthrough)
    - Ensure multimodal embedding responses are logged properly - PR #13050
 
MCP Gateway

Features

- Health Check Improvements
    - Add health check endpoints for MCP servers - PR #13106
- Guardrails Integration
- Protocol & Header Support
    - Add protocol headers support - PR #13062
- URL & Namespacing
    - Improve MCP server URL validation for internal/Kubernetes URLs - PR #13099
 
Bugs
Management Endpoints / UI

Features

- Usage Analytics
- Models
- Key Management
    - Properly parse JSON options for key generation in UI - PR #12989
- Authentication
    - JWT Fields
        - Add dot notation support for all JWT fields - PR #13013
 
Bugs

- Permissions
- Models
    - Fix model reload on model update - PR #13216
- Router Settings
 
Logging / Guardrail Integrations

Features

- MLflow
    - Allow adding tags for MLflow logging requests - PR #13108
- Langfuse OTEL
    - Add comprehensive metadata support to Langfuse OpenTelemetry integration - PR #12956
- Datadog LLM Observability
    - Allow redacting message/response content for specific logging integrations - PR #13158
 
Bugs

- API Key Logging
    - Fix API key being logged inappropriately - PR #12978
- MCP Spend Tracking
    - Set default value for MCP namespace tool name in spend table - PR #12894
 
Performance / Loadbalancing / Reliability improvements

Features

- Background Health Checks
    - Allow disabling background health checks for specific deployments - PR #13186
- Database Connection Management
    - Ensure stale Prisma clients disconnect DB connections properly - PR #13140
- Jitter Improvements
    - Fix jitter calculation (jitter should be added, not multiplied) - PR #12901
 
Bugs

- Anthropic Streaming
    - Always use choice index=0 for Anthropic streaming responses - PR #12666
- Custom Auth
    - Bubble up custom exceptions properly - PR #13093
- OTEL with Managed Files
    - Fix using managed files with OTEL integration - PR #13171
 
General Proxy Improvements

Features

- Database Migration
- Infrastructure
- Helm Charts
 
Bugs

- Docker
- Database Configuration
    - Fix DB config through environment variables - PR #13111
- Logging
    - Suppress httpx logging - PR #13217
- Token Counting
    - Ignore unsupported keys like `prefix` in token counter - PR #11954
 
New Contributors

- @5731la made their first contribution in https://github.com/BerriAI/litellm/pull/12989
- @restato made their first contribution in https://github.com/BerriAI/litellm/pull/12980
- @strickvl made their first contribution in https://github.com/BerriAI/litellm/pull/12956
- @Ne0-1 made their first contribution in https://github.com/BerriAI/litellm/pull/12995
- @maxrabin made their first contribution in https://github.com/BerriAI/litellm/pull/13079
- @lvuna made their first contribution in https://github.com/BerriAI/litellm/pull/12894
- @Maximgitman made their first contribution in https://github.com/BerriAI/litellm/pull/12666
- @pathikrit made their first contribution in https://github.com/BerriAI/litellm/pull/12901
- @huetterma made their first contribution in https://github.com/BerriAI/litellm/pull/12809
- @betterthanbreakfast made their first contribution in https://github.com/BerriAI/litellm/pull/13029
- @phosae made their first contribution in https://github.com/BerriAI/litellm/pull/12606
- @sahusiddharth made their first contribution in https://github.com/BerriAI/litellm/pull/12507
- @Amit-kr26 made their first contribution in https://github.com/BerriAI/litellm/pull/11954
- @kowyo made their first contribution in https://github.com/BerriAI/litellm/pull/13172
- @AnandKhinvasara made their first contribution in https://github.com/BerriAI/litellm/pull/13187
- @unique-jakub made their first contribution in https://github.com/BerriAI/litellm/pull/13174
- @tyumentsev4 made their first contribution in https://github.com/BerriAI/litellm/pull/13134
- @aayush-malviya-acquia made their first contribution in https://github.com/BerriAI/litellm/pull/12978
- @kankute-sameer made their first contribution in https://github.com/BerriAI/litellm/pull/13225
- @AlexanderYastrebov made their first contribution in https://github.com/BerriAI/litellm/pull/13178
 

