
LiteLLM agents

This page lists every AI agent in the MeshKore directory tagged with the LiteLLM capability. Agents are sourced from public platforms (GitHub, Hugging Face, npm, PyPI, awesome-list curations, and direct submissions), normalized by the MeshKore worker, and ranked by GitHub stars. Each card links to the agent's profile with details on capabilities, framework, language, freshness, and source attribution.
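Most of the agents below wrap or extend LiteLLM's unified completion interface, which lets one call shape target many providers. As a hedged sketch: `litellm.completion` is LiteLLM's real entry point, but the model names and prompt here are illustrative, and the call itself is left unexecuted since it needs `pip install litellm` and a provider API key.

```python
def build_request(model: str, prompt: str) -> dict:
    """Assemble the provider-agnostic request shape LiteLLM expects."""
    return {
        # Illustrative model identifiers; LiteLLM routes by provider prefix,
        # e.g. "gpt-4o" (OpenAI) or "anthropic/claude-3-5-sonnet-20241022".
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# With litellm installed and a key configured, the same request
# works unchanged across providers:
#   from litellm import completion
#   resp = completion(**build_request("gpt-4o", "hello"))
#   print(resp.choices[0].message.content)
```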

49 agents in this capability · ranked by popularity

Top 49 LiteLLM agents

rigging — 410 ★

Lightweight LLM Interaction Framework

agentji— ★

Universal configuration and execution layer for AI agents

agentsilex— ★

A transparent, minimal, and hackable agent framework

airlock-llm— ★

Enterprise LLM proxy built on LiteLLM — logging, guardrails, and unified access for AI coding tools.

all-the-llms— ★

A unified interface for querying Large Language Models (LLMs) across multiple providers using LiteLLM and…

claude-ccproxy— ★

Scriptable Claude Code LiteLLM-based proxy

claude-code-llm-router— ★

Multi-LLM router MCP server for Claude Code — smart complexity routing, Claude subscription monitoring, Codex…

cloakllm— ★

AI Compliance Middleware — PII protection, tamper-evident audit logs, and EU AI Act compliance for LLM…

dspy-lm-auth— ★

Pi-style LM authentication helpers for DSPy

edgequake-litellm— ★

Drop-in LiteLLM replacement backed by Rust — same API, 10× lower latency

fast-litellm— ★

High-performance Rust acceleration for LiteLLM

iot-agent— ★

Iteration of Thought LLM Agent

kloom-litellm— ★

A LiteLLM plugin for intelligent model routing and request tracking with Kloom

langgraph-agent-toolkit— ★

Full toolkit for running an AI agent service built with LangGraph, FastAPI and Streamlit

leanllm-ai— ★

Lightweight LLM wrapper with usage tracking and label support

linkllm— ★

The unified LLM runtime — local inference, API proxy, and monitoring. A powerful alternative to Ollama +…

litelitellm— ★

Lightweight Anthropic Messages API proxy with LiteLLM-style config — load your middleware from any project

litellm-cost-tracker— ★

A lightweight Python library for tracking LLM API costs via litellm's callback system

litellm-detector— ★

Filesystem-only LiteLLM package detector with terminal UX and advisory checks.

litellm-gigachat— ★

Full-featured GigaChat API integration with LiteLLM

litellm-impacts— ★

Environmental impact metrics callback for LiteLLM

litellm-mcp— ★

MCP server giving AI agents access to 100+ LLMs through LiteLLM

litellm-retry-wrapper— ★

A robust wrapper for LiteLLM with retry logic and rate limiting

litellm-supply-chain-auditor— ★

Security auditor for LLM library supply chains - detects compromised PyPI packages

litellm-utils— ★

Lightweight Python wrapper for LiteLLM with simplified interface for 100+ AI providers. Supports text…

litellm-wzrd-momentum— ★

Velocity-aware model routing callback for LiteLLM. Routes via WZRD attention signals, earns CCM.

llm-fallbacks— ★

A comprehensive Python library for managing fallback mechanisms for Large Language Model (LLM) API calls…

llm-litellm— ★

LLM plugin for LiteLLM proxy server

llm-markdownify— ★

Convert PDFs, images to high-quality Markdown using Vision LLMs.

llm-selector— ★

A Python library for selecting the best LLM model based on user input using any LLM via LiteLLM

llm-tokencost— ★

A lightweight Python library for tracking OpenAI and Anthropic SDK costs with budget alerts

llm-usage-tracker— ★

Drop-in token + cost tracker for OpenAI / LiteLLM / Gemini with caching awareness

llmcall— ★

A lite abstraction layer for LLM calls

maxllm-gate— ★

Intelligent LLM client with built-in rate limiting. Maximizes throughput and prevents 429…

mem0-embeddings-litellm-patch— ★

Almost all known embedding model providers available via litellm patch

moonclaude— ★

Moon for Claude: run Claude Code on external LLMs via LiteLLM

prellm— ★

preLLM — One function for small LLM preprocessing before large LLM execution. Like litellm.completion() but…

py-code-agent— ★

Py Code Agent - AI Coding Assistant with LiteLLM

pydantic-ai-litellm— ★

LiteLLM model integration for Pydantic AI framework - access 100+ LLM providers through a unified interface

quilr-litellm-guardrails— ★

Quilr Guardrails Integration for LiteLLM

revenium-middleware-litellm— ★

A Python library that meters LiteLLM usage to Revenium with context-based metadata injection and framework…

rpaframework-litellm— ★

LiteLLM library for RPA Framework

selectorllm— ★

No description provided

structllm— ★

Universal Python library for Structured Outputs with any LLM provider

bedrockcache— ★

Audit and fix Anthropic prompt caching on AWS Bedrock through any abstraction stack.

blockrun-litellm— ★

LiteLLM adapter for BlockRun — call x402-paid AI models via LiteLLM (custom provider or local…

litellm-verathos— ★

LiteLLM custom provider for Verathos — verified LLM inference on Bittensor

llm-routing— ★

Multi-LLM router MCP server — smart complexity routing, budget-aware model selection, 20+ providers (Claude…

smart-llm-router— ★

Provider-agnostic LLM router. Pick the cheapest capable model per prompt with rule-based scoring. Wraps…

Browse other capabilities