A versatile API client for LLMs with support for 142+ providers.
Pitch

liter-llm is a single API client, built in Rust, for interacting with many LLM providers. With native bindings for 11 programming languages and support for 142+ providers, it gives developers building AI applications a lightweight, efficient, and secure foundation.

Description

liter-llm is a versatile, fast, and secure universal API client for large language models (LLMs). Its Rust core avoids the runtime dependency trees of traditional package management, reducing the attack surface for supply-chain vulnerabilities. With 142+ providers and 11 native language bindings, it is a comprehensive way to work with many LLMs without heavy dependency management.

Key Features:

  • Compiled Rust Core: Built from the ground up in Rust, ensuring no runtime dependency trees or potential supply chain risks, thereby enhancing security during deployment.

  • Enhanced Security for API Keys: Utilizes secrecy::SecretString, which ensures that API keys remain confidential by zeroing them on drop and redacting logs.

  • Multi-language Support: Seamlessly integrates with various programming languages including Python, TypeScript, Go, Java, Ruby, PHP, C#, Elixir, WebAssembly, and C/FFI. This eliminates issues related to implementation drift, as all bindings share the same core logic.

  • Built-in Observability: Incorporates production-grade OpenTelemetry with GenAI semantic conventions, allowing for comprehensive tracking and monitoring of API usage without the need for additional integrations.

  • Composable Middleware: Enables the integration of rate limiting, caching, cost tracking, health checks, and fallback strategies using a Tower-based architecture, allowing custom middleware to be easily composed.
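The Tower-based middleware composition described above lives in the Rust core. As a rough illustration of the layering idea only, here is a stdlib-Python sketch of composing a cache layer and a cost-tracking layer around a base call — the function names and the length-based "cost" are illustrative, not part of liter-llm's API:

```python
from functools import wraps

def with_cache(cache):
    """Outer layer: return a cached response for repeated prompts."""
    def layer(call):
        @wraps(call)
        def wrapped(prompt):
            if prompt not in cache:
                cache[prompt] = call(prompt)
            return cache[prompt]
        return wrapped
    return layer

def with_cost_tracking(costs):
    """Inner layer: record a per-call cost before delegating."""
    def layer(call):
        @wraps(call)
        def wrapped(prompt):
            costs.append(len(prompt))  # stand-in for token-based pricing
            return call(prompt)
        return wrapped
    return layer

def base_call(prompt):
    """Stand-in for the actual provider request."""
    return f"echo: {prompt}"

cache, costs = {}, []
# Compose layers around the base call, outermost first -- the Tower idea.
call = with_cache(cache)(with_cost_tracking(costs)(base_call))

print(call("hi"))   # cache miss: cost recorded, response cached
print(call("hi"))   # cache hit: no second cost entry
print(len(costs))   # 1
```

Each layer wraps the next and only sees a callable, which is what makes custom middleware easy to slot in at any point in the stack.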

Comparisons and Advantages:

Compared to alternatives such as litellm, liter-llm takes a leaner, compiled approach, with an emphasis on memory safety and performance:

Feature            | liter-llm                                | litellm
-------------------|------------------------------------------|------------------------------
Core language      | Rust (memory-safe)                       | Python
Language bindings  | 11 languages                             | Python only
Providers included | 142, compiled at build time              | 100+, with runtime resolution
Observability      | Built-in OpenTelemetry                   | Callback integrations
API key security   | secrecy::SecretString                    | Plain strings
Caching            | In-memory LRU + 40+ backends via OpenDAL | Limited options
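The in-memory LRU caching row above refers to a classic eviction policy: when the cache is full, the least recently used entry is dropped. As a generic sketch of that policy (not liter-llm's actual cache), using only Python's standard library:

```python
from collections import OrderedDict

class LruCache:
    """Minimal LRU cache: evicts the least recently used entry at capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # drop least recently used

cache = LruCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" is now most recently used
cache.put("c", 3)      # capacity exceeded: evicts "b"
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

For LLM responses, a policy like this keeps repeated identical prompts from costing a second provider round trip.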

Usage Example:

Below is a minimal usage example in Python:

import asyncio
import os

from liter_llm import LlmClient

async def main():
    # The key is wrapped in a SecretString internally, so it is zeroed on
    # drop and redacted from logs.
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",  # models are addressed as "provider/model"
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
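As the example shows, models are addressed with a "provider/model" identifier (e.g. "openai/gpt-4o"). A minimal sketch of how such an identifier could be split for routing — the function is illustrative and not part of liter-llm's API:

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a "provider/model" identifier into its two parts.

    Only the first slash separates provider from model, since the model
    part may itself contain slashes (some hub-hosted models do).
    """
    provider, sep, model = model_id.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model

print(parse_model_id("openai/gpt-4o"))
# ('openai', 'gpt-4o')
print(parse_model_id("openrouter/meta-llama/llama-3-70b"))
# ('openrouter', 'meta-llama/llama-3-70b')
```

Resolving the provider from the prefix is what lets one `chat` call target any of the supported backends.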

Architecture Overview:

The project consists of multiple crates and packages, structured to support different programming languages and functionalities. The Rust core library is complemented by wrappers for the various language bindings, ensuring a consistent API surface across all platforms.

Contribution and Community:

Contributions are encouraged, and discussions or queries can be directed to the Discord community.

For more details, explore the documentation and check out other related projects from the kreuzberg.dev team.
