
llm-connector: Rust LLM Connector

Provider abstraction for OpenAI/Anthropic/Gemini and more — streaming, tools, and multimodal.

Install

```bash
cargo add llm-connector
```

Quick Example

```rust
use llm_connector::{LlmClient, types::{ChatRequest, Message, Role}};

// `chat` is async, so an async runtime is required; tokio is used here.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = LlmClient::openai("sk-...")?;

    let request = ChatRequest {
        model: "gpt-4".to_string(),
        messages: vec![Message::text(Role::User, "Hello!")],
        ..Default::default()
    };

    let response = client.chat(&request).await?;
    println!("{}", response.content);
    Ok(())
}
```
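
Because the same `ChatRequest`/response types are used for every provider, switching backends only changes the constructor. The sketch below illustrates this; the `anthropic` constructor and the model string are assumptions that mirror the `openai` example above, so check the crate docs for the exact names.

```rust
use llm_connector::{LlmClient, types::{ChatRequest, Message, Role}};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical constructor, assumed to mirror `LlmClient::openai`.
    let client = LlmClient::anthropic("sk-ant-...")?;

    // Same request shape as the OpenAI example; only the model string changes.
    let request = ChatRequest {
        model: "claude-3-5-sonnet-latest".to_string(),
        messages: vec![Message::text(Role::User, "Hello!")],
        ..Default::default()
    };

    let response = client.chat(&request).await?;
    println!("{}", response.content);
    Ok(())
}
```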

Released under the MIT License.