Providers

llm-connector supports 12+ LLM providers with a unified interface.

Provider Overview

| Provider | Service Name | Struct | API Format |
| --- | --- | --- | --- |
| OpenAI | `openai` | `OpenAIProvider` | Native |
| Anthropic Claude | `anthropic` | `AnthropicProvider` | Native |
| Google Gemini | `google` | `GoogleProvider` | Native |
| Aliyun DashScope | `aliyun` | `AliyunProvider` | Custom |
| Zhipu GLM | `zhipu` | `ZhipuProvider` | Native/OpenAI |
| Tencent Hunyuan | `tencent` | `TencentProvider` | Native V3 |
| Volcengine | `volcengine` | `VolcengineProvider` | OpenAI Compatible |
| DeepSeek | `deepseek` | `DeepSeekProvider` | OpenAI Compatible |
| Moonshot | `moonshot` | `MoonshotProvider` | OpenAI Compatible |
| Xiaomi MiMo | `xiaomi` | `XiaomiProvider` | OpenAI Compatible |
| Ollama | `ollama` | `OllamaProvider` | Native |
| LongCat | `longcat` | `LongCatAnthropicProvider` | OpenAI/Anthropic |

OpenAI

Standard OpenAI API support.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::openai("sk-...")?;

// Or with a custom base URL
let client = LlmClient::openai_with_base_url("sk-...", "https://api.openai.com")?;
```

Anthropic Claude

Native support for Anthropic's Claude API.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::anthropic("sk-ant-...")?;
```

AWS Bedrock / Google Vertex AI

```rust
// AWS Bedrock
let client = LlmClient::anthropic_bedrock("us-east-1", "access_key", "secret_key")?;

// Google Vertex AI
let client = LlmClient::anthropic_vertex("project-id", "us-central1", "access-token")?;
```

Aliyun DashScope (Qwen)

Support for Alibaba Cloud's Qwen models.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::aliyun("sk-...")?;
```

Zhipu GLM

Support for Zhipu AI's ChatGLM models.

Usage

```rust
use llm_connector::LlmClient;

// Native SDK style
let client = LlmClient::zhipu("your-api-key")?;

// OpenAI-compatible mode (recommended for some models)
let client = LlmClient::zhipu_openai_compatible("your-api-key")?;
```

Tencent Hunyuan

Native support for Tencent Cloud API v3 signing (TC3-HMAC-SHA256), including streaming.

Usage

```rust
use llm_connector::LlmClient;

// Requires SecretId and SecretKey
let client = LlmClient::tencent("AKID...", "SecretKey...")?;
```

Volcengine

Support for Doubao models via Volcengine Ark.

Usage

```rust
use llm_connector::LlmClient;

// Uses an API key (UUID format)
let client = LlmClient::volcengine("your-api-key")?;
```

DeepSeek

Support for DeepSeek-V3 and the DeepSeek-R1 reasoning model.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::deepseek("sk-...")?;
```

Moonshot

Support for Moonshot's Kimi models.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::moonshot("sk-...")?;
```

Xiaomi MiMo

Support for Xiaomi's MiMo LLM platform.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::xiaomi("sk-...")?;
```

Ollama

Support for local models via Ollama.

Usage

```rust
use llm_connector::LlmClient;

// Default (http://localhost:11434)
let client = LlmClient::ollama()?;

// Custom URL
let client = LlmClient::ollama_with_base_url("http://192.168.1.100:11434")?;
```

Google Gemini

Support for Google's Gemini models.

Usage

```rust
use llm_connector::LlmClient;

let client = LlmClient::google("your-api-key")?;
```

Generic/Custom Providers

For any other OpenAI-compatible provider:

```rust
use llm_connector::LlmClient;

let client = LlmClient::openai_compatible(
    "api-key",
    "https://api.example.com",
    "provider-name",
)?;
```
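Providers marked "OpenAI Compatible" in the overview table accept the standard OpenAI chat-completions request shape. As a rough illustration of that shape, here is a std-only sketch that assembles such a request body by hand; `chat_body` is a hypothetical helper (not part of llm-connector), the field names follow the public OpenAI format, and real code should use a JSON library for proper escaping:

```rust
/// Build a minimal OpenAI-style chat-completions request body.
/// Illustration only: no string escaping is performed here.
fn chat_body(model: &str, user_message: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        model, user_message
    )
}

fn main() {
    // Prints: {"model":"example-model","messages":[{"role":"user","content":"Hello"}]}
    println!("{}", chat_body("example-model", "Hello"));
}
```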

Environment Variables

| Provider | Environment Variable |
| --- | --- |
| OpenAI | `OPENAI_API_KEY` |
| Anthropic | `ANTHROPIC_API_KEY` |
| Aliyun | `DASHSCOPE_API_KEY` |
| Zhipu | `ZHIPU_API_KEY` |
| DeepSeek | `DEEPSEEK_API_KEY` |
| Moonshot | `MOONSHOT_API_KEY` |
| Xiaomi | `XIAOMI_API_KEY` |
| Google | `GOOGLE_API_KEY` |
| Tencent | `TENCENT_SECRET_ID`, `TENCENT_SECRET_KEY` |
| Volcengine | `VOLCENGINE_API_KEY` |
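Since the constructors shown above take keys explicitly, a caller can fetch them from these variables with the standard library. A minimal sketch, where `provider_key` is a hypothetical helper (not part of llm-connector) that treats an empty variable the same as a missing one:

```rust
use std::env;

/// Look up a provider credential from the environment.
/// Returns None if the variable is unset or empty.
fn provider_key(var: &str) -> Option<String> {
    env::var(var).ok().filter(|v| !v.is_empty())
}

fn main() {
    match provider_key("OPENAI_API_KEY") {
        Some(key) => println!("found key of length {}", key.len()),
        None => eprintln!("OPENAI_API_KEY is not set"),
    }
}
```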

Released under the MIT License.