OpenRouter Rust SDK
A type-safe, async Rust SDK for the OpenRouter API - Access 200+ AI models with ease
What makes this special?
- Type Safety: Leverages Rust's type system for compile-time error prevention
- Async/Await: Built on `tokio` for high-performance concurrent operations
- Reasoning Tokens: First-class support for chain-of-thought reasoning tokens
- Streaming: Real-time response streaming with `futures`
- Builder Pattern: Ergonomic client and request construction
- Smart Presets: Curated model groups for programming, reasoning, and free tiers
- Complete Coverage: All OpenRouter API endpoints supported
Quick Start
Installation
Add to your Cargo.toml:
```toml
[dependencies]
openrouter-rs = "0.4.5"
tokio = { version = "1", features = ["full"] }
```
30-Second Example
```rust
use openrouter_rs::{OpenRouterClient, api::chat::*, types::Role};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create client
    let client = OpenRouterClient::builder()
        .api_key("your_api_key")
        .build()?;

    // Send chat completion
    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![
            Message::new(Role::User, "Explain Rust ownership in simple terms"),
        ])
        .build()?;

    let response = client.send_chat_completion(&request).await?;
    println!("{}", response.choices[0].content().unwrap_or(""));

    Ok(())
}
```
Key Features
Advanced Reasoning Support
Leverage chain-of-thought processing with reasoning tokens:
```rust
use openrouter_rs::types::Effort;

let request = ChatCompletionRequest::builder()
    .model("deepseek/deepseek-r1")
    .messages(vec![Message::new(Role::User, "What's bigger: 9.9 or 9.11?")])
    .reasoning_effort(Effort::High) // Enable deep reasoning
    .reasoning_max_tokens(2000)     // Control reasoning depth
    .build()?;

let response = client.send_chat_completion(&request).await?;

// Access both the reasoning trace and the final answer
println!("Reasoning: {}", response.choices[0].reasoning().unwrap_or(""));
println!("Answer: {}", response.choices[0].content().unwrap_or(""));
```
Real-time Streaming
Process responses as they arrive:
```rust
use futures_util::StreamExt;

let stream = client.stream_chat_completion(&request).await?;

stream
    .filter_map(|event| async { event.ok() })
    .for_each(|chunk| async move {
        if let Some(content) = chunk.choices[0].content() {
            print!("{}", content); // Print as it arrives
        }
    })
    .await;
```
Smart Model Presets
Use curated model collections:
```rust
use openrouter_rs::config::OpenRouterConfig;

let config = OpenRouterConfig::default();

// Three built-in presets:
// - programming: Code generation and development
// - reasoning: Advanced problem-solving models
// - free: Free-tier models for experimentation
println!("Available models: {:?}", config.get_resolved_models());
```
Comprehensive Error Handling
```rust
use openrouter_rs::error::OpenRouterError;

match client.send_chat_completion(&request).await {
    Ok(_) => println!("Success!"),
    Err(OpenRouterError::ModerationError { reasons, .. }) => {
        eprintln!("Content flagged: {:?}", reasons);
    }
    Err(OpenRouterError::ApiError { code, message }) => {
        eprintln!("API error {}: {}", code, message);
    }
    Err(e) => eprintln!("Other error: {}", e),
}
```
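Automatic retries are still on the roadmap (see below), but a simple exponential-backoff loop can be layered on top of the client today. This is only a sketch using `tokio::time::sleep` and the `send_chat_completion` call shown above, not an SDK feature:

```rust
use std::time::Duration;
use tokio::time::sleep;

// Sketch (not SDK-provided): retry the same request up to 3 extra times,
// doubling the wait between attempts.
let mut delay = Duration::from_millis(500);
let mut result = client.send_chat_completion(&request).await;

for attempt in 1..=3 {
    if result.is_ok() {
        break;
    }
    eprintln!("Attempt {} failed, retrying in {:?}", attempt, delay);
    sleep(delay).await;
    delay *= 2; // Exponential backoff
    result = client.send_chat_completion(&request).await;
}

let response = result?;
println!("{}", response.choices[0].content().unwrap_or(""));
```

In practice you would probably only retry transient failures, matching on the error variants above and skipping, for example, `ModerationError`.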
API Coverage
| Feature | Status | Module |
|---|---|---|
| Chat Completions | ✅ | `api::chat` |
| Text Completions | ✅ | `api::completion` |
| Reasoning Tokens | ✅ | `api::chat` |
| Streaming Responses | ✅ | `api::chat` |
| Model Information | ✅ | `api::models` |
| API Key Management | ✅ | `api::api_keys` |
| Credit Management | ✅ | `api::credits` |
| Authentication | ✅ | `api::auth` |
More Examples
Filter Models by Category
```rust
use openrouter_rs::types::ModelCategory;

let models = client
    .list_models_by_category(ModelCategory::Programming)
    .await?;

println!("Found {} programming models", models.len());
```
Advanced Client Configuration
```rust
let client = OpenRouterClient::builder()
    .api_key("your_key")
    .http_referer("https://yourapp.com")
    .x_title("My AI App")
    .base_url("https://openrouter.ai/api/v1") // Custom endpoint
    .build()?;
```
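Rather than hard-coding the key, you can read it from the `OPENROUTER_API_KEY` environment variable that the example programs below also use. This relies only on the standard library; the exact argument type accepted by `api_key()` is assumed here:

```rust
use std::env;

// Read the key from the environment instead of hard-coding it.
let api_key = env::var("OPENROUTER_API_KEY")
    .expect("OPENROUTER_API_KEY must be set");

let client = OpenRouterClient::builder()
    .api_key(&api_key) // adjust to the builder's expected argument type if needed
    .build()?;
```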
Streaming with Reasoning
```rust
use futures_util::StreamExt;

let stream = client.stream_chat_completion(
    &ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![Message::new(Role::User, "Solve this step by step: 2x + 5 = 13")])
        .reasoning_effort(Effort::High)
        .build()?
).await?;

// Accumulate reasoning and answer tokens separately as chunks arrive
let (reasoning_buffer, content_buffer) = stream
    .filter_map(|event| async { event.ok() })
    .fold((String::new(), String::new()), |mut buffers, chunk| async move {
        if let Some(reasoning) = chunk.choices[0].reasoning() {
            buffers.0.push_str(reasoning);
        }
        if let Some(content) = chunk.choices[0].content() {
            buffers.1.push_str(content);
            // (You could also print progress here, as in the streaming example above.)
        }
        buffers
    })
    .await;

println!("Reasoning: {}", reasoning_buffer);
println!("Answer: {}", content_buffer);
```
Documentation & Resources
- API Documentation - Complete API reference
- Examples Repository - Comprehensive usage examples
- Configuration Guide - Model presets and configuration
- OpenRouter API Docs - Official OpenRouter documentation
Run Examples Locally
```bash
# Set your API key
export OPENROUTER_API_KEY="your_key_here"

# Basic chat completion
cargo run --example send_chat_completion

# Reasoning tokens demo
cargo run --example chat_with_reasoning

# Streaming responses
cargo run --example stream_chat_completion

# Run with reasoning
cargo run --example stream_chat_with_reasoning
```
Community & Support
Found a Bug?
Please open an issue with:
- Your Rust version (`rustc --version`)
- SDK version you're using
- Minimal code example
- Expected vs actual behavior
Feature Requests
We love hearing your ideas! Start a discussion to:
- Suggest new features
- Share use cases
- Get help with implementation
Contributing
Contributions are welcome! Please see our contributing guidelines:
- Fork the repository
- Create a feature branch
- Add tests for new functionality (see the sketch after this list)
- Follow the existing code style
- Submit a pull request
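For the testing step, a minimal offline test along these lines is usually a reasonable starting point. It only exercises the request builder shown in the examples above; the test name and assertion are illustrative, not taken from the crate's actual test suite:

```rust
use openrouter_rs::{api::chat::*, types::Role};

// Illustrative unit test: build a request without touching the network.
#[test]
fn builds_a_chat_completion_request() {
    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![Message::new(Role::User, "ping")])
        .build();

    assert!(request.is_ok());
}
```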
Show Your Support
If this SDK helps your project, consider:
- Starring the repository
- Sharing on social media
- Writing about your experience
- Contributing improvements
Requirements
- Rust: 1.85+ (2024 edition)
- Tokio: 1.0+ (for async runtime)
- OpenRouter API Key: Get yours here
Roadmap
- WebSocket Support - Real-time bidirectional communication
- Retry Strategies - Automatic retry with exponential backoff
- Caching Layer - Response caching for improved performance
- CLI Tool - Command-line interface for quick testing
- Middleware System - Request/response interceptors
License
This project is licensed under the MIT License - see the LICENSE file for details.
Disclaimer
This is a third-party SDK not officially affiliated with OpenRouter. Use at your own discretion.
Release History
Version 0.4.5 (Latest)
- New: Complete reasoning tokens implementation with chain-of-thought support
- Updated: Model presets restructured into `programming`, `reasoning`, and `free` categories
- Enhanced: Professional-grade documentation with comprehensive examples
- Improved: Configuration system with better model management
Version 0.4.4
- Added: Support for listing models by supported parameters
- Note: OpenRouter API limitations on simultaneous category and parameter filtering
Version 0.4.3
- Added: Support for listing models by category
- Thanks to OpenRouter team for the API enhancement!
Made with ❤️ for the Rust community
Star us on GitHub | Find us on Crates.io | Read the Docs