llm-unify-core provides foundational types and abstractions for building applications that work with multiple Large Language Model providers. It offers a unified data model for conversations, messages, and provider-specific metadata.
- **Provider-Agnostic**: Single data model works across ChatGPT, Claude, Gemini, Copilot, and custom providers
- **Extensible**: `ProviderTrait` enables custom provider implementations
- **Serializable**: Full serde support for JSON interchange and persistence
- **Type-Safe**: Rust's type system ensures correctness at compile time
- **Minimal**: Focused core (~180 lines) without unnecessary dependencies
| Type | Purpose |
|---|---|
| `Provider` | Enum identifying LLM providers (ChatGPT, Claude, Gemini, Copilot, Other) |
| `MessageRole` | Enum for message roles (User, Assistant, System) |
| `Message` | Single message with id, role, content, timestamp, and metadata |
| `Conversation` | Container for messages with provider association and timestamps |
| `Metadata` | Flexible key-value storage using `serde_json::Value` |
| `Error` | Comprehensive error types with context preservation |
| `ProviderTrait` | Interface for implementing custom provider parsers |
```rust
pub enum Provider {
    ChatGpt,  // OpenAI ChatGPT
    Claude,   // Anthropic Claude
    Gemini,   // Google Gemini
    Copilot,  // Microsoft Copilot
    Other,    // Custom/unknown providers
}
```
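Dispatching on the variant is a plain `match`; here is a minimal sketch (the `label` helper is illustrative, not part of the crate):

```rust
use llm_unify_core::Provider;

// Illustrative helper: map each variant to a human-readable label.
fn label(provider: &Provider) -> &'static str {
    match provider {
        Provider::ChatGpt => "OpenAI ChatGPT",
        Provider::Claude => "Anthropic Claude",
        Provider::Gemini => "Google Gemini",
        Provider::Copilot => "Microsoft Copilot",
        Provider::Other => "Custom/unknown provider",
    }
}
```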
```rust
pub struct Message {
    pub id: String,
    pub conversation_id: String,
    pub role: MessageRole, // User | Assistant | System
    pub content: String,
    pub timestamp: DateTime<Utc>,
    pub metadata: Metadata,
}
```
```rust
pub struct Conversation {
    pub id: String,
    pub title: String,
    pub provider: Provider,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
    pub messages: Vec<Message>,
    pub metadata: Metadata,
}
```
```rust
pub trait ProviderTrait {
    /// Parse provider-specific export format
    fn parse(&self, data: &[u8]) -> Result<Vec<Conversation>>;

    /// Get provider name
    fn name(&self) -> &'static str;

    /// Validate conversation data
    fn validate(&self, conversation: &Conversation) -> Result<()>;
}
```

Add to your `Cargo.toml`:
```toml
[dependencies]
llm-unify-core = "0.1"
```

Or via cargo:
```bash
cargo add llm-unify-core
```

```rust
use llm_unify_core::{Conversation, Message, MessageRole, Provider, Metadata};
use chrono::Utc;
// Create a new conversation
let mut conversation = Conversation::new(
    "conv-123".into(),
    "Rust Ownership Questions".into(),
    Provider::Claude,
);
// Add a user message
let user_msg = Message {
    id: "msg-1".into(),
    conversation_id: "conv-123".into(),
    role: MessageRole::User,
    content: "What is ownership in Rust?".into(),
    timestamp: Utc::now(),
    metadata: Metadata::new(),
};
conversation.add_message(user_msg);
// Add an assistant response
let assistant_msg = Message {
    id: "msg-2".into(),
    conversation_id: "conv-123".into(),
    role: MessageRole::Assistant,
    content: "Ownership is Rust's memory management system...".into(),
    timestamp: Utc::now(),
    metadata: Metadata::new(),
};
conversation.add_message(assistant_msg);
println!("Messages: {}", conversation.message_count()); // 2use llm_unify_core::{ProviderTrait, Conversation, Result, Error};
```rust
use llm_unify_core::{ProviderTrait, Conversation, Result, Error};

struct LocalLlamaProvider;
impl ProviderTrait for LocalLlamaProvider {
    fn parse(&self, data: &[u8]) -> Result<Vec<Conversation>> {
        // Parse your provider's export format
        let json: serde_json::Value = serde_json::from_slice(data)?;
        // Transform to Conversation structs...
        Ok(vec![])
    }

    fn name(&self) -> &'static str {
        "local-llama"
    }

    fn validate(&self, conversation: &Conversation) -> Result<()> {
        if conversation.messages.is_empty() {
            return Err(Error::InvalidConversation(
                "Conversation must have at least one message".into(),
            ));
        }
        Ok(())
    }
}
```
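Using the implementation could then look like this (a sketch: `export.json` is a placeholder path, and it assumes `Error` implements `Display`, as a thiserror-derived type does):

```rust
use std::fs;

// Illustrative driver for the custom provider defined above.
let provider = LocalLlamaProvider;
let data = fs::read("export.json")?; // placeholder path
match provider.parse(&data) {
    Ok(conversations) => println!("{}: parsed {} conversations", provider.name(), conversations.len()),
    Err(Error::InvalidConversation(msg)) => eprintln!("invalid conversation: {msg}"),
    Err(other) => eprintln!("parse failed: {other}"),
}
```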
```rust
use llm_unify_core::{Conversation, Provider};

let conversation = Conversation::new(
    "conv-1".into(),
    "Test".into(),
    Provider::ChatGpt,
);
// Serialize to JSON
let json = serde_json::to_string_pretty(&conversation)?;
// Deserialize from JSON
let loaded: Conversation = serde_json::from_str(&json)?;
```
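Because these are plain serde types, persisting to disk only needs standard I/O; a sketch continuing from the snippet above (`conversation.json` is a placeholder path):

```rust
use std::fs;

// Write the pretty-printed JSON from above to a file, then read it back.
fs::write("conversation.json", &json)?;
let restored: Conversation = serde_json::from_str(&fs::read_to_string("conversation.json")?)?;
assert_eq!(restored.id, conversation.id);
```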
```rust
use llm_unify_core::Metadata;
use serde_json::json;
let mut metadata = Metadata::new();
// Store arbitrary data
metadata.insert("model".into(), json!("gpt-4"));
metadata.insert("temperature".into(), json!(0.7));
metadata.insert("tokens_used".into(), json!(1500));
// Retrieve data
if let Some(model) = metadata.get("model") {
    println!("Model: {}", model);
}
```
| Crate | Version | Purpose |
|---|---|---|
| `serde` | 1.0 | Serialization/deserialization with derive macros |
| `serde_json` | 1.0 | JSON support for `Metadata` and interchange |
| `chrono` | 0.4 | Timestamp handling with serde integration |
| `thiserror` | 2.0 | Ergonomic error type definitions |
| `anyhow` | 1.0 | Error context and propagation |
- `llm-unify` - Full application using this library
Dual licensed under MIT OR PMPL-1.0-or-later.
See LICENSE.txt for details including the Palimpsest Philosophical Overlay.