# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

`rullm` is a Rust library and CLI for interacting with Large Language Models (LLMs). It consists of two main crates:

- **rullm-core**: The core library providing a high-performance LLM client with Tower middleware for enterprise features
- **rullm-cli**: A CLI tool built on top of rullm-core for interactive LLM usage

## Architecture

### Core Library (rullm-core)
- **Providers**: Modular provider system supporting OpenAI, Anthropic, and Google AI APIs
- **Middleware**: Built on Tower for retry logic, rate limiting, circuit breakers, and timeouts
- **Dual APIs**: Simple string-based API and an advanced API with full control over parameters (see the sketch below)
- **Streaming**: Real-time token-by-token streaming support via async streams
- **Types**: Comprehensive type system for chat messages, requests, responses, and configuration
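
A minimal sketch of what the two APIs might look like in use. Only `ChatProvider`, `chat_completion`, `OpenAIProvider`, `OpenAIConfig`, and `LlmError` are names confirmed elsewhere in this file; the rest (`simple`, `ChatRequest::builder`, `from_env`, `text`) are illustrative assumptions, so check `crates/rullm-core/src/lib.rs` for the real public API:

```rust
use rullm_core::ChatProvider; // trait import assumed to be needed for the method calls below

// Hypothetical usage sketch; names not confirmed in this file are placeholders.
async fn example() -> Result<(), rullm_core::LlmError> {
    // Simple string-based API: prompt in, answer text out (function name assumed).
    let answer = rullm_core::simple("openai/gpt-4o", "What is Rust?").await?;
    println!("{answer}");

    // Advanced API: explicit provider, messages, and parameters (builder shape assumed).
    let provider = rullm_core::OpenAIProvider::new(rullm_core::OpenAIConfig::from_env()?);
    let request = rullm_core::ChatRequest::builder()
        .system("You are a helpful assistant")
        .user("Explain ownership in one paragraph")
        .temperature(0.2)
        .build();
    let response = provider.chat_completion(request).await?;
    println!("{}", response.text());
    Ok(())
}
```
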
### CLI Application (rullm-cli)
- **Commands**: Modular command structure for chat, models, aliases, keys, templates, etc.
- **Configuration**: TOML-based config management with user-defined aliases and templates
- **Interactive Chat**: Full-featured chat mode with history, slash commands, and editor integration
- **Templates**: TOML-based prompt templates with placeholder substitution (see the sketch after this list)
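
The placeholder mechanism can be illustrated with a generic sketch; this is not the CLI's actual implementation, and the `{name}` placeholder style is an assumption:

```rust
use std::collections::HashMap;

/// Generic placeholder-substitution sketch (not the CLI's real code).
/// Replaces occurrences of `{key}` in the template with the matching value.
fn render_template(template: &str, vars: &HashMap<String, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        out = out.replace(&format!("{{{key}}}"), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("topic".to_string(), "lifetimes".to_string());
    // With the assumed syntax, "Explain {topic} in Rust" becomes "Explain lifetimes in Rust".
    println!("{}", render_template("Explain {topic} in Rust", &vars));
}
```
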
## Development Commands

### Building
```bash
# Build the entire workspace
cargo build

# Build with release optimizations
cargo build --release

# Build a specific crate
cargo build -p rullm-core
cargo build -p rullm-cli
```

### Testing
```bash
# Run all tests
cargo test

# Run tests for a specific crate
cargo test -p rullm-core
cargo test -p rullm-cli

# Run integration tests (may require API keys)
cargo test --test integration
```

### Running Examples
```bash
# Core library examples (require API keys)
cargo run --example openai_simple
cargo run --example anthropic_stream
cargo run --example gemini_stream
cargo run --example test_all_providers

# CLI binary
cargo run -- "What is Rust?"
cargo run -- chat --model claude
```

### Linting and Formatting
```bash
# Check code formatting
cargo fmt --check

# Format code
cargo fmt

# Run clippy lints
cargo clippy

# Run clippy with all targets
cargo clippy --all-targets --all-features
```

## Key Patterns and Conventions

### Provider Implementation
- All providers implement the `ChatProvider` trait with `chat_completion` and `chat_completion_stream` methods (see the sketch below)
- Configuration structs follow the pattern `{Provider}Config` (e.g., `OpenAIConfig`, `AnthropicConfig`)
- Provider structs follow the pattern `{Provider}Provider` (e.g., `OpenAIProvider`, `AnthropicProvider`)
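
A hedged sketch of the trait shape implied by the bullets above; the exact signatures, the request/response types, and whether `async-trait` is used are all assumptions:

```rust
// Sketch only; see crates/rullm-core/src/providers/ for the real definitions.
// ChatRequest, ChatResponse, ChatStreamEvent, and LlmError are assumed to come from
// the crate's types and error modules.
use async_trait::async_trait;
use futures::stream::BoxStream;

#[async_trait]
pub trait ChatProvider {
    /// Single-shot completion: one request in, one full response out.
    async fn chat_completion(&self, request: ChatRequest) -> Result<ChatResponse, LlmError>;

    /// Streaming completion: yields ChatStreamEvent items as tokens arrive.
    async fn chat_completion_stream(
        &self,
        request: ChatRequest,
    ) -> Result<BoxStream<'static, ChatStreamEvent>, LlmError>;
}
```
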
### Error Handling
- All public APIs return `Result<T, LlmError>` for comprehensive error handling
- The `LlmError` enum covers authentication, rate limiting, network issues, and provider-specific errors
- Streaming APIs emit `ChatStreamEvent` enum variants: `Token`, `Done`, `Error` (sketched below)
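
The variant names `Token`, `Done`, and `Error` are stated above; the payloads and the `LlmError` variants below are assumptions that only illustrate the listed error categories:

```rust
// Sketch of the error and stream-event shapes; the real definitions live in rullm-core.
#[derive(Debug)]
pub enum LlmError {
    // Variant names and payloads are illustrative, matching the categories above.
    Authentication(String),
    RateLimited { retry_after_secs: Option<u64> },
    Network(String),
    Provider { provider: String, message: String },
}

#[derive(Debug)]
pub enum ChatStreamEvent {
    Token(String), // payload type assumed
    Done,
    Error(LlmError),
}

// Typical consumption pattern for a stream of these events:
// while let Some(event) = stream.next().await {
//     match event {
//         ChatStreamEvent::Token(t) => print!("{t}"),
//         ChatStreamEvent::Done => break,
//         ChatStreamEvent::Error(e) => eprintln!("stream error: {e:?}"),
//     }
// }
```
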
### Configuration Management
- CLI config stored in `~/.config/rullm/` (or the platform equivalent)
- Templates stored as TOML files in the `templates/` subdirectory
- Model aliases defined in `config.toml` for user convenience (see the sketch below)
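
A sketch of how an alias table in `config.toml` might be read; the field names and file layout here are assumptions rather than the CLI's confirmed schema, which lives in `crates/rullm-cli/src/config.rs`:

```rust
use serde::Deserialize;
use std::collections::HashMap;
use std::path::Path;

// Hypothetical config shape for illustration only.
#[derive(Debug, Deserialize)]
struct CliConfig {
    // e.g. an [aliases] table mapping short names to full model identifiers (layout assumed)
    #[serde(default)]
    aliases: HashMap<String, String>,
}

fn load_config(path: &Path) -> Result<CliConfig, Box<dyn std::error::Error>> {
    let raw = std::fs::read_to_string(path)?;
    let config: CliConfig = toml::from_str(&raw)?;
    Ok(config)
}
```
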
### Testing
- Unit tests co-located with implementation files
- Integration tests in `tests/` directories
- Examples serve as both documentation and integration tests
- Test helpers in `utils/test_helpers.rs` for common test patterns

## Important Files

- `crates/rullm-core/src/lib.rs` - Main library entry point and public API
- `crates/rullm-core/src/types.rs` - Core type definitions for requests/responses
- `crates/rullm-core/src/providers/` - LLM provider implementations
- `crates/rullm-cli/src/main.rs` - CLI entry point and argument parsing
- `crates/rullm-cli/src/commands/` - CLI command implementations
- `crates/rullm-cli/src/config.rs` - Configuration management

## Development Notes

- The project uses Rust 2024 edition with MSRV 1.85
- Tower middleware provides enterprise-grade reliability features
- Async/await throughout with tokio runtime
- Comprehensive error handling and observability via metrics and logging
- Shell completion support for bash, zsh, and fish