@basnijholt (Owner)
Summary

This PR introduces a Nix flake to provide a declarative and reproducible way to install and run agent-cli with all its dependencies and services.

Motivation

  • Simplifies the setup process for Nix/NixOS users
  • Provides reproducible builds and environments
  • Handles all service dependencies automatically
  • Offers both development shells and system-wide installation options

Changes

New Files

  • flake.nix - Main flake configuration with:

    • Development shell with all Python dependencies
    • NixOS module for system integration
    • Platform-aware service management script
    • Package definitions for agent-cli
  • flake.lock - Lock file for reproducible builds

  • docs/installation/flake.md - Comprehensive documentation covering:

    • Quick start guide
    • Multiple installation methods
    • GPU acceleration setup
    • Platform-specific instructions
    • Troubleshooting guide

Features

Platform Support

  • NixOS: Native systemd service integration for Ollama and Wyoming services
  • macOS: Automatic fallback to Docker containers for AI services
  • Linux: Support for both Docker and manual service management

Key Commands

# Enter development environment
nix develop

# Start all services
nix run .#start-services

# Or use from GitHub directly
nix run github:basnijholt/agent-cli#start-services

NixOS Module

Users can add agent-cli to their system configuration:

{
  services.agent-cli = {
    enable = true;
    enableOllama = true;
    enableWhisper = true;
    enablePiper = true;
    enableOpenWakeWord = true;
    enableServer = true;  # Optional API server
  };
}
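To use the module, the flake is typically wired into the system flake as an input. A minimal sketch follows; the `nixosModules.default` attribute path and the hostname are assumptions, not confirmed by this PR:

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    agent-cli.url = "github:basnijholt/agent-cli";
  };

  outputs = { nixpkgs, agent-cli, ... }: {
    # "myhost" is a placeholder hostname.
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        # Assumed module attribute name; check the flake's outputs
        # (nix flake show) for the actual path.
        agent-cli.nixosModules.default
        { services.agent-cli.enable = true; }
      ];
    };
  };
}
```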

Testing

The flake has been tested for:

  • ✅ Syntax validation (nix flake check)
  • ✅ Output evaluation (nix flake show)
  • ✅ Package building
  • ✅ Development shell creation

Notes

  • Wyoming services (wyoming.faster-whisper, wyoming.piper, wyoming.openwakeword) are NixOS-specific
  • macOS users need Docker Desktop installed for AI services
  • The flake automatically handles Python packages not available in nixpkgs by installing them via pip
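The pip fallback mentioned in the last note is commonly implemented as a dev-shell hook that installs the missing packages into a project-local virtualenv. A hedged sketch, assuming that approach; the package names and venv path are illustrative, not taken from the actual flake:

```nix
pkgs.mkShell {
  packages = [ pkgs.python3 pkgs.python3Packages.pip ];

  # Create a local venv on first entry and install packages that
  # are not packaged in nixpkgs.
  shellHook = ''
    if [ ! -d .venv ]; then
      python -m venv .venv
    fi
    source .venv/bin/activate
    # Hypothetical example packages; the real list lives in flake.nix.
    pip install --quiet wyoming
  '';
}
```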

Future Improvements

  • Add support for more Whisper models
  • Configure GPU acceleration options
  • Add Home Manager module
  • Support for additional TTS voices

This commit introduces a Nix flake that provides:

- Complete development environment with all Python dependencies
- Platform-aware service management (Docker for macOS, systemd for NixOS)
- NixOS module for system-wide integration
- Automated service startup script for Ollama and Wyoming services
- Support for running agent-cli server in the background

The flake handles platform differences:
- On NixOS: Uses native systemd services for Ollama, Wyoming (Whisper, Piper, OpenWakeWord)
- On macOS: Falls back to Docker containers for AI services
- On Linux: Can use either Docker or manual service management

Key features:
- `nix develop` - Enter development shell with all dependencies
- `nix run .#start-services` - Start all required background services
- NixOS module for declarative system configuration
- Automatic Python dependency installation via pip for packages not in nixpkgs
- GPU acceleration support (CUDA on Linux, Metal on macOS when available)

Documentation added in docs/installation/flake.md with comprehensive usage instructions.
- Remove all macOS/Darwin compatibility code
- Eliminate Docker dependencies and fallbacks
- Use native NixOS systemd services exclusively
- Simplify from 456 to 330 lines
- Add GPU acceleration options (CUDA/ROCm)
- Replace complex start-services script with simple check-services
- Fix build issues with git and runtime dependency checks

- Remove all references to start-agent-services script
- Replace with check-agent-services for status checking
- Update to reflect NixOS-only approach
- Document GPU acceleration options
- Clarify that services are managed by systemd
- Add note about NixOS-specific design

- Add configurable Ollama host and environment variables
- Add configurable Whisper model, language, and URI
- Add configurable Piper voice and URI
- Add configurable OpenWakeWord models and URI
- Extract ports dynamically from URIs for firewall rules
- Support advanced configurations like large-v3 model for Whisper
- Add detailed documentation for all new configuration options
- Show examples for Ollama, Whisper, Piper, and OpenWakeWord settings
- Include complete configuration example with advanced options
- Document support for large-v3 model and custom URIs
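Taken together, the configurable options described in these commit messages might be used like this. The option names and values below are inferred from the commit messages, not taken from the module source, so verify them against the actual flake before use:

```nix
{
  services.agent-cli = {
    enable = true;

    # All option names below are inferred and may differ
    # from the real module.
    ollama.host = "0.0.0.0";
    whisper = {
      model = "large-v3";       # advanced model mentioned in the commits
      language = "en";
      uri = "tcp://0.0.0.0:10300";
    };
    piper = {
      voice = "en_US-lessac-medium";  # illustrative voice name
      uri = "tcp://0.0.0.0:10200";
    };
    openWakeWord = {
      models = [ "ok_nabu" ];   # illustrative wake-word model
      uri = "tcp://0.0.0.0:10400";
    };
  };
}
```

Per the commit notes, the ports in these URIs would be extracted automatically for the firewall rules, so changing a URI also moves the opened port.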
