
aider-ollama

An interactive model selector for using Aider with Ollama.

Tired of typing aider --model ollama/some-long-model-name:tag every time? This simple wrapper gives you a beautiful interactive menu to select from all your available Ollama models.

Features

  • 🎯 Interactive Selection - Use arrow keys to browse, type to fuzzy-search
  • ⚡ Fast - Fetches models directly from the Ollama API
  • 🔧 Configurable - Works with custom Ollama endpoints and virtual environments
  • 📦 Zero Config - Works out of the box with sensible defaults

Prerequisites

  • Aider (pip install aider-chat)
  • Ollama running locally
  • fzf - fuzzy finder
  • jq - JSON processor
  • curl

Installing Dependencies

Ubuntu/Debian:

sudo apt-get install fzf jq curl
pip install aider-chat

macOS:

brew install fzf jq curl
pip install aider-chat

Arch Linux:

sudo pacman -S fzf jq curl
pip install aider-chat

Installation

Option 1: Quick Install (Recommended)

# Download the script
curl -o ~/.local/bin/aider-ollama https://raw.githubusercontent.com/unmodeled-tyler/aider-ollama/main/aider-ollama
chmod +x ~/.local/bin/aider-ollama

# Add to your shell config (bash)
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
echo 'export OLLAMA_API_BASE=http://localhost:11434' >> ~/.bashrc
source ~/.bashrc

Option 2: Clone the Repo

git clone https://github.com/unmodeled-tyler/aider-ollama.git
cd aider-ollama
./install.sh

Usage

Simply run:

aider-ollama

You'll see a list of all your Ollama models. Use arrow keys to navigate, type to filter, and press Enter to select.

Pass Arguments to Aider

Any arguments after aider-ollama are passed directly to aider:

# Start without git integration
aider-ollama --no-git

# Start with specific files
aider-ollama main.py utils.py

# Use architect mode
aider-ollama --architect

Create an Alias

Add this to your ~/.bashrc or ~/.zshrc for even quicker access:

alias aider='aider-ollama'

Now just type aider and you'll get the model selector!

Configuration

Environment Variables

Variable          Default                  Description
OLLAMA_API_BASE   http://localhost:11434   Ollama API endpoint
AIDER_VENV        (none)                   Path to aider's virtualenv (if aider is not in PATH)

Examples

# Use a remote Ollama server
export OLLAMA_API_BASE=http://192.168.1.100:11434

# Use aider from a specific virtualenv
export AIDER_VENV=/home/user/aider/.venv

How It Works

  1. Queries Ollama's /api/tags endpoint to list available models
  2. Displays them in an interactive fzf menu
  3. Prepends ollama/ to your selection and launches aider
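The three steps above amount to a small shell pipeline. This is a hedged sketch, not the actual script from the repo; the function names (list_models, main) are illustrative, and main is left uncalled so the flow is easy to read:

```shell
#!/usr/bin/env bash
# Sketch of the wrapper's flow: query Ollama, pick a model with fzf, launch aider.

OLLAMA_API_BASE="${OLLAMA_API_BASE:-http://localhost:11434}"

# Extract model names from the JSON returned by Ollama's /api/tags endpoint,
# which has the shape {"models":[{"name":"llama3.2:latest"}, ...]}.
list_models() {
  jq -r '.models[].name'
}

main() {
  local model
  # -fsS: fail on HTTP errors, stay quiet, but still show real errors.
  model=$(curl -fsS "$OLLAMA_API_BASE/api/tags" | list_models | fzf) || exit 1
  # Prepend the ollama/ prefix aider expects and hand over any extra arguments.
  exec aider --model "ollama/$model" "$@"
}
```

In the real wrapper you would end the script with a call like main "$@" so that extra arguments flow through to aider.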

Troubleshooting

"Could not connect to Ollama"

Make sure Ollama is running:

ollama serve

"No models found"

Pull some models first:

ollama pull llama3.2
ollama pull codellama
ollama pull deepseek-coder

"aider is not installed"

Install aider:

pip install aider-chat

Or if using a virtualenv, set AIDER_VENV:

export AIDER_VENV=/path/to/your/aider/.venv
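A wrapper can honor AIDER_VENV by preferring the virtualenv's aider binary and falling back to whatever is on PATH. This is an illustrative sketch (resolve_aider is a hypothetical helper, not necessarily how the script does it):

```shell
#!/usr/bin/env bash
# Pick the aider executable: AIDER_VENV's copy if set and runnable, else PATH.
resolve_aider() {
  if [ -n "${AIDER_VENV:-}" ] && [ -x "$AIDER_VENV/bin/aider" ]; then
    echo "$AIDER_VENV/bin/aider"
  else
    # command -v prints the path if aider is on PATH, or fails otherwise.
    command -v aider || return 1
  fi
}
```

Checking for the bin/aider entry point directly avoids having to source the virtualenv's activate script.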

Contributing

Contributions welcome! Feel free to open issues or PRs.

License

MIT License - see LICENSE

Credits

  • Aider - AI pair programming in your terminal
  • Ollama - Run LLMs locally
  • fzf - A command-line fuzzy finder
