# aider-ollama

An interactive model selector for using Aider with Ollama.

Tired of typing `aider --model ollama/some-long-model-name:tag` every time? This simple wrapper gives you a beautiful interactive menu to select from all your available Ollama models.

## Features
- 🎯 Interactive Selection - Use arrow keys to browse, type to fuzzy-search
- ⚡ Fast - Fetches models directly from Ollama API
- 🔧 Configurable - Works with custom Ollama endpoints and virtual environments
- 📦 Zero Config - Works out of the box with sensible defaults
## Installation

### Prerequisites

Ubuntu/Debian:

```bash
sudo apt-get install fzf jq curl
pip install aider-chat
```

macOS:

```bash
brew install fzf jq curl
pip install aider-chat
```

Arch Linux:

```bash
sudo pacman -S fzf jq curl
pip install aider-chat
```

### Manual install

```bash
# Download the script
curl -o ~/.local/bin/aider-ollama https://raw.githubusercontent.com/unmodeled-tyler/aider-ollama/main/aider-ollama
chmod +x ~/.local/bin/aider-ollama
# Add to your shell config (bash)
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
echo 'export OLLAMA_API_BASE=http://localhost:11434' >> ~/.bashrc
source ~/.bashrc
```

### Or use the install script

```bash
git clone https://github.com/unmodeled-tyler/aider-ollama.git
cd aider-ollama
./install.sh
```
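Either way, you can sanity-check that the wrapper and its dependencies are on your `PATH` (a generic check, nothing specific to this repo; `aider` itself may live in a virtualenv instead, see Configuration below):

```bash
# Confirm the wrapper script and its dependencies resolve on PATH.
command -v aider-ollama fzf jq curl
```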
## Usage

Simply run:

```bash
aider-ollama
```

You'll see a list of all your Ollama models. Use the arrow keys to navigate, type to filter, and press Enter to select.
Any arguments after `aider-ollama` are passed directly to aider:

```bash
# Start without git integration
aider-ollama --no-git
# Start with specific files
aider-ollama main.py utils.py
# Use architect mode
aider-ollama --architect
```

Add this to your `~/.bashrc` or `~/.zshrc` for even quicker access:

```bash
alias aider='aider-ollama'
```

Now just type `aider` and you'll get the model selector!

## Configuration
| Variable | Default | Description |
|---|---|---|
| `OLLAMA_API_BASE` | `http://localhost:11434` | Ollama API endpoint |
| `AIDER_VENV` | (none) | Path to aider's virtualenv (if aider is not on your PATH) |

```bash
# Use a remote Ollama server
export OLLAMA_API_BASE=http://192.168.1.100:11434
# Use aider from a specific virtualenv
export AIDER_VENV=/home/user/aider/.venv
```

## How It Works

- Queries Ollama's `/api/tags` endpoint to list available models
- Displays them in an interactive `fzf` menu
- Prepends `ollama/` to your selection and launches aider (sketched below)
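In shell terms, the flow amounts to something like the following. This is only a sketch, not the actual script: the `/api/tags` endpoint and the `ollama/` prefix are described above, but the variable names and the exact `AIDER_VENV` handling are illustrative assumptions.

```bash
#!/usr/bin/env bash
# Sketch of the selection flow described above (illustrative, not the real script).
set -euo pipefail

api_base="${OLLAMA_API_BASE:-http://localhost:11434}"

# Fetch model names from Ollama's /api/tags endpoint and pick one with fzf.
model="$(curl -s "$api_base/api/tags" | jq -r '.models[].name' | fzf --prompt='model> ')"

# Resolve aider: use AIDER_VENV's copy if set (assumed behavior), else rely on PATH.
aider_bin="aider"
[[ -n "${AIDER_VENV:-}" ]] && aider_bin="$AIDER_VENV/bin/aider"

# Prepend ollama/ and forward any extra arguments straight to aider.
exec "$aider_bin" --model "ollama/$model" "$@"
```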
## Troubleshooting

Make sure Ollama is running:

```bash
ollama serve
```
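If the menu comes up empty, you can query the same endpoint the script uses to confirm the API is reachable (the `jq` filter below is just one way to print the names):

```bash
# List model names straight from the Ollama API to verify connectivity.
curl -s "${OLLAMA_API_BASE:-http://localhost:11434}/api/tags" | jq -r '.models[].name'
```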
Pull some models first:

```bash
ollama pull llama3.2
ollama pull codellama
ollama pull deepseek-coder
```

Install aider:

```bash
pip install aider-chat
```

Or if using a virtualenv, set `AIDER_VENV`:

```bash
export AIDER_VENV=/path/to/your/aider/.venv
```
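If you don't have a dedicated virtualenv yet, a standard way to create one looks like this (the path is only an example; match it to whatever you set in `AIDER_VENV`):

```bash
# Create a dedicated virtualenv for aider and install aider-chat into it.
python3 -m venv ~/aider/.venv
~/aider/.venv/bin/pip install aider-chat
```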
## Contributing

Contributions welcome! Feel free to open issues or PRs.

## License

MIT License - see LICENSE