---
parent: Connecting to LLMs
nav_order: 500
---
# Ollama
Aider can connect to local Ollama models.
```
# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>
```
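Before launching aider, it can help to confirm that the server is reachable and that your model was actually pulled. A minimal check, assuming the default address used above, queries Ollama's `/api/tags` endpoint, which lists the models available locally:

```
# List the models the local Ollama server has pulled
# (assumes the default 127.0.0.1:11434 address from above)
curl http://127.0.0.1:11434/api/tags
```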
In particular, llama3:70b works well with aider:
```
ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b
```
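The Ollama server does not have to run on the same machine as aider. As a sketch, assuming a hypothetical host named `gpu-box` running Ollama, you can point `OLLAMA_API_BASE` at it instead of localhost:

```
# On the server: make Ollama listen on a reachable interface
# (OLLAMA_HOST is Ollama's own setting, not an aider option)
OLLAMA_HOST=0.0.0.0 ollama serve

# On your machine: gpu-box is a hypothetical hostname, replace with your own
export OLLAMA_API_BASE=http://gpu-box:11434
aider --model ollama/llama3:70b
```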
See the model warnings section for details on the warnings that will occur when working with models aider is not familiar with.