mirror of
https://github.com/Aider-AI/aider.git
synced 2025-05-29 16:54:59 +00:00
---
parent: Connecting to LLMs
nav_order: 500
---

# Ollama

Aider can connect to local Ollama models.

```
# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>
```
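Before launching aider, it can help to confirm that the Ollama server is actually listening at the address in `OLLAMA_API_BASE`. A minimal sketch of such a check, assuming the default local address and using Ollama's `/api/tags` endpoint (which lists the models you have pulled):

```python
import os
import urllib.request
import urllib.error

# Base URL aider will use; falls back to Ollama's standard local address
base = os.environ.get("OLLAMA_API_BASE", "http://127.0.0.1:11434")

def server_is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        # /api/tags is Ollama's model-listing endpoint
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

print(server_is_up(base))
```

If this prints `False`, check that `ollama serve` is running and that the port matches your `OLLAMA_API_BASE` setting.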

In particular, `llama3:70b` works well with aider:

```
ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b
```

See the [model warnings](warnings.html) section for information on warnings which will occur when working with models that aider is not familiar with.