Broke apart llms

This commit is contained in:
Paul Gauthier 2024-06-05 20:50:55 -07:00
parent 65f80e965f
commit c5e127d6fa
15 changed files with 444 additions and 384 deletions

---
parent: Connecting to LLMs
nav_order: 500
---
# Ollama
Aider can connect to local Ollama models.
```
# Pull the model
ollama pull <model>
# Start your ollama server
ollama serve
# In another terminal window...
pip install aider-chat
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows
aider --model ollama/<model>
```
In particular, `llama3:70b` works well with aider:
```
ollama pull llama3:70b
ollama serve
# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows
aider --model ollama/llama3:70b
```
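The `ollama/` prefix in the `--model` argument tells aider which provider to route the request through. A minimal sketch of that kind of parsing, assuming a simple `provider/model` naming convention (a hypothetical helper, not aider's actual implementation):

```python
def split_model(model: str) -> tuple[str, str]:
    # "ollama/llama3:70b" -> provider "ollama", model name "llama3:70b"
    # partition splits on the FIRST "/" only, so model names
    # containing "/" or ":" pass through intact.
    provider, _, name = model.partition("/")
    return provider, name

print(split_model("ollama/llama3:70b"))  # ('ollama', 'llama3:70b')
```

Splitting on only the first `/` matters because the remainder is the model tag Ollama itself understands, including any `:size` suffix.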
See the [model warnings](warnings.html)
section for information on the warnings aider shows
when working with models it is not familiar with.