ollama: suggest higher minimum context length

This updates the doc with the new environment variable Ollama supports since [v0.5.13](https://github.com/ollama/ollama/releases/tag/v0.5.13).
Adrian Cole 2025-03-10 15:32:53 +08:00 committed by GitHub
parent 74ecdf2d3f
commit 10d599f26a


@@ -11,8 +11,8 @@ Aider can connect to local Ollama models.
 # Pull the model
 ollama pull <model>
-# Start your ollama server
-ollama serve
+# Start your ollama server, increasing the context window to 8k tokens
+OLLAMA_CONTEXT_LENGTH=8192 ollama serve
 # In another terminal window...
 python -m pip install -U aider-chat
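The changed lines above can be used end to end roughly as follows — a sketch assuming Ollama v0.5.13 or newer, a hypothetical model name `llama3`, and aider's `ollama_chat/` model prefix:

```shell
# Raise Ollama's default context window before starting the server.
# OLLAMA_CONTEXT_LENGTH is read by `ollama serve` in v0.5.13+.
export OLLAMA_CONTEXT_LENGTH=8192
ollama serve &

# In another terminal: pull a model (name is illustrative) and
# point aider at the locally served model.
ollama pull llama3
aider --model ollama_chat/llama3
```

Setting the variable on the server process matters because the context window is a server-side property: exporting it in the terminal where aider runs has no effect on an already-started `ollama serve`.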