From 10d599f26a9b844d8982570838454f1c07a7e017 Mon Sep 17 00:00:00 2001
From: Adrian Cole <64215+codefromthecrypt@users.noreply.github.com>
Date: Mon, 10 Mar 2025 15:32:53 +0800
Subject: [PATCH] ollama: suggest higher minimum context length

This updates the doc with the new ENV variable ollama supports since
[v0.5.13](https://github.com/ollama/ollama/releases/tag/v0.5.13)
---
 aider/website/docs/llms/ollama.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/aider/website/docs/llms/ollama.md b/aider/website/docs/llms/ollama.md
index 014baa175..463dc4a3e 100644
--- a/aider/website/docs/llms/ollama.md
+++ b/aider/website/docs/llms/ollama.md
@@ -11,8 +11,8 @@ Aider can connect to local Ollama models.
 # Pull the model
 ollama pull <model>
 
-# Start your ollama server
-ollama serve
+# Start your ollama server, increasing the context window to 8k tokens
+OLLAMA_CONTEXT_LENGTH=8192 ollama serve
 
 # In another terminal window...
 python -m pip install -U aider-chat