This commit is contained in:
parent 79f32c2ebd
commit 21e96df85a
5 changed files with 81 additions and 11 deletions
@@ -44,15 +44,16 @@ setx OLLAMA_API_KEY <api-key> # Windows, restart shell after setx
[Ollama uses a 2k context window by default](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size),
which is very small for working with aider.
It also **silently** discards context that exceeds the window.
This is especially dangerous because many users don't even realize that most of their data
is being discarded by Ollama.

By default, aider sets Ollama's context window
to be large enough for each request you send plus 8k tokens for the reply.
This ensures data isn't silently discarded by Ollama.

Larger context windows may be helpful to allow larger replies from the LLM
but will use memory and increase latency.
If you'd like, you can configure a fixed-size context window instead
with an
[`.aider.model.settings.yml` file](https://aider.chat/docs/config/adv-model-settings.html#model-settings)
like this:
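(The entry below is a sketch of such a settings file; the model name is only a placeholder for whichever Ollama model you actually run, and `num_ctx` is the Ollama option that fixes the context window size, in tokens.)

```yaml
# Example .aider.model.settings.yml entry -- model name is a placeholder
- name: ollama/llama3:8b
  extra_params:
    num_ctx: 8192  # fixed context window size, in tokens
```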