mirror of https://github.com/Aider-AI/aider.git, synced 2025-06-02 02:34:59 +00:00
commit 3422718415 (parent 0b5e0a1113)
2 changed files with 7 additions and 4 deletions
@@ -119,8 +119,11 @@ setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after set
 aider --model ollama/deepseek-v3
 ```
 
-It's important to provide model settings, especially the `num_ctx` parameter.
+It's important to provide model settings, especially the `num_ctx` parameter to
+set the context window.
 Ollama uses a 2k context window by default, which is very small for working with aider.
+Larger context windows will allow you to work with larger amounts of code,
+but will use memory and increase latency.
 
 Unlike most other LLM servers, Ollama does not throw an error if you submit a request that exceeds the context window. Instead, it just silently truncates the request by discarding the “oldest” messages in the chat to make it fit within the context window.
 
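The silent-truncation behavior described in this hunk can be illustrated with a small sketch. This is an illustration of the documented behavior, not Ollama's actual code, and the word-count tokenizer is a stand-in for a real one:

```python
def truncate_to_fit(messages, num_ctx, count_tokens=lambda m: len(m.split())):
    """Drop the oldest messages until the total fits the context window."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > num_ctx:
        kept.pop(0)  # silently discard the oldest message; no error is raised
    return kept

history = ["old " * 1500, "recent " * 400, "newest " * 100]
fitted = truncate_to_fit(history, num_ctx=600)
# The 1500-token oldest message is dropped; the two newer ones (~500 tokens) fit.
```

Nothing in the return value signals that truncation happened, which is why the symptom shows up as the model "forgetting" files rather than as an error.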
@@ -137,15 +140,13 @@ Create a `.aider.model.settings.yml` file in your home directory or git project
   lazy: false
   reminder: sys
   examples_as_sys_msg: true
-  extra_params:
-    max_tokens: 8192
   cache_control: false
   caches_by_default: true
   use_system_prompt: true
   use_temperature: true
   streaming: true
   extra_params:
-    num_ctx: 8192
+    num_ctx: 8192 # How large a context window?
 ```
 
 ## Other providers
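The `num_ctx` value in the settings file above ends up in the options of the request sent to Ollama. For reference, the same override expressed as a raw request body for Ollama's `/api/chat` endpoint looks roughly like this; aider builds and sends the request for you, and the model name and message here are illustrative placeholders:

```python
import json

# Sketch of a request body for Ollama's /api/chat endpoint carrying the same
# num_ctx override as the settings file above.
payload = {
    "model": "deepseek-v3",
    "messages": [{"role": "user", "content": "hello"}],
    "options": {"num_ctx": 8192},  # mirrors extra_params.num_ctx
    "stream": False,
}
body = json.dumps(payload)
# POST `body` to http://127.0.0.1:11434/api/chat on a running Ollama server.
```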
@@ -56,6 +56,8 @@ you added to the chat.
 That's because ollama is silently discarding them because they exceed the context window.
 
 Aider sets Ollama's context window to 8k by default.
+Larger context windows will allow you to work with larger amounts of code,
+but will use memory and increase latency.
 If you would like
 a larger context window
 you can use a
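One way to gauge whether the 8k default mentioned above is enough for a given set of files is a rough character-based estimate. The 4-characters-per-token ratio is a crude assumption, not aider's real tokenizer, and the file contents are hypothetical:

```python
def rough_tokens(text):
    return len(text) // 4  # crude heuristic: roughly 4 characters per token

files = {"big_module.py": "x = 1\n" * 8000}  # hypothetical file contents
total = sum(rough_tokens(src) for src in files.values())
needs_larger_ctx = total > 8192  # compare against Ollama's 8k default
# 48000 chars -> ~12000 estimated tokens, so a larger num_ctx is warranted here.
```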