Mirror of https://github.com/Aider-AI/aider.git, synced 2025-05-31 01:35:00 +00:00

Commit e095fde27e (parent 7e511dc21f): updated docs

8 changed files with 92 additions and 70 deletions
@@ -14,6 +14,9 @@ for that model.
 Aider will use an unlimited context window and assume the model is free,
 so this is not usually a significant problem.
 
+See the docs on
+[configuring advanced model settings](/docs/config/adv-model-settings.html)
+for details on how to remove this warning.
 
 ## Did you mean?
 
@@ -12,7 +12,7 @@ Most options can also be set in an `.aider.conf.yml` file
 which can be placed in your home directory or at the root of
 your git repo.
 Or via environment variables like `AIDER_xxx`,
-as noted in the [options reference](options.html).
+as noted in the [options reference](/docs/config/options.html).
 
 Here are 3 equivalent ways of setting an option. First, via a command line switch:
 
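
The concrete example that follows in the full document lies outside this hunk's context. As an illustration only, the three equivalent forms might look like this, assuming the `--dark-mode` switch as the example option:

```
# 1. Command line switch
aider --dark-mode

# 2. Entry in .aider.conf.yml (in your home directory or git repo root):
#    dark-mode: true

# 3. Environment variable
export AIDER_DARK_MODE=true
aider
```
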
website/docs/config/adv-model-settings.md (new file, 86 lines)
@@ -0,0 +1,86 @@
---
parent: Configuration
nav_order: 950
description: Configuring advanced settings for LLMs.
---

# Advanced model settings

## Context window size and token costs

In most cases, you can safely ignore aider's warning about unknown context
window size and model costs.

But you can register context window limits and costs for models that aren't known
to aider. Create a `.aider.litellm.models.json` file in one of these locations:

- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or specify a specific file with the `--model-metadata-file <filename>` switch.

If the files above exist, they will be loaded in that order.
Files loaded last will take priority.

The JSON file should be a dictionary with an entry for each model, as follows:

```
{
    "deepseek-chat": {
        "max_tokens": 4096,
        "max_input_tokens": 32000,
        "max_output_tokens": 4096,
        "input_cost_per_token": 0.00000014,
        "output_cost_per_token": 0.00000028,
        "litellm_provider": "deepseek",
        "mode": "chat"
    }
}
```

See
[litellm's model_prices_and_context_window.json file](https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json) for more examples.
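
As a usage sketch only (the model name below is illustrative and should match the entry you registered and the provider key you have configured), aider can be pointed at the file explicitly:

```
# hypothetical invocation; adjust the model name to your registered entry
aider --model-metadata-file .aider.litellm.models.json --model deepseek/deepseek-chat
```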

## Model settings

Aider has a number of settings that control how it works with
different models.
These model settings are pre-configured for most popular models.
But it can sometimes be helpful to override them or add settings for
a model that aider doesn't know about.

To do that,
create a `.aider.models.yml` file in one of these locations:

- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or specify a specific file with the `--model-settings-file <filename>` switch.

If the files above exist, they will be loaded in that order.
Files loaded last will take priority.

The YAML file should be a list of dictionary objects, one for each model, as follows:

```
- name: "gpt-3.5-turbo"
  edit_format: "whole"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: false
  send_undo_reply: false
  accepts_images: false
  lazy: false
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
- name: "gpt-4-turbo-2024-04-09"
  edit_format: "udiff"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: true
  send_undo_reply: true
  accepts_images: true
  lazy: true
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
```

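As with the model metadata file, a hypothetical invocation that loads the settings file explicitly might look like this (model name taken from the example above):

```
# hypothetical invocation; --model-settings-file is the switch named earlier in this page
aider --model-settings-file .aider.models.yml --model gpt-4-turbo-2024-04-09
```
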

@@ -8,70 +8,3 @@ nav_order: 900
 
 {% include model-warnings.md %}
 
-
-## Adding settings for missing models
-
-You can register model settings used by aider for unknown models.
-Create a `.aider.models.yml` file in one of these locations:
-
-- Your home directory.
-- The root of your git repo.
-- The current directory where you launch aider.
-- Or specify a specific file with the `--model-settings-file <filename>` switch.
-
-If the files above exist, they will be loaded in that order.
-Files loaded last will take priority.
-
-The YAML file should be a list of dictionary objects, one for each model, as follows:
-
-```
-- name: "gpt-3.5-turbo"
-  edit_format: "whole"
-  weak_model_name: "gpt-3.5-turbo"
-  use_repo_map: false
-  send_undo_reply: false
-  accepts_images: false
-  lazy: false
-  reminder_as_sys_msg: true
-  examples_as_sys_msg: false
-- name: "gpt-4-turbo-2024-04-09"
-  edit_format: "udiff"
-  weak_model_name: "gpt-3.5-turbo"
-  use_repo_map: true
-  send_undo_reply: true
-  accepts_images: true
-  lazy: true
-  reminder_as_sys_msg: true
-  examples_as_sys_msg: false
-```
-
-## Specifying context window size and token costs
-
-You can register context window limits and costs for models that aren't known
-to aider. Create a `.aider.litellm.models.json` file in one of these locations:
-
-- Your home directory.
-- The root of your git repo.
-- The current directory where you launch aider.
-- Or specify a specific file with the `--model-metadata-file <filename>` switch.
-
-If the files above exist, they will be loaded in that order.
-Files loaded last will take priority.
-
-The JSON file should be a dictionary with an entry for each model, as follows:
-
-```
-{
-    "deepseek-chat": {
-        "max_tokens": 4096,
-        "max_input_tokens": 32000,
-        "max_output_tokens": 4096,
-        "input_cost_per_token": 0.00000014,
-        "output_cost_per_token": 0.00000028,
-        "litellm_provider": "deepseek",
-        "mode": "chat"
-    }
-}
-```
-
-See
-[litellm's model_prices_and_context_window.json file](https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json) for more examples.

@@ -21,8 +21,8 @@ In these cases, here are some things you might try.
 
 ## Use a capable model
 
-If possible try using GPT-4o or Opus, as they are the strongest and most
-capable models.
+If possible try using GPT-4o, Claude 3.5 Sonnet or Claude 3 Opus,
+as they are the strongest and most capable models.
 
 Weaker models
 are more prone to
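
For example, a specific model can be selected with the generic `--model` switch (the exact model identifiers available depend on your configured providers and API keys):

```
# model name shown for illustration
aider --model gpt-4o
```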