updated docs

Paul Gauthier 2024-06-27 10:10:36 -07:00
parent 7e511dc21f
commit e095fde27e
8 changed files with 92 additions and 70 deletions

View file

@@ -12,7 +12,7 @@ Most options can also be set in an `.aider.conf.yml` file
which can be placed in your home directory or at the root of
your git repo.
Or via environment variables like `AIDER_xxx`,
as noted in the [options reference](options.html).
as noted in the [options reference](/docs/config/options.html).
Here are 3 equivalent ways of setting an option. First, via a command line switch:
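For instance, a sketch using `--dark-mode` as the illustrative option (any option from the reference works the same way):

```
aider --dark-mode
```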

View file

@@ -0,0 +1,86 @@
---
parent: Configuration
nav_order: 950
description: Configuring advanced settings for LLMs.
---
# Advanced model settings
## Context window size and token costs
In most cases, you can safely ignore aider's warning about unknown context
window size and model costs.
But you can register context window limits and costs for models that aren't known
to aider. Create a `.aider.litellm.models.json` file in one of these locations:
- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or a file you specify with the `--model-metadata-file <filename>` switch.
If the files above exist, they will be loaded in that order.
Files loaded last will take priority.
The json file should be a dictionary with an entry for each model, as follows:
```
{
    "deepseek-chat": {
        "max_tokens": 4096,
        "max_input_tokens": 32000,
        "max_output_tokens": 4096,
        "input_cost_per_token": 0.00000014,
        "output_cost_per_token": 0.00000028,
        "litellm_provider": "deepseek",
        "mode": "chat"
    }
}
```
See
[litellm's model_prices_and_context_window.json file](https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json) for more examples.
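With a metadata file in one of the default locations (or passed explicitly), launching aider might look like the following sketch; the `deepseek/deepseek-chat` model name is illustrative and assumes litellm's `provider/model` naming:

```
aider --model-metadata-file .aider.litellm.models.json --model deepseek/deepseek-chat
```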
## Model settings
Aider has a number of settings that control how it works with
different models.
These model settings are pre-configured for most popular models.
But it can sometimes be helpful to override them or add settings for
a model that aider doesn't know about.
To do that,
create a `.aider.models.yml` file in one of these locations:
- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or a file you specify with the `--model-settings-file <filename>` switch.
If the files above exist, they will be loaded in that order.
Files loaded last will take priority.
The yaml file should be a list of dictionary objects, one for each model, as follows:
```
- name: "gpt-3.5-turbo"
  edit_format: "whole"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: false
  send_undo_reply: false
  accepts_images: false
  lazy: false
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
- name: "gpt-4-turbo-2024-04-09"
  edit_format: "udiff"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: true
  send_undo_reply: true
  accepts_images: true
  lazy: true
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
```
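As with the metadata file, a settings file in one of the default locations is loaded automatically, or it can be passed explicitly; a sketch, reusing the example model above:

```
aider --model-settings-file .aider.models.yml --model gpt-4-turbo-2024-04-09
```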

View file

@@ -8,70 +8,3 @@ nav_order: 900
{% include model-warnings.md %}
## Adding settings for missing models
You can register model settings used by aider for unknown models.
Create a `.aider.models.yml` file in one of these locations:
- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or a file you specify with the `--model-settings-file <filename>` switch.
If the files above exist, they will be loaded in that order.
Files loaded last will take priority.
The yaml file should be a list of dictionary objects, one for each model, as follows:
```
- name: "gpt-3.5-turbo"
  edit_format: "whole"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: false
  send_undo_reply: false
  accepts_images: false
  lazy: false
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
- name: "gpt-4-turbo-2024-04-09"
  edit_format: "udiff"
  weak_model_name: "gpt-3.5-turbo"
  use_repo_map: true
  send_undo_reply: true
  accepts_images: true
  lazy: true
  reminder_as_sys_msg: true
  examples_as_sys_msg: false
```
## Specifying context window size and token costs
You can register context window limits and costs for models that aren't known
to aider. Create a `.aider.litellm.models.json` file in one of these locations:
- Your home directory.
- The root of your git repo.
- The current directory where you launch aider.
- Or a file you specify with the `--model-metadata-file <filename>` switch.
If the files above exist, they will be loaded in that order.
Files loaded last will take priority.
The json file should be a dictionary with an entry for each model, as follows:
```
{
    "deepseek-chat": {
        "max_tokens": 4096,
        "max_input_tokens": 32000,
        "max_output_tokens": 4096,
        "input_cost_per_token": 0.00000014,
        "output_cost_per_token": 0.00000028,
        "litellm_provider": "deepseek",
        "mode": "chat"
    }
}
```
See
[litellm's model_prices_and_context_window.json file](https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json) for more examples.

View file

@@ -21,8 +21,8 @@ In these cases, here are some things you might try.
## Use a capable model
If possible try using GPT-4o or Opus, as they are the strongest and most
capable models.
If possible try using GPT-4o, Claude 3.5 Sonnet or Claude 3 Opus,
as they are the strongest and most capable models.
Weaker models
are more prone to