commit 076db26854
parent e7fdce0b75
Author: Paul Gauthier
Date: 2024-08-19 15:59:03 -07:00

8 changed files with 23 additions and 12 deletions


@@ -14,6 +14,7 @@
- Bugfix: properly load `.aider.models.metadata.json` data.
- Bugfix: Using `--msg /ask ...` caused an exception.
- Bugfix: litellm tokenizer bug for images.
- Aider wrote 55% of the code in this release.
### Aider v0.50.1


@@ -18,8 +18,18 @@ cog.out(text)
### main branch
- Prompt caching for Anthropic models with `--cache-prompts`.
- Caches the system prompt, repo map and `/read-only` files.
- Repo map recomputes less often in large/mono repos or when caching enabled.
- Use `--map-refresh <always|files|manual|auto>` to configure.
- Improved cost estimate logic for caching.
- Improved editing performance on Jupyter Notebook `.ipynb` files.
- Work around litellm tokenizer bug for images.
- Show which config yaml file is loaded with `--verbose`.
- Bumped dependency versions.
- Bugfix: properly load `.aider.models.metadata.json` data.
- Bugfix: Using `--msg /ask ...` caused an exception.
- Bugfix: litellm tokenizer bug for images.
- Aider wrote 55% of the code in this release.
### Aider v0.50.1
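The new options described in the changelog above can also be set in aider's YAML config file. A minimal sketch (option names are taken from the sample-config hunks later in this commit; the values are illustrative, not defaults):

```yaml
# Sketch of a .aider.conf.yml enabling the new prompt-caching features.
cache-prompts: true   # cache the system prompt, repo map and /read-only files
map-refresh: files    # one of: always, files, manual, auto
map-tokens: 1024      # repo map token budget (0 disables the map)
```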


@@ -92,10 +92,10 @@
## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
#map-tokens:
-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
#map-refresh: auto
-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
#cache-prompts: false
## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.


@@ -96,10 +96,10 @@
## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
#AIDER_MAP_TOKENS=
-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
#AIDER_MAP_REFRESH=auto
-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
#AIDER_CACHE_PROMPTS=false
## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.


@@ -131,10 +131,10 @@ cog.outl("```")
## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
#map-tokens:
-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
#map-refresh: auto
-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
#cache-prompts: false
## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.


@@ -138,10 +138,10 @@ cog.outl("```")
## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
#AIDER_MAP_TOKENS=
-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
#AIDER_MAP_REFRESH=auto
-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
#AIDER_CACHE_PROMPTS=false
## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.


@@ -192,12 +192,12 @@ Max number of tokens to use for repo map, use 0 to disable (default: 1024)
Environment variable: `AIDER_MAP_TOKENS`
### `--map-refresh VALUE`
-Control when the repo map is refreshed (default: auto)
+Control how often the repo map is refreshed (default: auto)
Default: auto
Environment variable: `AIDER_MAP_REFRESH`
### `--cache-prompts`
-Enable caching of prompts (forces map_refresh='files') (default: False)
+Enable caching of prompts (default: False)
Default: False
Environment variable: `AIDER_CACHE_PROMPTS`
Aliases:
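The same settings can be supplied on the command line or through the environment. A usage sketch, using only the flag and environment-variable names documented in the hunks above:

```shell
# Enable prompt caching and refresh the repo map only when files change.
aider --cache-prompts --map-refresh files

# Equivalent via environment variables:
export AIDER_CACHE_PROMPTS=true
export AIDER_MAP_REFRESH=files
aider
```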


@@ -33,7 +33,7 @@ cog.out(get_help_md())
| **/model** | Switch to a new LLM |
| **/models** | Search the list of available models |
| **/quit** | Exit the application |
-| **/read** | Add a file to the chat that is for reference, not to be edited |
+| **/read-only** | Add a file to the chat that is for reference, not to be edited |
| **/run** | Run a shell command and optionally add the output to the chat (alias: !) |
| **/test** | Run a shell command and add the output to the chat on non-zero exit code |
| **/tokens** | Report on the number of tokens used by the current chat context |