Merge branch 'main' into shell-commands-new

Paul Gauthier 2024-08-20 17:33:30 -07:00
commit 18d6260c44
10 changed files with 41 additions and 23 deletions


@@ -1 +1 @@
-__version__ = "0.51.1-dev"
+__version__ = "0.51.2-dev"


@@ -98,6 +98,9 @@
 ## Enable caching of prompts (default: False)
 #cache-prompts: false
+## Multiplier for map tokens when no files are specified (default: 2)
+#map-multiplier-no-files: true
 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
 #max-chat-history-tokens:
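For context, the hunk above adds the new option to aider's sample YAML config file. A minimal sketch of a `.aider.conf.yml` that sets it explicitly (the value 4 is hypothetical; the documented default is 2):

```yaml
# Hypothetical .aider.conf.yml fragment: raise the repo-map token
# multiplier that applies when no files have been added to the chat.
map-multiplier-no-files: 4
```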


@@ -102,6 +102,9 @@
 ## Enable caching of prompts (default: False)
 #AIDER_CACHE_PROMPTS=false
+## Multiplier for map tokens when no files are specified (default: 2)
+#AIDER_MAP_MULTIPLIER_NO_FILES=true
 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
 #AIDER_MAX_CHAT_HISTORY_TOKENS=
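The environment-variable form above can be set in a shell profile or a `.env` file instead of passing the flag on every invocation; a minimal sketch (the value 4 is hypothetical):

```shell
# Hypothetical: override the default multiplier of 2 via the
# environment rather than the command line or the YAML config.
export AIDER_MAP_MULTIPLIER_NO_FILES=4
```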


@@ -137,6 +137,9 @@ cog.outl("```")
 ## Enable caching of prompts (default: False)
 #cache-prompts: false
+## Multiplier for map tokens when no files are specified (default: 2)
+#map-multiplier-no-files: true
 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
 #max-chat-history-tokens:


@@ -144,6 +144,9 @@ cog.outl("```")
 ## Enable caching of prompts (default: False)
 #AIDER_CACHE_PROMPTS=false
+## Multiplier for map tokens when no files are specified (default: 2)
+#AIDER_MAP_MULTIPLIER_NO_FILES=true
 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
 #AIDER_MAX_CHAT_HISTORY_TOKENS=


@@ -36,8 +36,9 @@ usage: aider [-h] [--openai-api-key] [--anthropic-api-key] [--model]
 [--show-model-warnings | --no-show-model-warnings]
 [--map-tokens] [--map-refresh]
 [--cache-prompts | --no-cache-prompts]
-[--max-chat-history-tokens] [--env-file]
-[--input-history-file] [--chat-history-file]
+[--map-multiplier-no-files] [--max-chat-history-tokens]
+[--env-file] [--input-history-file]
+[--chat-history-file]
 [--restore-chat-history | --no-restore-chat-history]
 [--llm-history-file] [--dark-mode] [--light-mode]
 [--pretty | --no-pretty] [--stream | --no-stream]
@@ -204,6 +205,11 @@ Aliases:
 - `--cache-prompts`
 - `--no-cache-prompts`
+### `--map-multiplier-no-files VALUE`
+Multiplier for map tokens when no files are specified (default: 2)
+Default: 2
+Environment variable: `AIDER_MAP_MULTIPLIER_NO_FILES`
 ### `--max-chat-history-tokens VALUE`
 Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
 Environment variable: `AIDER_MAX_CHAT_HISTORY_TOKENS`
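As a rough sketch of what the multiplier does: when no files have been added to the chat, the repo-map token budget is scaled by this factor. Assuming a map budget of 1024 tokens (a hypothetical figure, not stated in this diff) and the default multiplier of 2:

```shell
# Illustration only: effective repo-map token budget when the chat
# has no files, assuming --map-tokens 1024 and the default multiplier 2.
map_tokens=1024
multiplier=2
echo $((map_tokens * multiplier))  # prints 2048
```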


@@ -14,7 +14,7 @@ This usually happens because the LLM is disobeying the system prompts
 and trying to make edits in a format that aider doesn't expect.
 Aider makes every effort to get the LLM
 to conform, and works hard to deal with
-LLMM edits that are "almost" correctly formatted.
+LLM edits that are "almost" correctly formatted.
 But sometimes the LLM just won't cooperate.
 In these cases, here are some things you might try.
@@ -42,14 +42,14 @@ Models: claude-3-5-sonnet-20240620 with ♾️ diff edit format
 ## Reduce distractions
-Many LLM now have very large context windows,
+Many LLMs now have very large context windows,
 but filling them with irrelevant code or conversation
-can cofuse the model.
+can confuse the model.
 - Don't add too many files to the chat, *just* add the files you think need to be edited.
 Aider also sends the LLM a [map of your entire git repo](https://aider.chat/docs/repomap.html), so other relevant code will be included automatically.
-- Use `/drop` to remove files from the chat session which aren't needed for the task at hand. This will reduce distractions and may help GPT produce properly formatted edits.
-- Use `/clear` to remove the conversation history, again to help GPT focus.
+- Use `/drop` to remove files from the chat session which aren't needed for the task at hand. This will reduce distractions and may help the LLM produce properly formatted edits.
+- Use `/clear` to remove the conversation history, again to help the LLM focus.
 ## More help