mirror of https://github.com/Aider-AI/aider.git
synced 2025-06-12 07:35:00 +00:00

This commit is contained in:
parent e7fdce0b75
commit 076db26854

8 changed files with 23 additions and 12 deletions
@@ -131,10 +131,10 @@ cog.outl("```")
 ## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
 #map-tokens:

-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
 #map-refresh: auto

-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
 #cache-prompts: false

 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
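The settings touched by this hunk live in aider's YAML config file. A minimal sketch of writing them, uncommented, to a local `.aider.conf.yml` (the file name comes from the aider docs; the values 2048/files/true are purely illustrative, not recommendations):

```shell
# Write a sample aider config enabling the three settings from the hunk above.
cat > .aider.conf.yml <<'EOF'
map-tokens: 2048
map-refresh: files
cache-prompts: true
EOF

# Each setting is one "key: value" line, so counting ':' lines sanity-checks it.
grep -c ':' .aider.conf.yml   # prints 3
```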
@@ -138,10 +138,10 @@ cog.outl("```")
 ## Max number of tokens to use for repo map, use 0 to disable (default: 1024)
 #AIDER_MAP_TOKENS=

-## Control when the repo map is refreshed (default: auto)
+## Control how often the repo map is refreshed (default: auto)
 #AIDER_MAP_REFRESH=auto

-## Enable caching of prompts (forces map_refresh='files') (default: False)
+## Enable caching of prompts (default: False)
 #AIDER_CACHE_PROMPTS=false

 ## Maximum number of tokens to use for chat history. If not specified, uses the model's max_chat_history_tokens.
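The same settings can be supplied through the `AIDER_*` environment variables shown in this hunk instead of the YAML file; a small sketch (variable names from the hunk above, values illustrative):

```shell
# Environment-variable equivalents of the YAML settings.
export AIDER_MAP_TOKENS=2048
export AIDER_MAP_REFRESH=files
export AIDER_CACHE_PROMPTS=true

# Confirm the value is visible to child processes such as aider.
echo "$AIDER_MAP_REFRESH"   # prints: files
```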
@@ -192,12 +192,12 @@ Max number of tokens to use for repo map, use 0 to disable (default: 1024)
 Environment variable: `AIDER_MAP_TOKENS`

 ### `--map-refresh VALUE`
-Control when the repo map is refreshed (default: auto)
+Control how often the repo map is refreshed (default: auto)
 Default: auto
 Environment variable: `AIDER_MAP_REFRESH`

 ### `--cache-prompts`
-Enable caching of prompts (forces map_refresh='files') (default: False)
+Enable caching of prompts (default: False)
 Default: False
 Environment variable: `AIDER_CACHE_PROMPTS`
 Aliases:
@@ -33,7 +33,7 @@ cog.out(get_help_md())
 | **/model** | Switch to a new LLM |
 | **/models** | Search the list of available models |
 | **/quit** | Exit the application |
-| **/read** | Add a file to the chat that is for reference, not to be edited |
+| **/read-only** | Add a file to the chat that is for reference, not to be edited |
 | **/run** | Run a shell command and optionally add the output to the chat (alias: !) |
 | **/test** | Run a shell command and add the output to the chat on non-zero exit code |
 | **/tokens** | Report on the number of tokens used by the current chat context |