Mirror of https://github.com/Aider-AI/aider.git, synced 2025-06-06 20:54:59 +00:00

Compare commits

No commits in common. "main" and "v0.83.1.dev" have entirely different histories.

main ... v0.83.1.dev

61 changed files with 1500 additions and 3522 deletions

37  HISTORY.md
@@ -1,33 +1,6 @@
 # Release history

-### Aider v0.84.0
+### main branch

-- Added support for new Claude models including the Sonnet 4 and Opus 4 series (e.g., `claude-sonnet-4-20250514`,
-`claude-opus-4-20250514`) across various providers. The default `sonnet` and `opus` aliases were updated to these newer
-versions.
-- Added support for the `vertex_ai/gemini-2.5-flash-preview-05-20` model.
-- Fixed OpenRouter token cost calculation for improved accuracy.
-- Updated default OpenRouter models during onboarding to `deepseek/deepseek-r1:free` for the free tier and
-`anthropic/claude-sonnet-4` for paid tiers.
-- Automatically refresh GitHub Copilot tokens when used as OpenAI API keys, by Lih Chen.
-- Aider wrote 79% of the code in this release.
-
-### Aider v0.83.2
-
-- Bumped configargparse to 1.7.1 as 1.7 was pulled.
-- Added shell tab completion for file path arguments (by saviour) and for `--edit-format`/`--editor-edit-format` options.
-- Improved OpenRouter model metadata handling by introducing a local cache, increasing reliability and performance.
-- The `/settings` command now displays detailed metadata for active main, editor, and weak models.
-- Fixed an issue where files explicitly added via the command line were not correctly ignored if listed in `.gitignore`.
-- Improved automatic commit messages by providing more context during their generation, by wangboxue.
-
-### Aider v0.83.1
-
-- Improved user language detection by correctly normalizing hyphenated language codes (e.g., `en-US` to `en`) and enhancing the validation of locale results.
-- Prevented Aider from instructing the LLM to reply in 'C' or 'POSIX' when these are detected as the system locale.
-- Displayed a spinner with the model name when generating commit messages.
-
-### Aider v0.83.0
-
 - Added support for `gemini-2.5-pro-preview-05-06` models.
 - Added support for `qwen3-235b` models.

@@ -431,7 +404,7 @@ versions.
 - [Aider works with LLM web chat UIs](https://aider.chat/docs/usage/copypaste.html).
 - New `--copy-paste` mode.
 - New `/copy-context` command.
-- [Set API keys and other environment variables for all providers from command line or YAML conf file](https://aider.chat/docs/config/aider_conf.html#storing-llm-keys).
+- [Set API keys and other environment variables for all providers from command line or yaml conf file](https://aider.chat/docs/config/aider_conf.html#storing-llm-keys).
 - New `--api-key provider=key` setting.
 - New `--set-env VAR=value` setting.
 - Added bash and zsh support to `--watch-files`.

@@ -599,7 +572,7 @@ versions.

 ### Aider v0.59.1

-- Check for obsolete `yes: true` in YAML config, show helpful error.
+- Check for obsolete `yes: true` in yaml config, show helpful error.
 - Model settings for openrouter/anthropic/claude-3.5-sonnet:beta

 ### Aider v0.59.0

@@ -609,7 +582,7 @@ versions.
 - Still auto-completes the full paths of the repo files like `/add`.
 - Now supports globs like `src/**/*.py`
 - Renamed `--yes` to `--yes-always`.
-- Now uses `AIDER_YES_ALWAYS` env var and `yes-always:` YAML key.
+- Now uses `AIDER_YES_ALWAYS` env var and `yes-always:` yaml key.
 - Existing YAML and .env files will need to be updated.
 - Can still abbreviate to `--yes` on the command line.
 - Config file now uses standard YAML list syntax with ` - list entries`, one per line.

@@ -816,7 +789,7 @@ versions.
 - Use `--map-refresh <always|files|manual|auto>` to configure.
 - Improved cost estimate logic for caching.
 - Improved editing performance on Jupyter Notebook `.ipynb` files.
-- Show which config YAML file is loaded with `--verbose`.
+- Show which config yaml file is loaded with `--verbose`.
 - Bumped dependency versions.
 - Bugfix: properly load `.aider.models.metadata.json` data.
 - Bugfix: Using `--msg /ask ...` caused an exception.
75  README.md

@@ -27,13 +27,13 @@ cog.out(text)
 <a href="https://github.com/Aider-AI/aider/stargazers"><img alt="GitHub Stars" title="Total number of GitHub stars the Aider project has received"
 src="https://img.shields.io/github/stars/Aider-AI/aider?style=flat-square&logo=github&color=f1c40f&labelColor=555555"/></a>
 <a href="https://pypi.org/project/aider-chat/"><img alt="PyPI Downloads" title="Total number of installations via pip from PyPI"
-src="https://img.shields.io/badge/📦%20Installs-2.5M-2ecc71?style=flat-square&labelColor=555555"/></a>
+src="https://img.shields.io/badge/📦%20Installs-2.2M-2ecc71?style=flat-square&labelColor=555555"/></a>
 <img alt="Tokens per week" title="Number of tokens processed weekly by Aider users"
 src="https://img.shields.io/badge/📈%20Tokens%2Fweek-15B-3498db?style=flat-square&labelColor=555555"/>
 <a href="https://openrouter.ai/#options-menu"><img alt="OpenRouter Ranking" title="Aider's ranking among applications on the OpenRouter platform"
 src="https://img.shields.io/badge/🏆%20OpenRouter-Top%2020-9b59b6?style=flat-square&labelColor=555555"/></a>
 <a href="https://aider.chat/HISTORY.html"><img alt="Singularity" title="Percentage of the new code in Aider's last release written by Aider itself"
-src="https://img.shields.io/badge/🔄%20Singularity-79%25-e74c3c?style=flat-square&labelColor=555555"/></a>
+src="https://img.shields.io/badge/🔄%20Singularity-92%25-e74c3c?style=flat-square&labelColor=555555"/></a>
 <!--[[[end]]]-->
 </p>

@@ -136,44 +136,43 @@ See the [installation instructions](https://aider.chat/docs/install.html) and [u
 - [LLM Leaderboards](https://aider.chat/docs/leaderboards/)
 - [GitHub Repository](https://github.com/Aider-AI/aider)
 - [Discord Community](https://discord.gg/Y7X7bhMQFV)
-- [Release notes](https://aider.chat/HISTORY.html)
 - [Blog](https://aider.chat/blog/)

 ## Kind Words From Users

-- *"My life has changed... Aider... It's going to rock your world."* — [Eric S. Raymond on X](https://x.com/esrtweet/status/1910809356381413593)
+- *"My life has changed... There's finally an AI coding tool that's good enough to keep up with me... Aider... It's going to rock your world."* — [Eric S. Raymond](https://x.com/esrtweet/status/1910809356381413593)
-- *"The best free open source AI coding assistant."* — [IndyDevDan on YouTube](https://youtu.be/YALpX8oOn78)
+- *"The best free open source AI coding assistant."* — [IndyDevDan](https://youtu.be/YALpX8oOn78)
-- *"The best AI coding assistant so far."* — [Matthew Berman on YouTube](https://www.youtube.com/watch?v=df8afeb1FY8)
+- *"The best AI coding assistant so far."* — [Matthew Berman](https://www.youtube.com/watch?v=df8afeb1FY8)
-- *"Aider ... has easily quadrupled my coding productivity."* — [SOLAR_FIELDS on Hacker News](https://news.ycombinator.com/item?id=36212100)
+- *"Aider ... has easily quadrupled my coding productivity."* — [SOLAR_FIELDS](https://news.ycombinator.com/item?id=36212100)
-- *"It's a cool workflow... Aider's ergonomics are perfect for me."* — [qup on Hacker News](https://news.ycombinator.com/item?id=38185326)
+- *"It's a cool workflow... Aider's ergonomics are perfect for me."* — [qup](https://news.ycombinator.com/item?id=38185326)
-- *"It's really like having your senior developer live right in your Git repo - truly amazing!"* — [rappster on GitHub](https://github.com/Aider-AI/aider/issues/124)
+- *"It's really like having your senior developer live right in your Git repo - truly amazing!"* — [rappster](https://github.com/Aider-AI/aider/issues/124)
-- *"What an amazing tool. It's incredible."* — [valyagolev on GitHub](https://github.com/Aider-AI/aider/issues/6#issue-1722897858)
+- *"What an amazing tool. It's incredible."* — [valyagolev](https://github.com/Aider-AI/aider/issues/6#issue-1722897858)
-- *"Aider is such an astounding thing!"* — [cgrothaus on GitHub](https://github.com/Aider-AI/aider/issues/82#issuecomment-1631876700)
+- *"Aider is such an astounding thing!"* — [cgrothaus](https://github.com/Aider-AI/aider/issues/82#issuecomment-1631876700)
-- *"It was WAY faster than I would be getting off the ground and making the first few working versions."* — [Daniel Feldman on X](https://twitter.com/d_feldman/status/1662295077387923456)
+- *"It was WAY faster than I would be getting off the ground and making the first few working versions."* — [Daniel Feldman](https://twitter.com/d_feldman/status/1662295077387923456)
-- *"THANK YOU for Aider! It really feels like a glimpse into the future of coding."* — [derwiki on Hacker News](https://news.ycombinator.com/item?id=38205643)
+- *"THANK YOU for Aider! It really feels like a glimpse into the future of coding."* — [derwiki](https://news.ycombinator.com/item?id=38205643)
-- *"It's just amazing. It is freeing me to do things I felt were out my comfort zone before."* — [Dougie on Discord](https://discord.com/channels/1131200896827654144/1174002618058678323/1174084556257775656)
+- *"It's just amazing. It is freeing me to do things I felt were out my comfort zone before."* — [Dougie](https://discord.com/channels/1131200896827654144/1174002618058678323/1174084556257775656)
-- *"This project is stellar."* — [funkytaco on GitHub](https://github.com/Aider-AI/aider/issues/112#issuecomment-1637429008)
+- *"This project is stellar."* — [funkytaco](https://github.com/Aider-AI/aider/issues/112#issuecomment-1637429008)
-- *"Amazing project, definitely the best AI coding assistant I've used."* — [joshuavial on GitHub](https://github.com/Aider-AI/aider/issues/84)
+- *"Amazing project, definitely the best AI coding assistant I've used."* — [joshuavial](https://github.com/Aider-AI/aider/issues/84)
-- *"I absolutely love using Aider ... It makes software development feel so much lighter as an experience."* — [principalideal0 on Discord](https://discord.com/channels/1131200896827654144/1133421607499595858/1229689636012691468)
+- *"I absolutely love using Aider ... It makes software development feel so much lighter as an experience."* — [principalideal0](https://discord.com/channels/1131200896827654144/1133421607499595858/1229689636012691468)
-- *"I have been recovering from ... surgeries ... aider ... has allowed me to continue productivity."* — [codeninja on Reddit](https://www.reddit.com/r/OpenAI/s/nmNwkHy1zG)
+- *"I have been recovering from multiple shoulder surgeries ... and have used aider extensively. It has allowed me to continue productivity."* — [codeninja](https://www.reddit.com/r/OpenAI/s/nmNwkHy1zG)
-- *"I am an aider addict. I'm getting so much more work done, but in less time."* — [dandandan on Discord](https://discord.com/channels/1131200896827654144/1131200896827654149/1135913253483069470)
+- *"I am an aider addict. I'm getting so much more work done, but in less time."* — [dandandan](https://discord.com/channels/1131200896827654144/1131200896827654149/1135913253483069470)
-- *"Aider... blows everything else out of the water hands down, there's no competition whatsoever."* — [SystemSculpt on Discord](https://discord.com/channels/1131200896827654144/1131200896827654149/1178736602797846548)
+- *"After wasting $100 on tokens trying to find something better, I'm back to Aider. It blows everything else out of the water hands down, there's no competition whatsoever."* — [SystemSculpt](https://discord.com/channels/1131200896827654144/1131200896827654149/1178736602797846548)
-- *"Aider is amazing, coupled with Sonnet 3.5 it's quite mind blowing."* — [Josh Dingus on Discord](https://discord.com/channels/1131200896827654144/1133060684540813372/1262374225298198548)
+- *"Aider is amazing, coupled with Sonnet 3.5 it's quite mind blowing."* — [Josh Dingus](https://discord.com/channels/1131200896827654144/1133060684540813372/1262374225298198548)
-- *"Hands down, this is the best AI coding assistant tool so far."* — [IndyDevDan on YouTube](https://www.youtube.com/watch?v=MPYFPvxfGZs)
+- *"Hands down, this is the best AI coding assistant tool so far."* — [IndyDevDan](https://www.youtube.com/watch?v=MPYFPvxfGZs)
-- *"[Aider] changed my daily coding workflows. It's mind-blowing how ...(it)... can change your life."* — [maledorak on Discord](https://discord.com/channels/1131200896827654144/1131200896827654149/1258453375620747264)
+- *"[Aider] changed my daily coding workflows. It's mind-blowing how a single Python application can change your life."* — [maledorak](https://discord.com/channels/1131200896827654144/1131200896827654149/1258453375620747264)
-- *"Best agent for actual dev work in existing codebases."* — [Nick Dobos on X](https://twitter.com/NickADobos/status/1690408967963652097?s=20)
+- *"Best agent for actual dev work in existing codebases."* — [Nick Dobos](https://twitter.com/NickADobos/status/1690408967963652097?s=20)
-- *"One of my favorite pieces of software. Blazing trails on new paradigms!"* — [Chris Wall on X](https://x.com/chris65536/status/1905053299251798432)
+- *"One of my favorite pieces of software. Blazing trails on new paradigms!"* — [Chris Wall](https://x.com/chris65536/status/1905053299251798432)
-- *"Aider has been revolutionary for me and my work."* — [Starry Hope on X](https://x.com/starryhopeblog/status/1904985812137132056)
+- *"Aider has been revolutionary for me and my work."* — [Starry Hope](https://x.com/starryhopeblog/status/1904985812137132056)
-- *"Try aider! One of the best ways to vibe code."* — [Chris Wall on X](https://x.com/Chris65536/status/1905053418961391929)
+- *"Try aider! One of the best ways to vibe code."* — [Chris Wall](https://x.com/Chris65536/status/1905053418961391929)
-- *"Aider is hands down the best. And it's free and opensource."* — [AriyaSavakaLurker on Reddit](https://www.reddit.com/r/ChatGPTCoding/comments/1ik16y6/whats_your_take_on_aider/mbip39n/)
+- *"Aider is hands down the best. And it's free and opensource."* — [AriyaSavakaLurker](https://www.reddit.com/r/ChatGPTCoding/comments/1ik16y6/whats_your_take_on_aider/mbip39n/)
-- *"Aider is also my best friend."* — [jzn21 on Reddit](https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27dcnb/)
+- *"Aider is also my best friend."* — [jzn21](https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27dcnb/)
-- *"Try Aider, it's worth it."* — [jorgejhms on Reddit](https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27cp99/)
+- *"Try Aider, it's worth it."* — [jorgejhms](https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27cp99/)
-- *"I like aider :)"* — [Chenwei Cui on X](https://x.com/ccui42/status/1904965344999145698)
+- *"I like aider :)"* — [Chenwei Cui](https://x.com/ccui42/status/1904965344999145698)
-- *"Aider is the precision tool of LLM code gen... Minimal, thoughtful and capable of surgical changes ... while keeping the developer in control."* — [Reilly Sweetland on X](https://x.com/rsweetland/status/1904963807237259586)
+- *"Aider is the precision tool of LLM code gen... Minimal, thoughtful and capable of surgical changes to your codebase all while keeping the developer in control."* — [Reilly Sweetland](https://x.com/rsweetland/status/1904963807237259586)
-- *"Cannot believe aider vibe coded a 650 LOC feature across service and cli today in 1 shot."* - [autopoietist on Discord](https://discord.com/channels/1131200896827654144/1131200896827654149/1355675042259796101)
+- *"Cannot believe aider vibe coded a 650 LOC feature across service and cli today in 1 shot."* - [autopoietist](https://discord.com/channels/1131200896827654144/1131200896827654149/1355675042259796101)
-- *"Oh no the secret is out! Yes, Aider is the best coding tool around. I highly, highly recommend it to anyone."* — [Joshua D Vander Hook on X](https://x.com/jodavaho/status/1911154899057795218)
+- *"Oh no the secret is out! Yes, Aider is the best coding tool around. I highly, highly recommend it to anyone."* — [Joshua D Vander Hook](https://x.com/jodavaho/status/1911154899057795218)
-- *"thanks to aider, i have started and finished three personal projects within the last two days"* — [joseph stalzyn on X](https://x.com/anitaheeder/status/1908338609645904160)
+- *"thanks to aider, i have started and finished three personal projects within the last two days"* — [joseph stalzyn](https://x.com/anitaheeder/status/1908338609645904160)
-- *"Been using aider as my daily driver for over a year ... I absolutely love the tool, like beyond words."* — [koleok on Discord](https://discord.com/channels/1131200896827654144/1273248471394291754/1356727448372252783)
+- *"Been using aider as my daily driver for over a year ... I absolutely love the tool, like beyond words."* — [koleok](https://discord.com/channels/1131200896827654144/1273248471394291754/1356727448372252783)
-- *"Aider ... is the tool to benchmark against."* — [BeetleB on Hacker News](https://news.ycombinator.com/item?id=43930201)
+- *"Aider ... is the tool to benchmark against."* — [BeetleB](https://news.ycombinator.com/item?id=43930201)
-- *"aider is really cool"* — [kache on X](https://x.com/yacineMTB/status/1911224442430124387)
+- *"aider is really cool"* — [kache (@yacineMTB)](https://x.com/yacineMTB/status/1911224442430124387)

@@ -1,6 +1,6 @@
 from packaging import version

-__version__ = "0.84.1.dev"
+__version__ = "0.83.1.dev"
 safe_version = __version__

 try:
@@ -40,22 +40,10 @@ def get_parser(default_config_files, git_root):
         config_file_parser_class=configargparse.YAMLConfigFileParser,
         auto_env_var_prefix="AIDER_",
     )
-    # List of valid edit formats for argparse validation & shtab completion.
-    # Dynamically gather them from the registered coder classes so the list
-    # stays in sync if new formats are added.
-    from aider import coders as _aider_coders
-
-    edit_format_choices = sorted(
-        {
-            c.edit_format
-            for c in _aider_coders.__all__
-            if hasattr(c, "edit_format") and c.edit_format is not None
-        }
-    )
     group = parser.add_argument_group("Main model")
     group.add_argument(
         "files", metavar="FILE", nargs="*", help="files to edit with an LLM (optional)"
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--model",
         metavar="MODEL",
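The hunk above, and the smaller ones that follow, show the main branch attaching shtab file-path completion to individual argparse actions and building the `--edit-format` choices dynamically from the registered coder classes. A minimal sketch of the completion pattern, assuming the third-party `shtab` package is installed; the option names below are illustrative, not aider's:

```python
# Sketch of shtab-style completion on argparse actions (assumes `pip install shtab`).
# Option names are illustrative only.
import argparse

import shtab

parser = argparse.ArgumentParser(prog="demo")
shtab.add_argument_to(parser, ["-s", "--print-completion"])  # emits a bash/zsh/tcsh completion script

# Marking an action with shtab.FILE tells the generated completion to offer file paths.
parser.add_argument("--settings-file", metavar="FILE").complete = shtab.FILE

# A choices list doubles as argparse validation and tab completion of the allowed values.
parser.add_argument("--edit-format", choices=["diff", "whole", "udiff"])

args = parser.parse_args(["--edit-format", "diff"])
print(args.edit_format)
```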
@@ -122,13 +110,13 @@ def get_parser(default_config_files, git_root):
         metavar="MODEL_SETTINGS_FILE",
         default=".aider.model.settings.yml",
         help="Specify a file with aider model settings for unknown models",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--model-metadata-file",
         metavar="MODEL_METADATA_FILE",
         default=".aider.model.metadata.json",
         help="Specify a file with context window and costs for unknown models",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--alias",
         action="append",

@@ -161,7 +149,6 @@ def get_parser(default_config_files, git_root):
         "--edit-format",
         "--chat-mode",
         metavar="EDIT_FORMAT",
-        choices=edit_format_choices,
         default=None,
         help="Specify what edit format the LLM should use (default depends on model)",
     )

@@ -196,7 +183,6 @@ def get_parser(default_config_files, git_root):
     group.add_argument(
         "--editor-edit-format",
         metavar="EDITOR_EDIT_FORMAT",
-        choices=edit_format_choices,
         default=None,
         help="Specify the edit format for the editor model (default: depends on editor model)",
     )

@@ -276,13 +262,13 @@ def get_parser(default_config_files, git_root):
         metavar="INPUT_HISTORY_FILE",
         default=default_input_history_file,
         help=f"Specify the chat input history file (default: {default_input_history_file})",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--chat-history-file",
         metavar="CHAT_HISTORY_FILE",
         default=default_chat_history_file,
         help=f"Specify the chat history file (default: {default_chat_history_file})",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--restore-chat-history",
         action=argparse.BooleanOptionalAction,

@@ -294,7 +280,7 @@ def get_parser(default_config_files, git_root):
         metavar="LLM_HISTORY_FILE",
         default=None,
         help="Log the conversation with the LLM to this file (for example, .aider.llm.history)",
-    ).complete = shtab.FILE
+    )

     ##########
     group = parser.add_argument_group("Output settings")

@@ -420,7 +406,7 @@ def get_parser(default_config_files, git_root):
         type=lambda path_str: resolve_aiderignore_path(path_str, git_root),
         default=default_aiderignore_file,
         help="Specify the aider ignore file (default: .aiderignore in git root)",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--subtree-only",
         action="store_true",

@@ -566,7 +552,7 @@ def get_parser(default_config_files, git_root):
         "--analytics-log",
         metavar="ANALYTICS_LOG_FILE",
         help="Specify a file to log analytics events",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--analytics-disable",
         action="store_true",

@@ -633,7 +619,7 @@ def get_parser(default_config_files, git_root):
             "Specify a file containing the message to send the LLM, process reply, then exit"
             " (disables chat mode)"
         ),
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--gui",
         "--browser",

@@ -651,7 +637,7 @@ def get_parser(default_config_files, git_root):
         "--apply",
         metavar="FILE",
         help="Apply the changes from the given file instead of running the chat (debug)",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--apply-clipboard-edits",
         action="store_true",

@@ -712,13 +698,13 @@ def get_parser(default_config_files, git_root):
         action="append",
         metavar="FILE",
         help="specify a file to edit (can be used multiple times)",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--read",
         action="append",
         metavar="FILE",
         help="specify a read-only file (can be used multiple times)",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--vim",
         action="store_true",

@@ -731,12 +717,6 @@ def get_parser(default_config_files, git_root):
         default=None,
         help="Specify the language to use in the chat (default: None, uses system settings)",
     )
-    group.add_argument(
-        "--commit-language",
-        metavar="COMMIT_LANGUAGE",
-        default=None,
-        help="Specify the language to use in the commit message (default: None, user language)",
-    )
     group.add_argument(
         "--yes-always",
         action="store_true",

@@ -754,7 +734,7 @@ def get_parser(default_config_files, git_root):
         "--load",
         metavar="LOAD_FILE",
         help="Load and execute /commands from a file on launch",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--encoding",
         default="utf-8",

@@ -775,7 +755,7 @@ def get_parser(default_config_files, git_root):
             "Specify the config file (default: search for .aider.conf.yml in git root, cwd"
             " or home directory)"
         ),
-    ).complete = shtab.FILE
+    )
     # This is a duplicate of the argument in the preparser and is a no-op by this time of
     # argument parsing, but it's here so that the help is displayed as expected.
     group.add_argument(

@@ -783,7 +763,7 @@ def get_parser(default_config_files, git_root):
         metavar="ENV_FILE",
         default=default_env_file(git_root),
         help="Specify the .env file to load (default: .env in git root)",
-    ).complete = shtab.FILE
+    )
     group.add_argument(
         "--suggest-shell-commands",
         action=argparse.BooleanOptionalAction,
@@ -96,7 +96,7 @@ class YamlHelpFormatter(argparse.HelpFormatter):
 # Place in your home dir, or at the root of your git repo.
 ##########################################################

-# Note: You can only put OpenAI and Anthropic API keys in the YAML
+# Note: You can only put OpenAI and Anthropic API keys in the yaml
 # config file. Keys for all APIs can be stored in a .env file
 # https://aider.chat/docs/config/dotenv.html

@@ -8,7 +8,8 @@ class AskPrompts(CoderPrompts):
 Answer questions about the supplied code.
 Always reply to the user in {language}.

-If you need to describe code changes, do so *briefly*.
+Describe code changes however you like, but elide unchanging code.
+Don't use SEARCH/REPLACE blocks or return huge swaths of unchanging code.
 """

     example_messages = []
@@ -118,7 +118,6 @@ class Coder:
     detect_urls = True
     ignore_mentions = None
     chat_language = None
-    commit_language = None
     file_watcher = None

     @classmethod

@@ -329,7 +328,6 @@
         num_cache_warming_pings=0,
         suggest_shell_commands=True,
         chat_language=None,
-        commit_language=None,
         detect_urls=True,
         ignore_mentions=None,
         total_tokens_sent=0,

@@ -343,7 +341,6 @@

         self.event = self.analytics.event
         self.chat_language = chat_language
-        self.commit_language = commit_language
         self.commit_before_message = []
         self.aider_commit_hashes = set()
         self.rejected_urls = set()

@@ -448,7 +445,6 @@
             fname = Path(fname)
             if self.repo and self.repo.git_ignored_file(fname):
                 self.io.tool_warning(f"Skipping {fname} that matches gitignore spec.")
-                continue

             if self.repo and self.repo.ignored_file(fname):
                 self.io.tool_warning(f"Skipping {fname} that matches aiderignore spec.")

@@ -1053,9 +1049,6 @@
         if not lang_code:
             return None

-        if lang_code.upper() in ("C", "POSIX"):
-            return None
-
         # Probably already a language name
         if (
             len(lang_code) > 3

@@ -1086,8 +1079,7 @@
             "ko": "Korean",
             "ru": "Russian",
         }
-        primary_lang_code = lang_code.replace("-", "_").split("_")[0].lower()
-        return fallback.get(primary_lang_code, lang_code)
+        return fallback.get(lang_code.split("_")[0].lower(), lang_code)

     def get_user_language(self):
         """
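The hunk above is the language-detection fix described in the v0.83.1 release notes: the main branch normalizes hyphenated locale codes before the fallback lookup, and refuses to treat `C`/`POSIX` as a language. A standalone sketch of that normalization, using a deliberately trimmed fallback table:

```python
# Sketch of hyphen-aware locale normalization; the fallback table here is an
# illustrative subset, not the full mapping from the diff.
fallback = {"en": "English", "fr": "French", "ko": "Korean", "ru": "Russian"}


def normalize_language(lang_code):
    if not lang_code or lang_code.upper() in ("C", "POSIX"):
        return None
    # "en-US", "en_US" and "en" all reduce to the primary code "en".
    primary = lang_code.replace("-", "_").split("_")[0].lower()
    return fallback.get(primary, lang_code)


assert normalize_language("en-US") == "English"
assert normalize_language("pt_BR") == "pt_BR"  # unknown codes pass through unchanged
assert normalize_language("POSIX") is None
```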
@@ -1098,7 +1090,6 @@
         2. ``locale.getlocale()``
         3. ``LANG`` / ``LANGUAGE`` / ``LC_ALL`` / ``LC_MESSAGES`` environment variables
         """
-
         # Explicit override
         if self.chat_language:
             return self.normalize_language(self.chat_language)

@@ -1107,11 +1098,9 @@
         try:
             lang = locale.getlocale()[0]
             if lang:
-                lang = self.normalize_language(lang)
+                return self.normalize_language(lang)
-            if lang:
-                return lang
         except Exception:
-            pass
+            pass  # pragma: no cover

         # Environment variables
         for env_var in ("LANG", "LANGUAGE", "LC_ALL", "LC_MESSAGES"):

@@ -1193,10 +1182,10 @@
         )
         rename_with_shell = ""

-        if user_lang:  # user_lang is the result of self.get_user_language()
-            language = user_lang
+        if self.chat_language:
+            language = self.chat_language
         else:
-            language = "the same language they are using"  # Default if no specific lang detected
+            language = "the same language they are using"

         if self.fence[0] == "`" * 4:
             quad_backtick_reminder = (
@@ -109,7 +109,7 @@ class RelativeIndenter:
         """

         if self.marker in text:
-            raise ValueError(f"Text already contains the outdent marker: {self.marker}")
+            raise ValueError("Text already contains the outdent marker: {self.marker}")

         lines = text.splitlines(keepends=True)

@@ -346,7 +346,7 @@ class Commands:
             return

         commit_message = args.strip() if args else None
-        self.coder.repo.commit(message=commit_message, coder=self.coder)
+        self.coder.repo.commit(message=commit_message)

     def cmd_lint(self, args="", fnames=None):
         "Lint and fix in-chat files or all dirty files if none in chat"

@@ -1392,30 +1392,7 @@
         "Print out the current settings"
         settings = format_settings(self.parser, self.args)
         announcements = "\n".join(self.coder.get_announcements())
-
-        # Build metadata for the active models (main, editor, weak)
-        model_sections = []
-        active_models = [
-            ("Main model", self.coder.main_model),
-            ("Editor model", getattr(self.coder.main_model, "editor_model", None)),
-            ("Weak model", getattr(self.coder.main_model, "weak_model", None)),
-        ]
-        for label, model in active_models:
-            if not model:
-                continue
-            info = getattr(model, "info", {}) or {}
-            if not info:
-                continue
-            model_sections.append(f"{label} ({model.name}):")
-            for k, v in sorted(info.items()):
-                model_sections.append(f" {k}: {v}")
-            model_sections.append("")  # blank line between models
-
-        model_metadata = "\n".join(model_sections)
-
         output = f"{announcements}\n{settings}"
-        if model_metadata:
-            output += "\n" + model_metadata
         self.io.tool_output(output)

     def completions_raw_load(self, document, complete_event):
@@ -749,7 +749,7 @@ class InputOutput:
         if not self.llm_history_file:
             return
         timestamp = datetime.now().isoformat(timespec="seconds")
-        with open(self.llm_history_file, "a", encoding="utf-8") as log_file:
+        with open(self.llm_history_file, "a", encoding=self.encoding) as log_file:
             log_file.write(f"{role.upper()} {timestamp}\n")
             log_file.write(content + "\n")

@@ -993,7 +993,6 @@ def main(argv=None, input=None, output=None, force_git_root=None, return_coder=F
         num_cache_warming_pings=args.cache_keepalive_pings,
         suggest_shell_commands=args.suggest_shell_commands,
         chat_language=args.chat_language,
-        commit_language=args.commit_language,
         detect_urls=args.detect_urls,
         auto_copy_context=args.copy_paste,
         auto_accept_architect=args.auto_accept_architect,
@@ -8,7 +8,6 @@ import platform
 import sys
 import time
 from dataclasses import dataclass, fields
-from datetime import datetime
 from pathlib import Path
 from typing import Optional, Union

@@ -16,10 +15,8 @@ import json5
 import yaml
 from PIL import Image

-from aider import __version__
 from aider.dump import dump  # noqa: F401
 from aider.llm import litellm
-from aider.openrouter import OpenRouterModelManager
 from aider.sendchat import ensure_alternating_roles, sanity_check_messages
 from aider.utils import check_pip_install_extra

@@ -72,8 +69,6 @@ claude-3-opus-20240229
 claude-3-sonnet-20240229
 claude-3-5-sonnet-20240620
 claude-3-5-sonnet-20241022
-claude-sonnet-4-20250514
-claude-opus-4-20250514
 """

 ANTHROPIC_MODELS = [ln.strip() for ln in ANTHROPIC_MODELS.splitlines() if ln.strip()]

@@ -81,9 +76,9 @@ ANTHROPIC_MODELS = [ln.strip() for ln in ANTHROPIC_MODELS.splitlines() if ln.str
 # Mapping of model aliases to their canonical names
 MODEL_ALIASES = {
     # Claude models
-    "sonnet": "anthropic/claude-sonnet-4-20250514",
+    "sonnet": "anthropic/claude-3-7-sonnet-20250219",
     "haiku": "claude-3-5-haiku-20241022",
-    "opus": "claude-opus-4-20250514",
+    "opus": "claude-3-opus-20240229",
     # GPT models
     "4": "gpt-4-0613",
     "4o": "gpt-4o",

@@ -96,8 +91,8 @@ MODEL_ALIASES = {
     "flash": "gemini/gemini-2.5-flash-preview-04-17",
     "quasar": "openrouter/openrouter/quasar-alpha",
     "r1": "deepseek/deepseek-reasoner",
-    "gemini-2.5-pro": "gemini/gemini-2.5-pro-preview-06-05",
-    "gemini": "gemini/gemini-2.5-pro-preview-06-05",
+    "gemini-2.5-pro": "gemini/gemini-2.5-pro-preview-05-06",
+    "gemini": "gemini/gemini-2.5-pro-preview-05-06",
     "gemini-exp": "gemini/gemini-2.5-pro-exp-03-25",
     "grok3": "xai/grok-3-beta",
     "optimus": "openrouter/openrouter/optimus-alpha",
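The alias hunks above are where the default `sonnet` and `opus` names move between Claude generations; resolution is just a dictionary lookup that falls back to the given name. A minimal sketch, with the table trimmed to entries shown in these hunks (the helper name `resolve_model_name` is illustrative, not aider's API):

```python
# Illustrative subset of the alias table from the hunks above.
MODEL_ALIASES = {
    "sonnet": "anthropic/claude-sonnet-4-20250514",
    "opus": "claude-opus-4-20250514",
    "r1": "deepseek/deepseek-reasoner",
}


def resolve_model_name(name: str) -> str:
    # Unknown names pass through unchanged, so full model ids keep working.
    return MODEL_ALIASES.get(name, name)


print(resolve_model_name("sonnet"))  # anthropic/claude-sonnet-4-20250514
print(resolve_model_name("gpt-4o"))  # gpt-4o (not an alias, returned as-is)
```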
@@ -154,13 +149,8 @@ class ModelInfoManager:
         self.verify_ssl = True
         self._cache_loaded = False

-        # Manager for the cached OpenRouter model database
-        self.openrouter_manager = OpenRouterModelManager()
-
     def set_verify_ssl(self, verify_ssl):
         self.verify_ssl = verify_ssl
-        if hasattr(self, "openrouter_manager"):
-            self.openrouter_manager.set_verify_ssl(verify_ssl)

     def _load_cache(self):
         if self._cache_loaded:

@@ -242,12 +232,6 @@ class ModelInfoManager:
             return litellm_info

         if not cached_info and model.startswith("openrouter/"):
-            # First try using the locally cached OpenRouter model database
-            openrouter_info = self.openrouter_manager.get_model_info(model)
-            if openrouter_info:
-                return openrouter_info
-
-            # Fallback to legacy web-scraping if the API cache does not contain the model
             openrouter_info = self.fetch_openrouter_model_info(model)
             if openrouter_info:
                 return openrouter_info
@@ -877,57 +861,6 @@ class Model(ModelSettings):
     def is_ollama(self):
         return self.name.startswith("ollama/") or self.name.startswith("ollama_chat/")

-    def github_copilot_token_to_open_ai_key(self, extra_headers):
-        # check to see if there's an openai api key
-        # If so, check to see if it's expire
-        openai_api_key = "OPENAI_API_KEY"
-
-        if openai_api_key not in os.environ or (
-            int(dict(x.split("=") for x in os.environ[openai_api_key].split(";"))["exp"])
-            < int(datetime.now().timestamp())
-        ):
-            import requests
-
-            class GitHubCopilotTokenError(Exception):
-                """Custom exception for GitHub Copilot token-related errors."""
-
-                pass
-
-            # Validate GitHub Copilot token exists
-            if "GITHUB_COPILOT_TOKEN" not in os.environ:
-                raise KeyError("GITHUB_COPILOT_TOKEN environment variable not found")
-
-            github_token = os.environ["GITHUB_COPILOT_TOKEN"]
-            if not github_token.strip():
-                raise KeyError("GITHUB_COPILOT_TOKEN environment variable is empty")
-
-            headers = {
-                "Authorization": f"Bearer {os.environ['GITHUB_COPILOT_TOKEN']}",
-                "Editor-Version": extra_headers["Editor-Version"],
-                "Copilot-Integration-Id": extra_headers["Copilot-Integration-Id"],
-                "Content-Type": "application/json",
-            }
-
-            url = "https://api.github.com/copilot_internal/v2/token"
-            res = requests.get(url, headers=headers)
-            if res.status_code != 200:
-                safe_headers = {k: v for k, v in headers.items() if k != "Authorization"}
-                token_preview = github_token[:5] + "..." if len(github_token) >= 5 else github_token
-                safe_headers["Authorization"] = f"Bearer {token_preview}"
-                raise GitHubCopilotTokenError(
-                    f"GitHub Copilot API request failed (Status: {res.status_code})\n"
-                    f"URL: {url}\n"
-                    f"Headers: {json.dumps(safe_headers, indent=2)}\n"
-                    f"JSON: {res.text}"
-                )
-
-            response_data = res.json()
-            token = response_data.get("token")
-            if not token:
-                raise GitHubCopilotTokenError("Response missing 'token' field")
-
-            os.environ[openai_api_key] = token
-
     def send_completion(self, messages, functions, stream, temperature=None):
         if os.environ.get("AIDER_SANITY_CHECK_TURNS"):
             sanity_check_messages(messages)
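The removed helper above (the "automatically refresh GitHub Copilot tokens" feature from the v0.84.0 notes) treats the OPENAI_API_KEY value as a Copilot token whose `;`-separated fields include an `exp` expiry timestamp, and refreshes it from the GitHub endpoint when it has expired. A small sketch of just the expiry check; the token value here is illustrative, real Copilot tokens carry more fields:

```python
# Sketch of the expiry check used in the removed helper above.
# The token string is illustrative, not a real Copilot token.
from datetime import datetime

token_value = "tid=abc123;exp=1717680000;sku=free"

fields = dict(part.split("=") for part in token_value.split(";"))
expired = int(fields["exp"]) < int(datetime.now().timestamp())
print("needs refresh" if expired else "still valid")
```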
@@ -969,16 +902,6 @@ class Model(ModelSettings):
             dump(kwargs)
         kwargs["messages"] = messages

-        # Are we using github copilot?
-        if "GITHUB_COPILOT_TOKEN" in os.environ:
-            if "extra_headers" not in kwargs:
-                kwargs["extra_headers"] = {
-                    "Editor-Version": f"aider/{__version__}",
-                    "Copilot-Integration-Id": "vscode-chat",
-                }
-
-            self.github_copilot_token_to_open_ai_key(kwargs["extra_headers"])
-
         res = litellm.completion(**kwargs)
         return hash_object, res

@@ -55,9 +55,9 @@ def try_to_select_default_model():
     # Check if the user is on a free tier
     is_free_tier = check_openrouter_tier(openrouter_key)
     if is_free_tier:
-        return "openrouter/deepseek/deepseek-r1:free"
+        return "openrouter/google/gemini-2.5-pro-exp-03-25:free"
     else:
-        return "openrouter/anthropic/claude-sonnet-4"
+        return "openrouter/anthropic/claude-3.7-sonnet"

     # Select model based on other available API keys
     model_key_pairs = [
@@ -1,128 +0,0 @@
-"""
-OpenRouter model metadata caching and lookup.
-
-This module keeps a local cached copy of the OpenRouter model list
-(downloaded from ``https://openrouter.ai/api/v1/models``) and exposes a
-helper class that returns metadata for a given model in a format compatible
-with litellm’s ``get_model_info``.
-"""
-from __future__ import annotations
-
-import json
-import time
-from pathlib import Path
-from typing import Dict
-
-import requests
-
-
-def _cost_per_token(val: str | None) -> float | None:
-    """Convert a price string (USD per token) to a float."""
-    if val in (None, "", "0"):
-        return 0.0 if val == "0" else None
-    try:
-        return float(val)
-    except Exception:  # noqa: BLE001
-        return None
-
-
-class OpenRouterModelManager:
-    MODELS_URL = "https://openrouter.ai/api/v1/models"
-    CACHE_TTL = 60 * 60 * 24  # 24 h
-
-    def __init__(self) -> None:
-        self.cache_dir = Path.home() / ".aider" / "caches"
-        self.cache_file = self.cache_dir / "openrouter_models.json"
-        self.content: Dict | None = None
-        self.verify_ssl: bool = True
-        self._cache_loaded = False
-
-    # ------------------------------------------------------------------ #
-    # Public API                                                         #
-    # ------------------------------------------------------------------ #
-    def set_verify_ssl(self, verify_ssl: bool) -> None:
-        """Enable/disable SSL verification for API requests."""
-        self.verify_ssl = verify_ssl
-
-    def get_model_info(self, model: str) -> Dict:
-        """
-        Return metadata for *model* or an empty ``dict`` when unknown.
-
-        ``model`` should use the aider naming convention, e.g.
-        ``openrouter/nousresearch/deephermes-3-mistral-24b-preview:free``.
-        """
-        self._ensure_content()
-        if not self.content or "data" not in self.content:
-            return {}
-
-        route = self._strip_prefix(model)
-
-        # Consider both the exact id and id without any “:suffix”.
-        candidates = {route}
-        if ":" in route:
-            candidates.add(route.split(":", 1)[0])
-
-        record = next((item for item in self.content["data"] if item.get("id") in candidates), None)
-        if not record:
-            return {}
-
-        context_len = (
-            record.get("top_provider", {}).get("context_length")
-            or record.get("context_length")
-            or None
-        )
-
-        pricing = record.get("pricing", {})
-        return {
-            "max_input_tokens": context_len,
-            "max_tokens": context_len,
-            "max_output_tokens": context_len,
-            "input_cost_per_token": _cost_per_token(pricing.get("prompt")),
-            "output_cost_per_token": _cost_per_token(pricing.get("completion")),
-            "litellm_provider": "openrouter",
-        }
-
-    # ------------------------------------------------------------------ #
-    # Internal helpers                                                   #
-    # ------------------------------------------------------------------ #
-    def _strip_prefix(self, model: str) -> str:
-        return model[len("openrouter/") :] if model.startswith("openrouter/") else model
-
-    def _ensure_content(self) -> None:
-        self._load_cache()
-        if not self.content:
-            self._update_cache()
-
-    def _load_cache(self) -> None:
-        if self._cache_loaded:
-            return
-        try:
-            self.cache_dir.mkdir(parents=True, exist_ok=True)
-            if self.cache_file.exists():
-                cache_age = time.time() - self.cache_file.stat().st_mtime
-                if cache_age < self.CACHE_TTL:
-                    try:
-                        self.content = json.loads(self.cache_file.read_text())
-                    except json.JSONDecodeError:
-                        self.content = None
-        except OSError:
-            # Cache directory might be unwritable; ignore.
-            pass
-
-        self._cache_loaded = True
-
-    def _update_cache(self) -> None:
-        try:
-            response = requests.get(self.MODELS_URL, timeout=10, verify=self.verify_ssl)
-            if response.status_code == 200:
-                self.content = response.json()
-                try:
-                    self.cache_file.write_text(json.dumps(self.content, indent=2))
-                except OSError:
-                    pass  # Non-fatal if we can’t write the cache
-        except Exception as ex:  # noqa: BLE001
-            print(f"Failed to fetch OpenRouter model list: {ex}")
-            try:
-                self.cache_file.write_text("{}")
-            except OSError:
-                pass
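The whole module above is the local OpenRouter model-database cache mentioned in the v0.83.2 notes; it only exists on the main branch. A short usage sketch, assuming the module is importable as `aider.openrouter` (the import path matches the models.py hunk earlier; the model id is illustrative, and a real call reads or refreshes `~/.aider/caches/openrouter_models.json`):

```python
# Usage sketch for the cached OpenRouter model database shown above.
from aider.openrouter import OpenRouterModelManager

manager = OpenRouterModelManager()
info = manager.get_model_info("openrouter/deepseek/deepseek-r1:free")
if info:
    # Keys mirror litellm's get_model_info output.
    print(info["max_input_tokens"], info["input_cost_per_token"])
else:
    print("model not found in the cached OpenRouter database")
```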
@@ -21,7 +21,6 @@ import pathspec
 from aider import prompts, utils

 from .dump import dump  # noqa: F401
-from .waiting import WaitingSpinner

 ANY_GIT_ERROR += [
     OSError,

@@ -210,8 +209,6 @@ class GitRepo:
         else:
             user_language = None
             if coder:
-                user_language = coder.commit_language
-            if not user_language:
                 user_language = coder.get_user_language()
         commit_message = self.get_commit_message(diffs, context, user_language)

@@ -334,35 +331,25 @@
         content += diffs

         system_content = self.commit_prompt or prompts.commit_system

         language_instruction = ""
         if user_language:
             language_instruction = f"\n- Is written in {user_language}."
         system_content = system_content.format(language_instruction=language_instruction)

-        commit_message = None
-        for model in self.models:
-            spinner_text = f"Generating commit message with {model.name}"
-            with WaitingSpinner(spinner_text):
-                if model.system_prompt_prefix:
-                    current_system_content = model.system_prompt_prefix + "\n" + system_content
-                else:
-                    current_system_content = system_content
-
         messages = [
-            dict(role="system", content=current_system_content),
+            dict(role="system", content=system_content),
             dict(role="user", content=content),
         ]

+        commit_message = None
+        for model in self.models:
             num_tokens = model.token_count(messages)
             max_tokens = model.info.get("max_input_tokens") or 0

             if max_tokens and num_tokens > max_tokens:
                 continue

             commit_message = model.simple_send_with_retries(messages)
             if commit_message:
-                break  # Found a model that could generate the message
+                break

         if not commit_message:
             self.io.tool_error("Failed to generate commit message!")
@ -399,20 +386,14 @@ class GitRepo:
|
||||||
try:
|
try:
|
||||||
if current_branch_has_commits:
|
if current_branch_has_commits:
|
||||||
args = ["HEAD", "--"] + list(fnames)
|
args = ["HEAD", "--"] + list(fnames)
|
||||||
diffs += self.repo.git.diff(*args, stdout_as_string=False).decode(
|
diffs += self.repo.git.diff(*args)
|
||||||
self.io.encoding, "replace"
|
|
||||||
)
|
|
||||||
return diffs
|
return diffs
|
||||||
|
|
||||||
wd_args = ["--"] + list(fnames)
|
wd_args = ["--"] + list(fnames)
|
||||||
index_args = ["--cached"] + wd_args
|
index_args = ["--cached"] + wd_args
|
||||||
|
|
||||||
diffs += self.repo.git.diff(*index_args, stdout_as_string=False).decode(
|
diffs += self.repo.git.diff(*index_args)
|
||||||
self.io.encoding, "replace"
|
diffs += self.repo.git.diff(*wd_args)
|
||||||
)
|
|
||||||
diffs += self.repo.git.diff(*wd_args, stdout_as_string=False).decode(
|
|
||||||
self.io.encoding, "replace"
|
|
||||||
)
|
|
||||||
|
|
||||||
return diffs
|
return diffs
|
||||||
except ANY_GIT_ERROR as err:
|
except ANY_GIT_ERROR as err:
|
||||||
|
@ -426,9 +407,7 @@ class GitRepo:
|
||||||
args += ["--color=never"]
|
args += ["--color=never"]
|
||||||
|
|
||||||
args += [from_commit, to_commit]
|
args += [from_commit, to_commit]
|
||||||
diffs = self.repo.git.diff(*args, stdout_as_string=False).decode(
|
diffs = self.repo.git.diff(*args)
|
||||||
self.io.encoding, "replace"
|
|
||||||
)
|
|
||||||
|
|
||||||
return diffs
|
return diffs
|
||||||
|
|
||||||
|
|
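The last two hunks are about how diff text is read back from git: the main side asks GitPython for raw bytes (`stdout_as_string=False`) and decodes them with the `"replace"` error handler, so bytes that are not valid text in the configured encoding become replacement characters instead of raising. A minimal sketch of that pattern, assuming a GitPython `Repo` on the current directory and `"utf-8"` as a stand-in for `self.io.encoding`:

from git import Repo  # GitPython

repo = Repo(".")  # assumes the current directory is a git repository
encoding = "utf-8"  # stand-in for self.io.encoding

# Ask git for bytes, then decode leniently; undecodable bytes become U+FFFD
raw = repo.git.diff("HEAD", "--", stdout_as_string=False)
diff_text = raw.decode(encoding, "replace")
print(diff_text[:200])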
@@ -19,7 +19,7 @@ from tqdm import tqdm

 from aider.dump import dump
 from aider.special import filter_important_files
-from aider.waiting import Spinner
+from aider.utils import Spinner

 # tree_sitter is throwing a FutureWarning
 warnings.simplefilter("ignore", category=FutureWarning)
@@ -49,7 +49,7 @@
     },
     "openrouter/deepseek/deepseek-chat-v3-0324": {
         "max_tokens": 8192,
-        "max_input_tokens": 131072,
+        "max_input_tokens": 64000,
         "max_output_tokens": 8192,
         "input_cost_per_token": 0.00000055,
         "input_cost_per_token_cache_hit": 0.00000014,

@@ -432,35 +432,6 @@
         "supported_output_modalities": ["text"],
         "source": "https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-preview"
     },
-    "gemini-2.5-pro-preview-06-05": {
-        "max_tokens": 65536,
-        "max_input_tokens": 1048576,
-        "max_output_tokens": 65536,
-        "max_images_per_prompt": 3000,
-        "max_videos_per_prompt": 10,
-        "max_video_length": 1,
-        "max_audio_length_hours": 8.4,
-        "max_audio_per_prompt": 1,
-        "max_pdf_size_mb": 30,
-        "input_cost_per_audio_token": 0.00000125,
-        "input_cost_per_token": 0.00000125,
-        "input_cost_per_token_above_200k_tokens": 0.0000025,
-        "output_cost_per_token": 0.00001,
-        "output_cost_per_token_above_200k_tokens": 0.000015,
-        "litellm_provider": "vertex_ai-language-models",
-        "mode": "chat",
-        "supports_reasoning": true,
-        "supports_system_messages": true,
-        "supports_function_calling": true,
-        "supports_vision": true,
-        "supports_response_schema": true,
-        "supports_audio_output": false,
-        "supports_tool_choice": true,
-        "supported_endpoints": ["/v1/chat/completions", "/v1/completions", "/v1/batch"],
-        "supported_modalities": ["text", "image", "audio", "video"],
-        "supported_output_modalities": ["text"],
-        "source": "https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-preview"
-    },
     "gemini/gemini-2.5-pro-preview-05-06": {
         "max_tokens": 65536,
         "max_input_tokens": 1048576,

@@ -490,35 +461,6 @@
         "supported_output_modalities": ["text"],
         "source": "https://ai.google.dev/gemini-api/docs/pricing#gemini-2.5-pro-preview"
     },
-    "gemini/gemini-2.5-pro-preview-06-05": {
-        "max_tokens": 65536,
-        "max_input_tokens": 1048576,
-        "max_output_tokens": 65536,
-        "max_images_per_prompt": 3000,
-        "max_videos_per_prompt": 10,
-        "max_video_length": 1,
-        "max_audio_length_hours": 8.4,
-        "max_audio_per_prompt": 1,
-        "max_pdf_size_mb": 30,
-        "input_cost_per_audio_token": 0.0000007,
-        "input_cost_per_token": 0.00000125,
-        "input_cost_per_token_above_200k_tokens": 0.0000025,
-        "output_cost_per_token": 0.00001,
-        "output_cost_per_token_above_200k_tokens": 0.000015,
-        "litellm_provider": "gemini",
-        "mode": "chat",
-        "rpm": 10000,
-        "tpm": 10000000,
-        "supports_system_messages": true,
-        "supports_function_calling": true,
-        "supports_vision": true,
-        "supports_response_schema": true,
-        "supports_audio_output": false,
-        "supports_tool_choice": true,
-        "supported_modalities": ["text", "image", "audio", "video"],
-        "supported_output_modalities": ["text"],
-        "source": "https://ai.google.dev/gemini-api/docs/pricing#gemini-2.5-pro-preview"
-    },
     "together_ai/Qwen/Qwen3-235B-A22B-fp8-tput": {
         "input_cost_per_token": 0.0000002,
         "output_cost_per_token": 0.0000006,
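These litellm-style metadata entries drive context-window checks and cost reporting: `max_input_tokens` bounds what can be sent, and the per-token cost fields turn token counts into dollars. As a rough worked example of how such fields are typically applied (the token counts and the output price below are made up for illustration):

model_info = {
    "max_input_tokens": 131072,
    "input_cost_per_token": 0.00000055,
    "output_cost_per_token": 0.00000219,  # hypothetical output price, for illustration only
}

prompt_tokens, completion_tokens = 12_000, 800  # made-up usage numbers

if prompt_tokens > model_info["max_input_tokens"]:
    raise ValueError("prompt would not fit in the context window")

cost = (
    prompt_tokens * model_info["input_cost_per_token"]
    + completion_tokens * model_info["output_cost_per_token"]
)
print(f"${cost:.6f}")  # 12000*0.00000055 + 800*0.00000219 = $0.008352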
@@ -1399,13 +1399,6 @@
   use_repo_map: true
   weak_model_name: gemini/gemini-2.5-flash-preview-04-17

-- name: gemini/gemini-2.5-pro-preview-06-05
-  overeager: true
-  edit_format: diff-fenced
-  use_repo_map: true
-  weak_model_name: gemini/gemini-2.5-flash-preview-04-17
-  accepts_settings: ["thinking_tokens"]
-
 - name: vertex_ai/gemini-2.5-pro-preview-05-06
   edit_format: diff-fenced
   use_repo_map: true

@@ -1413,27 +1406,12 @@
   overeager: true
   editor_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17

-- name: vertex_ai/gemini-2.5-pro-preview-06-05
-  edit_format: diff-fenced
-  use_repo_map: true
-  weak_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
-  overeager: true
-  editor_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
-  accepts_settings: ["thinking_tokens"]
-
 - name: openrouter/google/gemini-2.5-pro-preview-05-06
   overeager: true
   edit_format: diff-fenced
   use_repo_map: true
   weak_model_name: openrouter/google/gemini-2.0-flash-001

-- name: openrouter/google/gemini-2.5-pro-preview-06-05
-  overeager: true
-  edit_format: diff-fenced
-  use_repo_map: true
-  weak_model_name: openrouter/google/gemini-2.0-flash-001
-  accepts_settings: ["thinking_tokens"]
-
 #- name: openrouter/qwen/qwen3-235b-a22b
 #  system_prompt_prefix: "/no_think"
 #  use_temperature: 0.7

@@ -1458,332 +1436,3 @@
 #  min_p: 0.0
 #  temperature: 0.7

-
-- name: claude-sonnet-4-20250514
-  edit_format: diff
-  weak_model_name: claude-3-5-haiku-20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: claude-sonnet-4-20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: anthropic/claude-sonnet-4-20250514
-  edit_format: diff
-  weak_model_name: anthropic/claude-3-5-haiku-20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: anthropic/claude-sonnet-4-20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock/anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock/anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: bedrock/anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: vertex_ai/claude-sonnet-4@20250514
-  edit_format: diff
-  weak_model_name: vertex_ai/claude-3-5-haiku@20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    max_tokens: 64000
-  editor_model_name: vertex_ai/claude-sonnet-4@20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
-  edit_format: diff
-  weak_model_name: vertex_ai/claude-3-5-haiku@20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    max_tokens: 64000
-  editor_model_name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: openrouter/anthropic/claude-sonnet-4
-  edit_format: diff
-  weak_model_name: openrouter/anthropic/claude-3-5-haiku
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: openrouter/anthropic/claude-sonnet-4
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/eu.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: eu.anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: eu.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: eu.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: us.anthropic.claude-sonnet-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: us.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 64000
-  cache_control: true
-  editor_model_name: us.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: claude-opus-4-20250514
-  edit_format: diff
-  weak_model_name: claude-3-5-haiku-20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: claude-sonnet-4-20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: anthropic/claude-opus-4-20250514
-  edit_format: diff
-  weak_model_name: anthropic/claude-3-5-haiku-20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: anthropic/claude-sonnet-4-20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/us.anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: bedrock_converse/eu.anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: bedrock_converse/eu.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: eu.anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: eu.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: eu.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: us.anthropic.claude-opus-4-20250514-v1:0
-  edit_format: diff
-  weak_model_name: us.anthropic.claude-3-5-haiku-20241022-v1:0
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: us.anthropic.claude-sonnet-4-20250514-v1:0
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: vertex_ai/claude-opus-4@20250514
-  edit_format: diff
-  weak_model_name: vertex_ai/claude-3-5-haiku@20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    max_tokens: 32000
-  editor_model_name: vertex_ai/claude-sonnet-4@20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: vertex_ai-anthropic_models/vertex_ai/claude-opus-4@20250514
-  edit_format: diff
-  weak_model_name: vertex_ai/claude-3-5-haiku@20241022
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    max_tokens: 32000
-  editor_model_name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
-
-- name: vertex_ai/gemini-2.5-flash-preview-05-20
-  edit_format: diff
-  use_repo_map: true
-  accepts_settings: ["reasoning_effort", "thinking_tokens"]
-- name: openrouter/anthropic/claude-opus-4
-  edit_format: diff
-  weak_model_name: openrouter/anthropic/claude-3-5-haiku
-  use_repo_map: true
-  examples_as_sys_msg: false
-  extra_params:
-    extra_headers:
-      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
-    max_tokens: 32000
-  cache_control: true
-  editor_model_name: openrouter/anthropic/claude-sonnet-4
-  editor_edit_format: editor-diff
-  accepts_settings: ["thinking_tokens"]
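Entries like the ones above pair a model name with an edit format, weak/editor model choices, and `extra_params` that get merged into each request (for example the `anthropic-beta` header and a `max_tokens` ceiling). A generic sketch of that merge step, under the assumption that one entry has already been loaded into a plain dict; the function name and structure here are illustrative, not aider's actual API:

model_settings = {
    "name": "claude-sonnet-4-20250514",
    "extra_params": {
        "extra_headers": {
            "anthropic-beta": "prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19",
        },
        "max_tokens": 64000,
    },
}


def build_request_kwargs(settings, messages):
    """Fold per-model extra_params into the keyword arguments of a chat request."""
    kwargs = {"model": settings["name"], "messages": messages}
    for key, value in settings.get("extra_params", {}).items():
        if isinstance(value, dict):
            # nested dicts (like extra_headers) are merged rather than replaced
            kwargs.setdefault(key, {}).update(value)
        else:
            kwargs[key] = value
    return kwargs


print(build_request_kwargs(model_settings, [{"role": "user", "content": "hi"}]))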
168 aider/utils.py

@@ -3,12 +3,13 @@ import platform
 import subprocess
 import sys
 import tempfile
+import time
 from pathlib import Path

 import oslex
+from rich.console import Console

 from aider.dump import dump  # noqa: F401
-from aider.waiting import Spinner

 IMAGE_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".bmp", ".tiff", ".webp", ".pdf"}

@@ -250,6 +251,154 @@ def run_install(cmd):
     return False, output


+class Spinner:
+    """
+    Minimal spinner that scans a single marker back and forth across a line.
+
+    The animation is pre-rendered into a list of frames. If the terminal
+    cannot display unicode the frames are converted to plain ASCII.
+    """
+
+    last_frame_idx = 0  # Class variable to store the last frame index
+
+    def __init__(self, text: str, width: int = 7):
+        self.text = text
+        self.start_time = time.time()
+        self.last_update = 0.0
+        self.visible = False
+        self.is_tty = sys.stdout.isatty()
+        self.console = Console()
+
+        # Pre-render the animation frames using pure ASCII so they will
+        # always display, even on very limited terminals.
+        ascii_frames = [
+            "#=        ",  # C1 C2 space(8)
+            "=#        ",  # C2 C1 space(8)
+            " =#       ",  # space(1) C2 C1 space(7)
+            "  =#      ",  # space(2) C2 C1 space(6)
+            "   =#     ",  # space(3) C2 C1 space(5)
+            "    =#    ",  # space(4) C2 C1 space(4)
+            "     =#   ",  # space(5) C2 C1 space(3)
+            "      =#  ",  # space(6) C2 C1 space(2)
+            "       =# ",  # space(7) C2 C1 space(1)
+            "        =#",  # space(8) C2 C1
+            "        #=",  # space(8) C1 C2
+            "       #= ",  # space(7) C1 C2 space(1)
+            "      #=  ",  # space(6) C1 C2 space(2)
+            "     #=   ",  # space(5) C1 C2 space(3)
+            "    #=    ",  # space(4) C1 C2 space(4)
+            "   #=     ",  # space(3) C1 C2 space(5)
+            "  #=      ",  # space(2) C1 C2 space(6)
+            " #=       ",  # space(1) C1 C2 space(7)
+        ]
+
+        self.unicode_palette = "░█"
+        xlate_from, xlate_to = ("=#", self.unicode_palette)
+
+        # If unicode is supported, swap the ASCII chars for nicer glyphs.
+        if self._supports_unicode():
+            translation_table = str.maketrans(xlate_from, xlate_to)
+            frames = [f.translate(translation_table) for f in ascii_frames]
+            self.scan_char = xlate_to[xlate_from.find("#")]
+        else:
+            frames = ascii_frames
+            self.scan_char = "#"
+
+        # Bounce the scanner back and forth.
+        self.frames = frames
+        self.frame_idx = Spinner.last_frame_idx  # Initialize from class variable
+        self.width = len(frames[0]) - 2  # number of chars between the brackets
+        self.animation_len = len(frames[0])
+        self.last_display_len = 0  # Length of the last spinner line (frame + text)
+
+    def _supports_unicode(self) -> bool:
+        if not self.is_tty:
+            return False
+        try:
+            out = self.unicode_palette
+            out += "\b" * len(self.unicode_palette)
+            out += " " * len(self.unicode_palette)
+            out += "\b" * len(self.unicode_palette)
+            sys.stdout.write(out)
+            sys.stdout.flush()
+            return True
+        except UnicodeEncodeError:
+            return False
+        except Exception:
+            return False
+
+    def _next_frame(self) -> str:
+        frame = self.frames[self.frame_idx]
+        self.frame_idx = (self.frame_idx + 1) % len(self.frames)
+        Spinner.last_frame_idx = self.frame_idx  # Update class variable
+        return frame
+
+    def step(self, text: str = None) -> None:
+        if text is not None:
+            self.text = text
+
+        if not self.is_tty:
+            return
+
+        now = time.time()
+        if not self.visible and now - self.start_time >= 0.5:
+            self.visible = True
+            self.last_update = 0.0
+            if self.is_tty:
+                self.console.show_cursor(False)
+
+        if not self.visible or now - self.last_update < 0.1:
+            return
+
+        self.last_update = now
+        frame_str = self._next_frame()
+
+        # Determine the maximum width for the spinner line
+        # Subtract 2 as requested, to leave a margin or prevent cursor wrapping issues
+        max_spinner_width = self.console.width - 2
+        if max_spinner_width < 0:  # Handle extremely narrow terminals
+            max_spinner_width = 0
+
+        current_text_payload = f" {self.text}"
+        line_to_display = f"{frame_str}{current_text_payload}"
+
+        # Truncate the line if it's too long for the console width
+        if len(line_to_display) > max_spinner_width:
+            line_to_display = line_to_display[:max_spinner_width]
+
+        len_line_to_display = len(line_to_display)
+
+        # Calculate padding to clear any remnants from a longer previous line
+        padding_to_clear = " " * max(0, self.last_display_len - len_line_to_display)
+
+        # Write the spinner frame, text, and any necessary clearing spaces
+        sys.stdout.write(f"\r{line_to_display}{padding_to_clear}")
+        self.last_display_len = len_line_to_display
+
+        # Calculate number of backspaces to position cursor at the scanner character
+        scan_char_abs_pos = frame_str.find(self.scan_char)
+
+        # Total characters written to the line (frame + text + padding)
+        total_chars_written_on_line = len_line_to_display + len(padding_to_clear)
+
+        # num_backspaces will be non-positive if scan_char_abs_pos is beyond
+        # total_chars_written_on_line (e.g., if the scan char itself was truncated).
+        # In such cases, (effectively) 0 backspaces are written,
+        # and the cursor stays at the end of the line.
+        num_backspaces = total_chars_written_on_line - scan_char_abs_pos
+        sys.stdout.write("\b" * num_backspaces)
+        sys.stdout.flush()
+
+    def end(self) -> None:
+        if self.visible and self.is_tty:
+            clear_len = self.last_display_len  # Use the length of the last displayed content
+            sys.stdout.write("\r" + " " * clear_len + "\r")
+            sys.stdout.flush()
+            self.console.show_cursor(True)
+            self.visible = False
+
+
 def find_common_root(abs_fnames):
     try:
         if len(abs_fnames) == 1:

@@ -336,3 +485,20 @@ def printable_shell_command(cmd_list):
         str: Shell-escaped command string.
     """
     return oslex.join(cmd_list)
+
+
+def main():
+    spinner = Spinner("Running spinner...")
+    try:
+        for _ in range(100):
+            time.sleep(0.15)
+            spinner.step()
+        print("Success!")
+    except KeyboardInterrupt:
+        print("\nInterrupted by user.")
+    finally:
+        spinner.end()
+
+
+if __name__ == "__main__":
+    main()
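One detail worth noting in the class above: `last_frame_idx` is stored on the class rather than the instance, so consecutive spinners resume the bounce where the previous one stopped instead of restarting the animation. A tiny illustration of that class-level sharing, stripped of all the terminal handling:

class Counter:
    last = 0  # class attribute shared by every instance, like Spinner.last_frame_idx

    def __init__(self):
        self.idx = Counter.last  # a new instance resumes where the previous one stopped

    def step(self, n_frames=18):
        self.idx = (self.idx + 1) % n_frames
        Counter.last = self.idx
        return self.idx


a = Counter()
print([a.step() for _ in range(3)])  # [1, 2, 3]
b = Counter()
print(b.step())                      # 4, because the position is shared via the class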
172 aider/waiting.py

@@ -13,159 +13,10 @@ Use it like:
     spinner.stop()
 """

-import sys
 import threading
 import time

-from rich.console import Console
+from aider.utils import Spinner


-class Spinner:
-    # [the body removed here is the same Spinner implementation shown above under aider/utils.py;
-    #  on main it lives in aider/waiting.py instead]


 class WaitingSpinner:

@@ -179,8 +30,8 @@ class WaitingSpinner:
     def _spin(self):
         while not self._stop_event.is_set():
-            self.spinner.step()
             time.sleep(self.delay)
+            self.spinner.step()
         self.spinner.end()

     def start(self):

@@ -192,7 +43,7 @@ class WaitingSpinner:
         """Request the spinner to stop and wait briefly for the thread to exit."""
         self._stop_event.set()
         if self._thread.is_alive():
-            self._thread.join(timeout=self.delay)
+            self._thread.join(timeout=0.1)
         self.spinner.end()

     # Allow use as a context-manager

@@ -202,20 +53,3 @@ class WaitingSpinner:
     def __exit__(self, exc_type, exc_val, exc_tb):
         self.stop()
-
-
-def main():
-    spinner = Spinner("Running spinner...")
-    try:
-        for _ in range(100):
-            time.sleep(0.15)
-            spinner.step()
-        print("Success!")
-    except KeyboardInterrupt:
-        print("\nInterrupted by user.")
-    finally:
-        spinner.end()
-
-
-if __name__ == "__main__":
-    main()
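For context, `WaitingSpinner` wraps the `Spinner` above in a daemon thread so the animation keeps moving while the main thread waits on a slow call; the hunks above only adjust when `step()` runs relative to the sleep and how long `stop()` waits for the thread. A stripped-down sketch of the same pattern with a stand-in spinner (the real class lives in aider/waiting.py):

import sys
import threading
import time


class TextSpinner:
    """Stand-in for aider's Spinner: just prints dots."""

    def step(self):
        sys.stdout.write(".")
        sys.stdout.flush()

    def end(self):
        sys.stdout.write("\n")


class WaitingSpinnerSketch:
    def __init__(self, delay=0.15):
        self.delay = delay
        self.spinner = TextSpinner()
        self._stop_event = threading.Event()
        self._thread = threading.Thread(target=self._spin, daemon=True)

    def _spin(self):
        # Keep stepping until stop() sets the event, then clean up the line.
        while not self._stop_event.is_set():
            self.spinner.step()
            time.sleep(self.delay)
        self.spinner.end()

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._stop_event.set()
        self._thread.join(timeout=1)


with WaitingSpinnerSketch():
    time.sleep(1)  # pretend to wait on a slow LLM call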
@@ -24,34 +24,7 @@ cog.out(text)
 ]]]-->


-### Aider v0.84.0
+### main branch

-# [the lines removed here are the same v0.84.0, v0.83.2 and v0.83.1 release notes that appear
-#  in the HISTORY.md diff at the top of this compare]
-
-### Aider v0.83.0

 - Added support for `gemini-2.5-pro-preview-05-06` models.
 - Added support for `qwen3-235b` models.

@@ -455,7 +428,7 @@ versions.
 - [Aider works with LLM web chat UIs](https://aider.chat/docs/usage/copypaste.html).
 - New `--copy-paste` mode.
 - New `/copy-context` command.
-- [Set API keys and other environment variables for all providers from command line or YAML conf file](https://aider.chat/docs/config/aider_conf.html#storing-llm-keys).
+- [Set API keys and other environment variables for all providers from command line or yaml conf file](https://aider.chat/docs/config/aider_conf.html#storing-llm-keys).
 - New `--api-key provider=key` setting.
 - New `--set-env VAR=value` setting.
 - Added bash and zsh support to `--watch-files`.

@@ -623,7 +596,7 @@ versions.

 ### Aider v0.59.1

-- Check for obsolete `yes: true` in YAML config, show helpful error.
+- Check for obsolete `yes: true` in yaml config, show helpful error.
 - Model settings for openrouter/anthropic/claude-3.5-sonnet:beta

 ### Aider v0.59.0

@@ -633,7 +606,7 @@ versions.
   - Still auto-completes the full paths of the repo files like `/add`.
   - Now supports globs like `src/**/*.py`
 - Renamed `--yes` to `--yes-always`.
-  - Now uses `AIDER_YES_ALWAYS` env var and `yes-always:` YAML key.
+  - Now uses `AIDER_YES_ALWAYS` env var and `yes-always:` yaml key.
   - Existing YAML and .env files will need to be updated.
   - Can still abbreviate to `--yes` on the command line.
 - Config file now uses standard YAML list syntax with ` - list entries`, one per line.

@@ -840,7 +813,7 @@ versions.
   - Use `--map-refresh <always|files|manual|auto>` to configure.
 - Improved cost estimate logic for caching.
 - Improved editing performance on Jupyter Notebook `.ipynb` files.
-- Show which config YAML file is loaded with `--verbose`.
+- Show which config yaml file is loaded with `--verbose`.
 - Bumped dependency versions.
 - Bugfix: properly load `.aider.models.metadata.json` data.
 - Bugfix: Using `--msg /ask ...` caused an exception.
@@ -4500,228 +4500,3 @@
     Paul Gauthier (aider): 1567
   start_tag: v0.81.0
   total_lines: 1706
-- aider_percentage: 54.32
-  aider_total: 1409
-  end_date: '2025-05-09'
-  end_tag: v0.83.0
-  file_counts:
-    .github/workflows/check_pypi_version.yml:
-      Paul Gauthier (aider): 1
-    .github/workflows/pre-commit.yml:
-      MDW: 48
-    .github/workflows/ubuntu-tests.yml:
-      Paul Gauthier (aider): 1
-    .github/workflows/windows-tests.yml:
-      Paul Gauthier (aider): 1
-    .github/workflows/windows_check_pypi_version.yml:
-      Paul Gauthier (aider): 1
-    aider/__init__.py:
-      Paul Gauthier: 1
-    aider/args.py:
-      Andrew Grigorev: 21
-      Andrew Grigorev (aider): 5
-      Paul Gauthier (aider): 38
-    aider/coders/__init__.py:
-      Paul Gauthier (aider): 2
-    aider/coders/base_coder.py:
-      Andrew Grigorev (aider): 2
-      Paul Gauthier: 60
-      Paul Gauthier (aider): 104
-    aider/coders/editblock_coder.py:
-      Paul Gauthier: 10
-      Paul Gauthier (aider): 7
-      zjy1412: 2
-    aider/coders/editblock_fenced_coder.py:
-      MDW: 1
-    aider/coders/help_coder.py:
-      MDW: 1
-    aider/coders/patch_coder.py:
-      Paul Gauthier (aider): 38
-    aider/coders/shell.py:
-      Paul Gauthier: 37
-    aider/coders/udiff_coder.py:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 9
-    aider/coders/udiff_simple.py:
-      Paul Gauthier (aider): 14
-    aider/commands.py:
-      Andrew Grigorev: 10
-      Paul Gauthier: 7
-      Paul Gauthier (aider): 1
-    aider/gui.py:
-      Jon Keys: 2
-    aider/io.py:
-      Kay Gosho: 1
-      Paul Gauthier (aider): 5
-    aider/linter.py:
-      Paul Gauthier: 1
-      Titusz Pan: 1
-    aider/main.py:
-      Paul Gauthier (aider): 9
-    aider/mdstream.py:
-      Paul Gauthier (aider): 11
-    aider/models.py:
-      Paul Gauthier: 4
-      Paul Gauthier (aider): 66
-      Stefan Hladnik: 4
-      Stefan Hladnik (aider): 41
-    aider/queries/tree-sitter-language-pack/ocaml_interface-tags.scm:
-      Andrey Popp: 98
-    aider/queries/tree-sitter-languages/ocaml_interface-tags.scm:
-      Andrey Popp: 98
-    aider/repo.py:
-      Andrew Grigorev: 115
-      Andrew Grigorev (aider): 21
-      Paul Gauthier: 6
-      Paul Gauthier (aider): 33
-    aider/repomap.py:
-      Paul Gauthier: 5
-      Paul Gauthier (aider): 6
-    aider/resources/model-settings.yml:
-      Paul Gauthier: 183
-      Paul Gauthier (aider): 175
-      cantalupo555: 1
-    aider/scrape.py:
-      Jon Keys: 12
-    aider/utils.py:
-      Paul Gauthier: 13
-      Paul Gauthier (aider): 131
-      Titusz Pan: 1
-    aider/waiting.py:
-      Paul Gauthier: 1
-      Paul Gauthier (aider): 54
-    aider/watch.py:
-      Paul Gauthier: 6
-      Paul Gauthier (aider): 7
-    aider/website/_includes/leaderboard_table.js:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 18
-    aider/website/docs/leaderboards/index.md:
-      Paul Gauthier: 1
-      Paul Gauthier (aider): 2
-    aider/website/index.html:
-      Paul Gauthier: 13
-    benchmark/benchmark.py:
-      Paul Gauthier: 3
-      Paul Gauthier (aider): 42
-    benchmark/docker.sh:
-      Paul Gauthier: 2
-    benchmark/refactor_tools.py:
-      MDW: 1
-    scripts/30k-image.py:
-      MDW: 1
-    scripts/clean_metadata.py:
-      Paul Gauthier (aider): 258
-    scripts/update-history.py:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 7
-    tests/basic/test_coder.py:
-      Paul Gauthier (aider): 3
-    tests/basic/test_commands.py:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 90
-    tests/basic/test_editblock.py:
-      Paul Gauthier: 10
-      zjy1412: 52
-    tests/basic/test_io.py:
-      Paul Gauthier (aider): 132
-    tests/basic/test_linter.py:
-      Paul Gauthier: 22
-      Titusz Pan: 10
-    tests/basic/test_repo.py:
-      Andrew Grigorev: 75
-      Andrew Grigorev (aider): 65
-      Paul Gauthier: 79
-      Paul Gauthier (aider): 6
-    tests/basic/test_repomap.py:
-      Andrey Popp: 7
-    tests/basic/test_watch.py:
-      MDW: 1
-    tests/fixtures/languages/ocaml_interface/test.mli:
-      Andrey Popp: 14
-    tests/scrape/test_playwright_disable.py:
-      Andrew Grigorev: 111
-      Paul Gauthier: 25
-      Paul Gauthier (aider): 3
-  grand_total:
-    Andrew Grigorev: 332
-    Andrew Grigorev (aider): 93
-    Andrey Popp: 217
-    Jon Keys: 14
-    Kay Gosho: 1
-    MDW: 53
-    Paul Gauthier: 497
-    Paul Gauthier (aider): 1275
-    Stefan Hladnik: 4
-    Stefan Hladnik (aider): 41
-    Titusz Pan: 12
-    cantalupo555: 1
-    zjy1412: 54
-  start_tag: v0.82.0
-  total_lines: 2594
-- aider_percentage: 78.92
-  aider_total: 655
-  end_date: '2025-05-30'
-  end_tag: v0.84.0
-  file_counts:
-    aider/__init__.py:
-      Paul Gauthier: 1
-    aider/args.py:
-      Paul Gauthier (aider): 27
-      saviour: 2
-    aider/args_formatter.py:
-      Paul Gauthier: 1
-    aider/coders/base_coder.py:
-      Paul Gauthier: 4
-      Paul Gauthier (aider): 10
-    aider/commands.py:
-      Paul Gauthier (aider): 23
-      wangboxue: 1
-    aider/models.py:
-      Lih Chen: 15
-      Paul Gauthier: 16
-      Paul Gauthier (aider): 12
-    aider/onboarding.py:
-      Paul Gauthier: 2
-    aider/openrouter.py:
-      Paul Gauthier (aider): 120
-    aider/repo.py:
-      Paul Gauthier: 1
-      Paul Gauthier (aider): 10
-    aider/repomap.py:
-      Paul Gauthier (aider): 1
-    aider/resources/model-settings.yml:
-      Paul Gauthier: 71
-      Paul Gauthier (aider): 193
-      Trung Dinh: 11
-    aider/utils.py:
-      Paul Gauthier (aider): 1
-    aider/waiting.py:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 6
-    aider/website/docs/leaderboards/index.md:
-      Paul Gauthier: 1
-    aider/website/index.html:
-      Paul Gauthier: 43
-    scripts/update-history.py:
-      Paul Gauthier: 2
-    tests/basic/test_coder.py:
-      Paul Gauthier: 2
-      Paul Gauthier (aider): 144
-    tests/basic/test_main.py:
-      Paul Gauthier (aider): 28
-    tests/basic/test_models.py:
-      Paul Gauthier (aider): 2
-    tests/basic/test_onboarding.py:
-      Paul Gauthier (aider): 5
-    tests/basic/test_openrouter.py:
-      Paul Gauthier (aider): 73
-  grand_total:
-    Lih Chen: 15
-    Paul Gauthier: 146
-    Paul Gauthier (aider): 655
-    Trung Dinh: 11
-    saviour: 2
-    wangboxue: 1
-  start_tag: v0.83.0
-  total_lines: 830
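The per-release blame records above are self-consistent: `aider_percentage` is simply `aider_total` over `total_lines`. For instance, checking the v0.84.0 entry:

aider_total, total_lines = 655, 830
print(round(100 * aider_total / total_lines, 2))  # 78.92, matching aider_percentage above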
@@ -1279,202 +1279,30 @@
   seconds_per_case: 372.2
   total_cost: 0.7603

-- dirname: 2025-05-09-17-02-02--qwen3-235b-a22b.unthink_16k_diff
+- dirname: 2025-05-08-03-22-37--qwen3-235b-defaults
   test_cases: 225
-  model: Qwen3 235B A22B diff, no think, Alibaba API
+  model: Qwen3 235B A22B
   edit_format: diff
-  commit_hash: 91d7fbd-dirty
-  pass_rate_1: 28.9
-  pass_rate_2: 59.6
-  pass_num_1: 65
-  pass_num_2: 134
-  percent_cases_well_formed: 92.9
-  error_outputs: 22
-  num_malformed_responses: 22
-  num_with_malformed_responses: 16
-  user_asks: 111
+  commit_hash: aaacee5-dirty
+  pass_rate_1: 17.3
+  pass_rate_2: 49.8
+  pass_num_1: 39
+  pass_num_2: 112
+  percent_cases_well_formed: 91.6
+  error_outputs: 58
+  num_malformed_responses: 29
+  num_with_malformed_responses: 19
+  user_asks: 102
   lazy_comments: 0
   syntax_errors: 0
   indentation_errors: 0
   exhausted_context_windows: 0
-  prompt_tokens: 2816192
-  completion_tokens: 342062
+  prompt_tokens: 0
+  completion_tokens: 0
   test_timeouts: 1
   total_tests: 225
-  command: aider --model openai/qwen3-235b-a22b
-  date: 2025-05-09
+  command: aider --model openrouter/qwen/qwen3-235b-a22b
+  date: 2025-05-08
   versions: 0.82.4.dev
-  seconds_per_case: 45.4
-  total_cost: 0.0000
+  seconds_per_case: 428.1
+  total_cost: 1.8037

-- dirname: 2025-05-24-21-17-54--sonnet4-diff-exuser
-  test_cases: 225
-  model: claude-sonnet-4-20250514 (no thinking)
-  edit_format: diff
-  commit_hash: ef3f8bb-dirty
-  pass_rate_1: 20.4
-  pass_rate_2: 56.4
-  pass_num_1: 46
-  pass_num_2: 127
-  percent_cases_well_formed: 98.2
-  error_outputs: 6
-  num_malformed_responses: 4
-  num_with_malformed_responses: 4
-  user_asks: 129
-  lazy_comments: 0
-  syntax_errors: 0
-  indentation_errors: 0
-  exhausted_context_windows: 1
-  prompt_tokens: 3460663
-  completion_tokens: 433373
-  test_timeouts: 7
-  total_tests: 225
-  command: aider --model claude-sonnet-4-20250514
-  date: 2025-05-24
-  versions: 0.83.3.dev
-  seconds_per_case: 29.8
-  total_cost: 15.8155
-
-- dirname: 2025-05-24-22-10-36--sonnet4-diff-exuser-think32k
-  test_cases: 225
-  model: claude-sonnet-4-20250514 (32k thinking)
-  edit_format: diff
-  commit_hash: e3cb907
-  thinking_tokens: 32000
-  pass_rate_1: 25.8
-  pass_rate_2: 61.3
-  pass_num_1: 58
-  pass_num_2: 138
-  percent_cases_well_formed: 97.3
-  error_outputs: 10
-  num_malformed_responses: 10
-  num_with_malformed_responses: 6
-  user_asks: 111
-  lazy_comments: 0
-  syntax_errors: 0
-  indentation_errors: 0
-  exhausted_context_windows: 0
-  prompt_tokens: 2863068
-  completion_tokens: 1271074
-  test_timeouts: 6
-  total_tests: 225
-  command: aider --model claude-sonnet-4-20250514
-  date: 2025-05-24
-  versions: 0.83.3.dev
-  seconds_per_case: 79.9
-  total_cost: 26.5755
-
-- dirname: 2025-05-25-19-57-20--opus4-diff-exuser
-  test_cases: 225
-  model: claude-opus-4-20250514 (no think)
-  edit_format: diff
-  commit_hash: 9ef3211
-  pass_rate_1: 32.9
-  pass_rate_2: 70.7
-  pass_num_1: 74
-  pass_num_2: 159
-  percent_cases_well_formed: 98.7
-  error_outputs: 3
-  num_malformed_responses: 3
-  num_with_malformed_responses: 3
-  user_asks: 105
-  lazy_comments: 0
-  syntax_errors: 0
-  indentation_errors: 0
-  exhausted_context_windows: 0
-  prompt_tokens: 2671437
-  completion_tokens: 380717
-  test_timeouts: 3
-  total_tests: 225
-  command: aider --model claude-opus-4-20250514
-  date: 2025-05-25
-  versions: 0.83.3.dev
-  seconds_per_case: 42.5
-  total_cost: 68.6253
-
-- dirname: 2025-05-25-20-40-51--opus4-diff-exuser
-  test_cases: 225
-  model: claude-opus-4-20250514 (32k thinking)
-  edit_format: diff
-  commit_hash: 9ef3211
-  thinking_tokens: 32000
-  pass_rate_1: 37.3
-  pass_rate_2: 72.0
-  pass_num_1: 84
-  pass_num_2: 162
-  percent_cases_well_formed: 97.3
-  error_outputs: 10
-  num_malformed_responses: 6
-  num_with_malformed_responses: 6
-  user_asks: 97
-  lazy_comments: 0
-  syntax_errors: 0
-  indentation_errors: 0
-  exhausted_context_windows: 0
-  prompt_tokens: 2567514
-  completion_tokens: 363142
-  test_timeouts: 4
-  total_tests: 225
-  command: aider --model claude-opus-4-20250514
-  date: 2025-05-25
-  versions: 0.83.3.dev
-  seconds_per_case: 44.1
-  total_cost: 65.7484
-
-- dirname: 2025-05-26-15-56-31--flash25-05-20-24k-think # dirname is misleading
-  test_cases: 225
-  model: gemini-2.5-flash-preview-05-20 (no think)
-  edit_format: diff
-  commit_hash: 214b811-dirty
-  thinking_tokens: 0 # <-- no thinking
-  pass_rate_1: 20.9
-  pass_rate_2: 44.0
-  pass_num_1: 47
-  pass_num_2: 99
-  percent_cases_well_formed: 93.8
-  error_outputs: 16
-  num_malformed_responses: 16
-  num_with_malformed_responses: 14
-  user_asks: 79
-  lazy_comments: 0
-  syntax_errors: 0
|
|
||||||
indentation_errors: 0
|
|
||||||
exhausted_context_windows: 0
|
|
||||||
prompt_tokens: 5512458
|
|
||||||
completion_tokens: 514145
|
|
||||||
test_timeouts: 4
|
|
||||||
total_tests: 225
|
|
||||||
command: aider --model gemini/gemini-2.5-flash-preview-05-20
|
|
||||||
date: 2025-05-26
|
|
||||||
versions: 0.83.3.dev
|
|
||||||
seconds_per_case: 12.2
|
|
||||||
total_cost: 1.1354
|
|
||||||
|
|
||||||
- dirname: 2025-05-25-22-58-44--flash25-05-20-24k-think
|
|
||||||
test_cases: 225
|
|
||||||
model: gemini-2.5-flash-preview-05-20 (24k think)
|
|
||||||
edit_format: diff
|
|
||||||
commit_hash: a8568c3-dirty
|
|
||||||
thinking_tokens: 24576
|
|
||||||
pass_rate_1: 26.2
|
|
||||||
pass_rate_2: 55.1
|
|
||||||
pass_num_1: 59
|
|
||||||
pass_num_2: 124
|
|
||||||
percent_cases_well_formed: 95.6
|
|
||||||
error_outputs: 15
|
|
||||||
num_malformed_responses: 15
|
|
||||||
num_with_malformed_responses: 10
|
|
||||||
user_asks: 101
|
|
||||||
lazy_comments: 0
|
|
||||||
syntax_errors: 0
|
|
||||||
indentation_errors: 0
|
|
||||||
exhausted_context_windows: 0
|
|
||||||
prompt_tokens: 3666792
|
|
||||||
completion_tokens: 2703162
|
|
||||||
test_timeouts: 4
|
|
||||||
total_tests: 225
|
|
||||||
command: aider --model gemini/gemini-2.5-flash-preview-05-20
|
|
||||||
date: 2025-05-25
|
|
||||||
versions: 0.83.3.dev
|
|
||||||
seconds_per_case: 53.9
|
|
||||||
total_cost: 8.5625
|
|
|
@ -213,60 +213,3 @@
|
||||||
versions: 0.82.4.dev
|
versions: 0.82.4.dev
|
||||||
seconds_per_case: 635.2
|
seconds_per_case: 635.2
|
||||||
total_cost: 0.0000
|
total_cost: 0.0000
|
||||||
|
|
||||||
|
|
||||||
- dirname: 2025-05-09-17-02-02--qwen3-235b-a22b.unthink_16k_diff
|
|
||||||
test_cases: 225
|
|
||||||
model: Qwen3 235B A22B diff, no think, via official Alibaba API
|
|
||||||
edit_format: diff
|
|
||||||
commit_hash: 91d7fbd-dirty
|
|
||||||
pass_rate_1: 28.9
|
|
||||||
pass_rate_2: 59.6
|
|
||||||
pass_num_1: 65
|
|
||||||
pass_num_2: 134
|
|
||||||
percent_cases_well_formed: 92.9
|
|
||||||
error_outputs: 22
|
|
||||||
num_malformed_responses: 22
|
|
||||||
num_with_malformed_responses: 16
|
|
||||||
user_asks: 111
|
|
||||||
lazy_comments: 0
|
|
||||||
syntax_errors: 0
|
|
||||||
indentation_errors: 0
|
|
||||||
exhausted_context_windows: 0
|
|
||||||
prompt_tokens: 2816192
|
|
||||||
completion_tokens: 342062
|
|
||||||
test_timeouts: 1
|
|
||||||
total_tests: 225
|
|
||||||
command: aider --model openai/qwen3-235b-a22b
|
|
||||||
date: 2025-05-09
|
|
||||||
versions: 0.82.4.dev
|
|
||||||
seconds_per_case: 45.4
|
|
||||||
total_cost: 0.0000
|
|
||||||
|
|
||||||
- dirname: 2025-05-09-23-01-22--qwen3-235b-a22b.unthink_16k_whole
|
|
||||||
test_cases: 225
|
|
||||||
model: Qwen3 235B A22B whole, no think, via official Alibaba API
|
|
||||||
edit_format: whole
|
|
||||||
commit_hash: 425fb6d
|
|
||||||
pass_rate_1: 26.7
|
|
||||||
pass_rate_2: 61.8
|
|
||||||
pass_num_1: 60
|
|
||||||
pass_num_2: 139
|
|
||||||
percent_cases_well_formed: 100.0
|
|
||||||
error_outputs: 0
|
|
||||||
num_malformed_responses: 0
|
|
||||||
num_with_malformed_responses: 0
|
|
||||||
user_asks: 175
|
|
||||||
lazy_comments: 0
|
|
||||||
syntax_errors: 0
|
|
||||||
indentation_errors: 0
|
|
||||||
exhausted_context_windows: 0
|
|
||||||
prompt_tokens: 2768173
|
|
||||||
completion_tokens: 384000
|
|
||||||
test_timeouts: 1
|
|
||||||
total_tests: 225
|
|
||||||
command: aider --model openai/qwen3-235b-a22b
|
|
||||||
date: 2025-05-09
|
|
||||||
versions: 0.82.4.dev
|
|
||||||
seconds_per_case: 50.8
|
|
||||||
total_cost: 0.0000
|
|
|
@@ -15,12 +15,12 @@ nav_exclude: true
I recently wanted to draw a graph showing how LLM code editing skill has been
changing over time as new models have been released by OpenAI, Anthropic and others.
I have all the
[data in a YAML file](https://github.com/Aider-AI/aider/blob/main/website/_data/edit_leaderboard.yml) that is used to render
[data in a yaml file](https://github.com/Aider-AI/aider/blob/main/website/_data/edit_leaderboard.yml) that is used to render
[aider's LLM leaderboards](https://aider.chat/docs/leaderboards/).

Below is the aider chat transcript, which shows:

- I launch aider with the YAML file, a file with other plots I've done recently (so GPT can crib the style) and an empty file called `over_time.py`.
- I launch aider with the yaml file, a file with other plots I've done recently (so GPT can crib the style) and an empty file called `over_time.py`.
- Then I ask GPT to draw the scatterplot I want.
- I run the resulting script and share the error output with GPT so it can fix a small bug.
- I ask it to color the points for GPT-4 and GPT-3.5 family models differently, to better see trends within those model families.

@@ -28,7 +28,7 @@ Below is the aider chat transcript, which shows:
- I work through a series of other small style changes, like changing fonts and the graph border.

In the end I have the graph, but I also have the python code in my repo.
So I can update this graph easily whenever I add new entries to the YAML data file.
So I can update this graph easily whenever I add new entries to the yaml data file.

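A rough sketch of the launch step from the first bullet above; the leaderboard YAML path and `over_time.py` are named in the post, while the style-reference script name here is only assumed:

```bash
# Start aider with the leaderboard data, an existing plot script to crib style from,
# and the empty file the new plotting code should go into.
aider website/_data/edit_leaderboard.yml benchmark/plots.py over_time.py
```
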
## Aider chat transcript

@@ -277,31 +277,6 @@ const LEADERBOARD_CUSTOM_TITLE = "Qwen3 results on the aider polyglot benchmark"
</script>


## No think, via official Alibaba API

These results were obtained running against `https://dashscope.aliyuncs.com/compatible-mode/v1`
with no thinking.

```bash
export OPENAI_API_BASE=https://dashscope.aliyuncs.com/compatible-mode/v1
export OPENAI_API_KEY=<key>
```

```yaml
- name: openai/qwen3-235b-a22b
  use_temperature: 0.7
  streaming: false
  extra_params:
    stream: false
    max_tokens: 16384
    top_p: 0.8
    top_k: 20
    temperature: 0.7
    enable_thinking: false
    extra_body:
      enable_thinking: false
```

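With the endpoint variables and model settings above in place, the benchmark entries record their launch command in the `command` field, which was simply:

```bash
# Point aider at the OpenAI-compatible Alibaba endpoint configured above
aider --model openai/qwen3-235b-a22b
```
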

## OpenRouter only TogetherAI, recommended /no_think settings

These results were obtained with the
File diff suppressed because it is too large
@@ -4,7 +4,7 @@
# Place in your home dir, or at the root of your git repo.
##########################################################

# Note: You can only put OpenAI and Anthropic API keys in the YAML
# Note: You can only put OpenAI and Anthropic API keys in the yaml
# config file. Keys for all APIs can be stored in a .env file
# https://aider.chat/docs/config/dotenv.html

@@ -386,9 +386,6 @@
## Specify the language to use in the chat (default: None, uses system settings)
#chat-language: xxx

## Specify the language to use in the commit message (default: None, user language)
#commit-language: xxx

## Always say yes to every confirmation
#yes-always: false

@@ -357,9 +357,6 @@
## Specify the language to use in the chat (default: None, uses system settings)
#AIDER_CHAT_LANGUAGE=

## Specify the language to use in the commit message (default: None, user language)
#AIDER_COMMIT_LANGUAGE=

## Always say yes to every confirmation
#AIDER_YES_ALWAYS=

@@ -81,7 +81,7 @@ You can override or add settings for any model by creating a `.aider.model.setti
If the files above exist, they will be loaded in that order.
Files loaded last will take priority.

The YAML file should be a list of dictionary objects for each model.
The yaml file should be a list of dictionary objects for each model.

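As a minimal sketch, a settings file with a single entry might look like the following; the model name and values are only an illustration, using keys that appear in the generated listing further down:

```bash
# Write a .aider.model.settings.yml with one model entry (illustrative values only)
cat > .aider.model.settings.yml <<'EOF'
- name: openrouter/deepseek/deepseek-chat
  edit_format: diff
  use_repo_map: true
EOF
```
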

### Passing extra params to litellm.completion

@ -158,34 +158,6 @@ cog.out("```\n")
|
||||||
system_prompt_prefix: null
|
system_prompt_prefix: null
|
||||||
accepts_settings: null
|
accepts_settings: null
|
||||||
|
|
||||||
- name: anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: anthropic/claude-3-5-haiku-20241022
|
- name: anthropic/claude-3-5-haiku-20241022
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: anthropic/claude-3-5-haiku-20241022
|
weak_model_name: anthropic/claude-3-5-haiku-20241022
|
||||||
|
@ -274,34 +246,6 @@ cog.out("```\n")
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25
|
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25
|
||||||
cache_control: true
|
cache_control: true
|
||||||
|
|
||||||
- name: anthropic/claude-opus-4-20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: anthropic/claude-3-5-haiku-20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: anthropic/claude-sonnet-4-20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: anthropic/claude-sonnet-4-20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: anthropic/claude-3-5-haiku-20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: anthropic/claude-sonnet-4-20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: azure/gpt-4.1
|
- name: azure/gpt-4.1
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: azure/gpt-4.1-mini
|
weak_model_name: azure/gpt-4.1-mini
|
||||||
|
@ -463,20 +407,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: bedrock/anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock/anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock/anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
- name: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
weak_model_name: bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
|
@ -493,20 +423,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/anthropic.claude-3-7-sonnet-20250219-v1:0
|
- name: bedrock_converse/anthropic.claude-3-7-sonnet-20250219-v1:0
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
|
weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
|
@ -523,62 +439,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: bedrock_converse/anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/eu.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
- name: bedrock_converse/us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
|
@ -595,34 +455,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: bedrock_converse/us.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: bedrock_converse/us.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: bedrock_converse/us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: claude-3-5-haiku-20241022
|
- name: claude-3-5-haiku-20241022
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: claude-3-5-haiku-20241022
|
weak_model_name: claude-3-5-haiku-20241022
|
||||||
|
@ -706,34 +538,6 @@ cog.out("```\n")
|
||||||
- name: claude-3-sonnet-20240229
|
- name: claude-3-sonnet-20240229
|
||||||
weak_model_name: claude-3-5-haiku-20241022
|
weak_model_name: claude-3-5-haiku-20241022
|
||||||
|
|
||||||
- name: claude-opus-4-20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: claude-3-5-haiku-20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: claude-sonnet-4-20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: claude-sonnet-4-20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: claude-3-5-haiku-20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: claude-sonnet-4-20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: cohere_chat/command-a-03-2025
|
- name: cohere_chat/command-a-03-2025
|
||||||
examples_as_sys_msg: true
|
examples_as_sys_msg: true
|
||||||
|
|
||||||
|
@ -796,34 +600,6 @@ cog.out("```\n")
|
||||||
editor_model_name: deepseek/deepseek-chat
|
editor_model_name: deepseek/deepseek-chat
|
||||||
editor_edit_format: editor-diff
|
editor_edit_format: editor-diff
|
||||||
|
|
||||||
- name: eu.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: fireworks_ai/accounts/fireworks/models/deepseek-r1
|
- name: fireworks_ai/accounts/fireworks/models/deepseek-r1
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: fireworks_ai/accounts/fireworks/models/deepseek-v3
|
weak_model_name: fireworks_ai/accounts/fireworks/models/deepseek-v3
|
||||||
|
@ -924,12 +700,6 @@ cog.out("```\n")
|
||||||
use_repo_map: true
|
use_repo_map: true
|
||||||
overeager: true
|
overeager: true
|
||||||
|
|
||||||
- name: gemini/gemini-2.5-pro-preview-06-05
|
|
||||||
edit_format: diff-fenced
|
|
||||||
weak_model_name: gemini/gemini-2.5-flash-preview-04-17
|
|
||||||
use_repo_map: true
|
|
||||||
overeager: true
|
|
||||||
|
|
||||||
- name: gemini/gemini-exp-1114
|
- name: gemini/gemini-exp-1114
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
use_repo_map: true
|
use_repo_map: true
|
||||||
|
@ -1375,34 +1145,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: openrouter/anthropic/claude-opus-4
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: openrouter/anthropic/claude-3-5-haiku
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: openrouter/anthropic/claude-sonnet-4
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: openrouter/anthropic/claude-sonnet-4
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: openrouter/anthropic/claude-3-5-haiku
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: openrouter/anthropic/claude-sonnet-4
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: openrouter/cohere/command-a-03-2025
|
- name: openrouter/cohere/command-a-03-2025
|
||||||
examples_as_sys_msg: true
|
examples_as_sys_msg: true
|
||||||
|
|
||||||
|
@ -1500,12 +1242,6 @@ cog.out("```\n")
|
||||||
use_repo_map: true
|
use_repo_map: true
|
||||||
overeager: true
|
overeager: true
|
||||||
|
|
||||||
- name: openrouter/google/gemini-2.5-pro-preview-06-05
|
|
||||||
edit_format: diff-fenced
|
|
||||||
weak_model_name: openrouter/google/gemini-2.0-flash-001
|
|
||||||
use_repo_map: true
|
|
||||||
overeager: true
|
|
||||||
|
|
||||||
- name: openrouter/google/gemma-3-27b-it
|
- name: openrouter/google/gemma-3-27b-it
|
||||||
use_system_prompt: false
|
use_system_prompt: false
|
||||||
|
|
||||||
|
@ -1698,34 +1434,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- reasoning_effort
|
- reasoning_effort
|
||||||
|
|
||||||
- name: us.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: us.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 32000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: us.anthropic.claude-3-5-haiku-20241022-v1:0
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
extra_headers:
|
|
||||||
anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
|
|
||||||
max_tokens: 64000
|
|
||||||
cache_control: true
|
|
||||||
editor_model_name: us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai-anthropic_models/vertex_ai/claude-3-7-sonnet@20250219
|
- name: vertex_ai-anthropic_models/vertex_ai/claude-3-7-sonnet@20250219
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
||||||
|
@ -1739,28 +1447,6 @@ cog.out("```\n")
|
||||||
accepts_settings:
|
accepts_settings:
|
||||||
- thinking_tokens
|
- thinking_tokens
|
||||||
|
|
||||||
- name: vertex_ai-anthropic_models/vertex_ai/claude-opus-4@20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
max_tokens: 32000
|
|
||||||
editor_model_name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
max_tokens: 64000
|
|
||||||
editor_model_name: vertex_ai-anthropic_models/vertex_ai/claude-sonnet-4@20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
- name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
||||||
edit_format: diff
|
edit_format: diff
|
||||||
use_repo_map: true
|
use_repo_map: true
|
||||||
|
@ -1816,35 +1502,6 @@ cog.out("```\n")
|
||||||
- name: vertex_ai/claude-3-sonnet@20240229
|
- name: vertex_ai/claude-3-sonnet@20240229
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
||||||
|
|
||||||
- name: vertex_ai/claude-opus-4@20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
max_tokens: 32000
|
|
||||||
editor_model_name: vertex_ai/claude-sonnet-4@20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai/claude-sonnet-4@20250514
|
|
||||||
edit_format: diff
|
|
||||||
weak_model_name: vertex_ai/claude-3-5-haiku@20241022
|
|
||||||
use_repo_map: true
|
|
||||||
extra_params:
|
|
||||||
max_tokens: 64000
|
|
||||||
editor_model_name: vertex_ai/claude-sonnet-4@20250514
|
|
||||||
editor_edit_format: editor-diff
|
|
||||||
accepts_settings:
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai/gemini-2.5-flash-preview-05-20
|
|
||||||
edit_format: diff
|
|
||||||
use_repo_map: true
|
|
||||||
accepts_settings:
|
|
||||||
- reasoning_effort
|
|
||||||
- thinking_tokens
|
|
||||||
|
|
||||||
- name: vertex_ai/gemini-2.5-pro-exp-03-25
|
- name: vertex_ai/gemini-2.5-pro-exp-03-25
|
||||||
edit_format: diff-fenced
|
edit_format: diff-fenced
|
||||||
weak_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
weak_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
||||||
|
@ -1866,13 +1523,6 @@ cog.out("```\n")
|
||||||
overeager: true
|
overeager: true
|
||||||
editor_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
editor_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
||||||
|
|
||||||
- name: vertex_ai/gemini-2.5-pro-preview-06-05
|
|
||||||
edit_format: diff-fenced
|
|
||||||
weak_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
|
||||||
use_repo_map: true
|
|
||||||
overeager: true
|
|
||||||
editor_model_name: vertex_ai-language-models/gemini-2.5-flash-preview-04-17
|
|
||||||
|
|
||||||
- name: vertex_ai/gemini-pro-experimental
|
- name: vertex_ai/gemini-pro-experimental
|
||||||
edit_format: diff-fenced
|
edit_format: diff-fenced
|
||||||
use_repo_map: true
|
use_repo_map: true
|
||||||
|
|
|
@@ -1,7 +1,7 @@
---
parent: Configuration
nav_order: 15
description: How to configure aider with a YAML config file.
description: How to configure aider with a yaml config file.
---

# YAML config file

@@ -58,7 +58,7 @@ cog.outl("```")
# Place in your home dir, or at the root of your git repo.
##########################################################

# Note: You can only put OpenAI and Anthropic API keys in the YAML
# Note: You can only put OpenAI and Anthropic API keys in the yaml
# config file. Keys for all APIs can be stored in a .env file
# https://aider.chat/docs/config/dotenv.html

@@ -440,9 +440,6 @@ cog.outl("```")
## Specify the language to use in the chat (default: None, uses system settings)
#chat-language: xxx

## Specify the language to use in the commit message (default: None, user language)
#commit-language: xxx

## Always say yes to every confirmation
#yes-always: false

@@ -40,9 +40,9 @@ OPENAI_API_KEY=<key>
ANTHROPIC_API_KEY=<key>
```

#### YAML config file
#### Yaml config file
You can also set those API keys via special entries in the
[YAML config file](/docs/config/aider_conf.html), like this:
[yaml config file](/docs/config/aider_conf.html), like this:

```yaml
openai-api-key: <key>

@@ -74,7 +74,7 @@ OPENROUTER_API_KEY=bar
DEEPSEEK_API_KEY=baz
```

#### YAML config file
#### Yaml config file

You can also set API keys in the
@@ -397,9 +397,6 @@ cog.outl("```")
## Specify the language to use in the chat (default: None, uses system settings)
#AIDER_CHAT_LANGUAGE=

## Specify the language to use in the commit message (default: None, user language)
#AIDER_COMMIT_LANGUAGE=

## Always say yes to every confirmation
#AIDER_YES_ALWAYS=

@@ -12,7 +12,7 @@ Aider allows you to configure your preferred text editor for use with the `/edit

You can specify the text editor with the `--editor` switch or using
`editor:` in aider's
[YAML config file](https://aider.chat/docs/config/aider_conf.html).
[yaml config file](https://aider.chat/docs/config/aider_conf.html).

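For example, a quick sketch of both approaches; `vim` here is just a placeholder editor:

```bash
# One-off, via the command line switch
aider --editor vim

# Or persist the choice in the YAML config file
echo 'editor: vim' >> ~/.aider.conf.yml
```
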
## Environment variables

@@ -80,16 +80,16 @@ for alias, model in sorted(MODEL_ALIASES.items()):
- `4o`: gpt-4o
- `deepseek`: deepseek/deepseek-chat
- `flash`: gemini/gemini-2.5-flash-preview-04-17
- `gemini`: gemini/gemini-2.5-pro-preview-06-05
- `gemini`: gemini/gemini-2.5-pro-preview-05-06
- `gemini-2.5-pro`: gemini/gemini-2.5-pro-preview-06-05
- `gemini-2.5-pro`: gemini/gemini-2.5-pro-preview-05-06
- `gemini-exp`: gemini/gemini-2.5-pro-exp-03-25
- `grok3`: xai/grok-3-beta
- `haiku`: claude-3-5-haiku-20241022
- `optimus`: openrouter/openrouter/optimus-alpha
- `opus`: claude-opus-4-20250514
- `opus`: claude-3-opus-20240229
- `quasar`: openrouter/openrouter/quasar-alpha
- `r1`: deepseek/deepseek-reasoner
- `sonnet`: anthropic/claude-sonnet-4-20250514
- `sonnet`: anthropic/claude-3-7-sonnet-20250219
<!--[[[end]]]-->
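An alias can be used anywhere a full model name is accepted; a small sketch:

```bash
# Each alias expands to the corresponding full model name in the list above
aider --model sonnet
aider --model r1
```
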

## Priority

@@ -74,9 +74,9 @@ usage: aider [-h] [--model] [--openai-api-key] [--anthropic-api-key]
[--apply-clipboard-edits] [--exit] [--show-repo-map]
[--show-prompts] [--voice-format] [--voice-language]
[--voice-input-device] [--disable-playwright] [--file]
[--read] [--vim] [--chat-language] [--commit-language]
[--read] [--vim] [--chat-language] [--yes-always] [-v]
[--yes-always] [-v] [--load] [--encoding]
[--load] [--encoding] [--line-endings] [-c]
[--line-endings] [-c] [--env-file]
[--env-file]
[--suggest-shell-commands | --no-suggest-shell-commands]
[--fancy-input | --no-fancy-input]
[--multiline | --no-multiline]

@@ -683,10 +683,6 @@ Environment variable: `AIDER_VIM`
Specify the language to use in the chat (default: None, uses system settings)
Environment variable: `AIDER_CHAT_LANGUAGE`

### `--commit-language COMMIT_LANGUAGE`
Specify the language to use in the commit message (default: None, user language)
Environment variable: `AIDER_COMMIT_LANGUAGE`

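A sketch combining the two related settings; the language value here is only an example and assumes a plain language name is accepted:

```bash
# Ask for chat replies and commit messages in French
aider --chat-language French --commit-language French

# Or via the environment variables documented above
export AIDER_CHAT_LANGUAGE=French
export AIDER_COMMIT_LANGUAGE=French
```
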
### `--yes-always`
Always say yes to every confirmation
Environment variable: `AIDER_YES_ALWAYS`

@@ -264,12 +264,9 @@ tr:hover { background-color: #f5f5f5; }
</style>
<table>
<tr><th>Model Name</th><th class='right'>Total Tokens</th><th class='right'>Percent</th></tr>
<tr><td>o3</td><td class='right'>542,669</td><td class='right'>41.1%</td></tr>
<tr><td>gemini/gemini-2.5-pro-exp-03-25</td><td class='right'>890,057</td><td class='right'>69.9%</td></tr>
<tr><td>gemini/gemini-2.5-pro-exp-03-25</td><td class='right'>479,518</td><td class='right'>36.3%</td></tr>
<tr><td>o3</td><td class='right'>373,753</td><td class='right'>29.4%</td></tr>
<tr><td>anthropic/claude-sonnet-4-20250514</td><td class='right'>249,562</td><td class='right'>18.9%</td></tr>
<tr><td>openrouter/REDACTED</td><td class='right'>8,745</td><td class='right'>0.7%</td></tr>
<tr><td>gemini/gemini-2.5-pro-preview-05-06</td><td class='right'>40,256</td><td class='right'>3.0%</td></tr>
<tr><td>gemini/gemini-2.5-flash-preview-05-20</td><td class='right'>7,638</td><td class='right'>0.6%</td></tr>
<tr><td>gemini/REDACTED</td><td class='right'>643</td><td class='right'>0.0%</td></tr>
</table>

{: .note :}

@@ -28,6 +28,12 @@ These one-liners will install aider, along with python 3.12 if needed.
They are based on the
[uv installers](https://docs.astral.sh/uv/getting-started/installation/).

#### Windows

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://aider.chat/install.ps1 | iex"
```

#### Mac & Linux

Use curl to download the script and execute it with sh:

@@ -42,12 +48,6 @@ If your system doesn't have curl, you can use wget:
wget -qO- https://aider.chat/install.sh | sh
```

#### Windows

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://aider.chat/install.ps1 | iex"
```


## Install with uv

@@ -55,7 +55,7 @@ You can install aider with uv:

```bash
python -m pip install uv # If you need to install uv
uv tool install --force --python python3.12 --with pip aider-chat@latest
uv tool install --force --python python3.12 aider-chat@latest
```

This will install uv using your existing python version 3.8-3.13,

@@ -285,6 +285,6 @@ mod_dates = [get_last_modified_date(file) for file in files]
latest_mod_date = max(mod_dates)
cog.out(f"{latest_mod_date.strftime('%B %d, %Y.')}")
]]]-->
May 26, 2025.
May 08, 2025.
<!--[[[end]]]-->
</p>

@ -1,105 +0,0 @@
|
||||||
---
|
|
||||||
parent: Connecting to LLMs
|
|
||||||
nav_order: 510
|
|
||||||
---
|
|
||||||
|
|
||||||
# GitHub Copilot
|
|
||||||
|
|
||||||
Aider can connect to GitHub Copilot’s LLMs because Copilot exposes a standard **OpenAI-style**
|
|
||||||
endpoint at:
|
|
||||||
|
|
||||||
```
|
|
||||||
https://api.githubcopilot.com
|
|
||||||
```
|
|
||||||
|
|
||||||
First, install aider:
|
|
||||||
|
|
||||||
{% include install.md %}
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Configure your environment
|
|
||||||
|
|
||||||
```bash
|
|
||||||
# macOS/Linux
|
|
||||||
export OPENAI_API_BASE=https://api.githubcopilot.com
|
|
||||||
export OPENAI_API_KEY=<oauth_token>
|
|
||||||
|
|
||||||
# Windows (PowerShell)
|
|
||||||
setx OPENAI_API_BASE https://api.githubcopilot.com
|
|
||||||
setx OPENAI_API_KEY <oauth_token>
|
|
||||||
# …restart the shell after setx commands
|
|
||||||
```
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
### Where do I get the token?
|
|
||||||
The easiest path is to sign in to Copilot from any JetBrains IDE (PyCharm, GoLand, etc).
|
|
||||||
After you authenticate a file appears:
|
|
||||||
|
|
||||||
```
|
|
||||||
~/.config/github-copilot/apps.json
|
|
||||||
```
|
|
||||||
|
|
||||||
Copy the `oauth_token` value – that string is your `OPENAI_API_KEY`.
|
|
||||||
|
|
||||||
*Note:* tokens created by the Neovim **copilot.lua** plugin (old `hosts.json`) sometimes lack the
|
|
||||||
needed scopes. If you see “access to this endpoint is forbidden”, regenerate the token with a
|
|
||||||
JetBrains IDE.
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Discover available models
|
|
||||||
|
|
||||||
Copilot hosts many models (OpenAI, Anthropic, Google, etc).
|
|
||||||
List the models your subscription allows with:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
curl -s https://api.githubcopilot.com/models \
|
|
||||||
-H "Authorization: Bearer $OPENAI_API_KEY" \
|
|
||||||
-H "Content-Type: application/json" \
|
|
||||||
-H "Copilot-Integration-Id: vscode-chat" | jq -r '.data[].id'
|
|
||||||
```
|
|
||||||
|
|
||||||
Each returned ID can be used with aider by **prefixing it with `openai/`**:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
aider --model openai/gpt-4o
|
|
||||||
# or
|
|
||||||
aider --model openai/claude-3.7-sonnet-thought
|
|
||||||
```
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Quick start
|
|
||||||
|
|
||||||
```bash
|
|
||||||
# change into your project
|
|
||||||
cd /to/your/project
|
|
||||||
|
|
||||||
# talk to Copilot
|
|
||||||
aider --model openai/gpt-4o
|
|
||||||
```
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Optional config file (`~/.aider.conf.yml`)
|
|
||||||
|
|
||||||
```yaml
|
|
||||||
openai-api-base: https://api.githubcopilot.com
|
|
||||||
openai-api-key: "<oauth_token>"
|
|
||||||
model: openai/gpt-4o
|
|
||||||
weak-model: openai/gpt-4o-mini
|
|
||||||
show-model-warnings: false
|
|
||||||
```
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## FAQ
|
|
||||||
|
|
||||||
* Calls made through aider are billed through your Copilot subscription
|
|
||||||
(aider will still print *estimated* costs).
|
|
||||||
* The Copilot docs explicitly allow third-party “agents” that hit this API – aider is playing by
|
|
||||||
the rules.
|
|
||||||
* Aider talks directly to the REST endpoint—no web-UI scraping or browser automation.
|
|
||||||
|
|
|
@@ -72,7 +72,6 @@ cog.out(''.join(lines))
- DATABRICKS_API_KEY
- DEEPINFRA_API_KEY
- DEEPSEEK_API_KEY
- FEATHERLESS_AI_API_KEY
- FIREWORKS_AI_API_KEY
- FIREWORKS_API_KEY
- FIREWORKSAI_API_KEY

@@ -82,9 +81,7 @@ cog.out(''.join(lines))
- INFINITY_API_KEY
- MARITALK_API_KEY
- MISTRAL_API_KEY
- NEBIUS_API_KEY
- NLP_CLOUD_API_KEY
- NOVITA_API_KEY
- NVIDIA_NIM_API_KEY
- OLLAMA_API_KEY
- OPENAI_API_KEY

@@ -40,7 +40,7 @@ cd /to/your/project
aider --model vertex_ai/claude-3-5-sonnet@20240620
```

Or you can use the [YAML config](/docs/config/aider_conf.html) to set the model to any of the
Or you can use the [yaml config](/docs/config/aider_conf.html) to set the model to any of the
models supported by Vertex AI.

Example `.aider.conf.yml` file:

@ -58,9 +58,6 @@ cog.out(model_list)
|
||||||
- anthropic.claude-3-5-haiku-20241022-v1:0
|
- anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
- anthropic.claude-3-5-sonnet-20241022-v2:0
|
- anthropic.claude-3-5-sonnet-20241022-v2:0
|
||||||
- anthropic.claude-3-7-sonnet-20250219-v1:0
|
- anthropic.claude-3-7-sonnet-20250219-v1:0
|
||||||
- anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
- anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
- azure_ai/mistral-medium-2505
|
|
||||||
- claude-3-5-haiku-20241022
|
- claude-3-5-haiku-20241022
|
||||||
- claude-3-5-haiku-latest
|
- claude-3-5-haiku-latest
|
||||||
- claude-3-5-sonnet-20240620
|
- claude-3-5-sonnet-20240620
|
||||||
|
@ -72,10 +69,6 @@ cog.out(model_list)
|
||||||
- claude-3-opus-20240229
|
- claude-3-opus-20240229
|
||||||
- claude-3-opus-latest
|
- claude-3-opus-latest
|
||||||
- claude-3-sonnet-20240229
|
- claude-3-sonnet-20240229
|
||||||
- claude-4-opus-20250514
|
|
||||||
- claude-4-sonnet-20250514
|
|
||||||
- claude-opus-4-20250514
|
|
||||||
- claude-sonnet-4-20250514
|
|
||||||
- codestral/codestral-2405
|
- codestral/codestral-2405
|
||||||
- codestral/codestral-latest
|
- codestral/codestral-latest
|
||||||
- databricks/databricks-claude-3-7-sonnet
|
- databricks/databricks-claude-3-7-sonnet
|
||||||
|
@ -84,20 +77,15 @@ cog.out(model_list)
|
||||||
- deepseek/deepseek-reasoner
|
- deepseek/deepseek-reasoner
|
||||||
- eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
- eu.anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
- eu.anthropic.claude-3-5-sonnet-20241022-v2:0
|
- eu.anthropic.claude-3-5-sonnet-20241022-v2:0
|
||||||
- eu.anthropic.claude-3-7-sonnet-20250219-v1:0
|
|
||||||
- eu.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
- eu.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
- mistral/codestral-2405
|
- mistral/codestral-2405
|
||||||
- mistral/codestral-latest
|
- mistral/codestral-latest
|
||||||
- mistral/codestral-mamba-latest
|
- mistral/codestral-mamba-latest
|
||||||
- mistral/devstral-small-2505
|
|
||||||
- mistral/mistral-large-2402
|
- mistral/mistral-large-2402
|
||||||
- mistral/mistral-large-2407
|
- mistral/mistral-large-2407
|
||||||
- mistral/mistral-large-2411
|
- mistral/mistral-large-2411
|
||||||
- mistral/mistral-large-latest
|
- mistral/mistral-large-latest
|
||||||
- mistral/mistral-medium
|
- mistral/mistral-medium
|
||||||
- mistral/mistral-medium-2312
|
- mistral/mistral-medium-2312
|
||||||
- mistral/mistral-medium-2505
|
|
||||||
- mistral/mistral-medium-latest
|
- mistral/mistral-medium-latest
|
||||||
- mistral/mistral-small
|
- mistral/mistral-small
|
||||||
- mistral/mistral-small-latest
|
- mistral/mistral-small-latest
|
||||||
|
@ -117,8 +105,6 @@ cog.out(model_list)
|
||||||
- us.anthropic.claude-3-5-haiku-20241022-v1:0
|
- us.anthropic.claude-3-5-haiku-20241022-v1:0
|
||||||
- us.anthropic.claude-3-5-sonnet-20241022-v2:0
|
- us.anthropic.claude-3-5-sonnet-20241022-v2:0
|
||||||
- us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
- us.anthropic.claude-3-7-sonnet-20250219-v1:0
|
||||||
- us.anthropic.claude-opus-4-20250514-v1:0
|
|
||||||
- us.anthropic.claude-sonnet-4-20250514-v1:0
|
|
||||||
- vertex_ai/claude-3-5-haiku
|
- vertex_ai/claude-3-5-haiku
|
||||||
- vertex_ai/claude-3-5-haiku@20241022
|
- vertex_ai/claude-3-5-haiku@20241022
|
||||||
- vertex_ai/claude-3-5-sonnet
|
- vertex_ai/claude-3-5-sonnet
|
||||||
|
@ -132,8 +118,6 @@ cog.out(model_list)
|
||||||
- vertex_ai/claude-3-opus@20240229
|
- vertex_ai/claude-3-opus@20240229
|
||||||
- vertex_ai/claude-3-sonnet
|
- vertex_ai/claude-3-sonnet
|
||||||
- vertex_ai/claude-3-sonnet@20240229
|
- vertex_ai/claude-3-sonnet@20240229
|
||||||
- vertex_ai/claude-opus-4@20250514
|
|
||||||
- vertex_ai/claude-sonnet-4@20250514
|
|
||||||
<!--[[[end]]]-->
|
<!--[[[end]]]-->
|
||||||
|
|
||||||
|
|
||||||
|
|
|
@@ -69,11 +69,11 @@ cog.out(text)
]]]-->
<a href="https://github.com/Aider-AI/aider" class="github-badge badge-stars" title="Total number of GitHub stars the Aider project has received">
<span class="badge-label">⭐ GitHub Stars</span>
<span class="badge-value">34K</span>
<span class="badge-value">33K</span>
</a>
<a href="https://pypi.org/project/aider-chat/" class="github-badge badge-installs" title="Total number of installations via pip from PyPI">
<span class="badge-label">📦 Installs</span>
<span class="badge-value">2.5M</span>
<span class="badge-value">2.2M</span>
</a>
<div class="github-badge badge-tokens" title="Number of tokens processed weekly by Aider users">
<span class="badge-label">📈 Tokens/week</span>

@@ -85,7 +85,7 @@ cog.out(text)
</a>
<a href="/HISTORY.html" class="github-badge badge-coded" title="Percentage of the new code in Aider's last release written by Aider itself">
<span class="badge-label">🔄 Singularity</span>
<span class="badge-value">79%</span>
<span class="badge-value">92%</span>
</a>
<!--[[[end]]]-->
</div>
@ -269,178 +269,178 @@ cog.out(text)
|
||||||
<script>
|
<script>
|
||||||
const testimonials = [
|
const testimonials = [
|
||||||
{
|
{
|
||||||
text: "My life has changed... Aider... It's going to rock your world.",
|
text: "My life has changed... There's finally an AI coding tool that's good enough to keep up with me... Aider... It's going to rock your world.",
|
||||||
author: "Eric S. Raymond on X",
|
author: "Eric S. Raymond",
|
||||||
link: "https://x.com/esrtweet/status/1910809356381413593"
|
link: "https://x.com/esrtweet/status/1910809356381413593"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "The best free open source AI coding assistant.",
|
text: "The best free open source AI coding assistant.",
|
||||||
author: "IndyDevDan on YouTube",
|
author: "IndyDevDan",
|
||||||
link: "https://youtu.be/YALpX8oOn78"
|
link: "https://youtu.be/YALpX8oOn78"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "The best AI coding assistant so far.",
|
text: "The best AI coding assistant so far.",
|
||||||
author: "Matthew Berman on YouTube",
|
author: "Matthew Berman",
|
||||||
link: "https://www.youtube.com/watch?v=df8afeb1FY8"
|
link: "https://www.youtube.com/watch?v=df8afeb1FY8"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider ... has easily quadrupled my coding productivity.",
|
text: "Aider ... has easily quadrupled my coding productivity.",
|
||||||
author: "SOLAR_FIELDS on Hacker News",
|
author: "SOLAR_FIELDS",
|
||||||
link: "https://news.ycombinator.com/item?id=36212100"
|
link: "https://news.ycombinator.com/item?id=36212100"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "It's a cool workflow... Aider's ergonomics are perfect for me.",
|
text: "It's a cool workflow... Aider's ergonomics are perfect for me.",
|
||||||
author: "qup on Hacker News",
|
author: "qup",
|
||||||
link: "https://news.ycombinator.com/item?id=38185326"
|
link: "https://news.ycombinator.com/item?id=38185326"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "It's really like having your senior developer live right in your Git repo - truly amazing!",
|
text: "It's really like having your senior developer live right in your Git repo - truly amazing!",
|
||||||
author: "rappster on GitHub",
|
author: "rappster",
|
||||||
link: "https://github.com/Aider-AI/aider/issues/124"
|
link: "https://github.com/Aider-AI/aider/issues/124"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "What an amazing tool. It's incredible.",
|
text: "What an amazing tool. It's incredible.",
|
||||||
author: "valyagolev on GitHub",
|
author: "valyagolev",
|
||||||
link: "https://github.com/Aider-AI/aider/issues/6#issue-1722897858"
|
link: "https://github.com/Aider-AI/aider/issues/6#issue-1722897858"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider is such an astounding thing!",
|
text: "Aider is such an astounding thing!",
|
||||||
author: "cgrothaus on GitHub",
|
author: "cgrothaus",
|
||||||
link: "https://github.com/Aider-AI/aider/issues/82#issuecomment-1631876700"
|
link: "https://github.com/Aider-AI/aider/issues/82#issuecomment-1631876700"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "It was WAY faster than I would be getting off the ground and making the first few working versions.",
|
text: "It was WAY faster than I would be getting off the ground and making the first few working versions.",
|
||||||
author: "Daniel Feldman on X",
|
author: "Daniel Feldman",
|
||||||
link: "https://twitter.com/d_feldman/status/1662295077387923456"
|
link: "https://twitter.com/d_feldman/status/1662295077387923456"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "THANK YOU for Aider! It really feels like a glimpse into the future of coding.",
|
text: "THANK YOU for Aider! It really feels like a glimpse into the future of coding.",
|
||||||
author: "derwiki on Hacker News",
|
author: "derwiki",
|
||||||
link: "https://news.ycombinator.com/item?id=38205643"
|
link: "https://news.ycombinator.com/item?id=38205643"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "It's just amazing. It is freeing me to do things I felt were out my comfort zone before.",
|
text: "It's just amazing. It is freeing me to do things I felt were out my comfort zone before.",
|
||||||
author: "Dougie on Discord",
|
author: "Dougie",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1174002618058678323/1174084556257775656"
|
link: "https://discord.com/channels/1131200896827654144/1174002618058678323/1174084556257775656"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "This project is stellar.",
|
text: "This project is stellar.",
|
||||||
author: "funkytaco on GitHub",
|
author: "funkytaco",
|
||||||
link: "https://github.com/Aider-AI/aider/issues/112#issuecomment-1637429008"
|
link: "https://github.com/Aider-AI/aider/issues/112#issuecomment-1637429008"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Amazing project, definitely the best AI coding assistant I've used.",
|
text: "Amazing project, definitely the best AI coding assistant I've used.",
|
||||||
author: "joshuavial on GitHub",
|
author: "joshuavial",
|
||||||
link: "https://github.com/Aider-AI/aider/issues/84"
|
link: "https://github.com/Aider-AI/aider/issues/84"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "I absolutely love using Aider ... It makes software development feel so much lighter as an experience.",
|
text: "I absolutely love using Aider ... It makes software development feel so much lighter as an experience.",
|
||||||
author: "principalideal0 on Discord",
|
author: "principalideal0",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1133421607499595858/1229689636012691468"
|
link: "https://discord.com/channels/1131200896827654144/1133421607499595858/1229689636012691468"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "I have been recovering from ... surgeries ... aider ... has allowed me to continue productivity.",
|
text: "I have been recovering from multiple shoulder surgeries ... and have used aider extensively. It has allowed me to continue productivity.",
|
||||||
author: "codeninja on Reddit",
|
author: "codeninja",
|
||||||
link: "https://www.reddit.com/r/OpenAI/s/nmNwkHy1zG"
|
link: "https://www.reddit.com/r/OpenAI/s/nmNwkHy1zG"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "I am an aider addict. I'm getting so much more work done, but in less time.",
|
text: "I am an aider addict. I'm getting so much more work done, but in less time.",
|
||||||
author: "dandandan on Discord",
|
author: "dandandan",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1135913253483069470"
|
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1135913253483069470"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider... blows everything else out of the water hands down, there's no competition whatsoever.",
|
text: "After wasting $100 on tokens trying to find something better, I'm back to Aider. It blows everything else out of the water hands down, there's no competition whatsoever.",
|
||||||
author: "SystemSculpt on Discord",
|
author: "SystemSculpt",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1178736602797846548"
|
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1178736602797846548"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider is amazing, coupled with Sonnet 3.5 it's quite mind blowing.",
|
text: "Aider is amazing, coupled with Sonnet 3.5 it's quite mind blowing.",
|
||||||
author: "Josh Dingus on Discord",
|
author: "Josh Dingus",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1133060684540813372/1262374225298198548"
|
link: "https://discord.com/channels/1131200896827654144/1133060684540813372/1262374225298198548"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Hands down, this is the best AI coding assistant tool so far.",
|
text: "Hands down, this is the best AI coding assistant tool so far.",
|
||||||
author: "IndyDevDan on YouTube",
|
author: "IndyDevDan",
|
||||||
link: "https://www.youtube.com/watch?v=MPYFPvxfGZs"
|
link: "https://www.youtube.com/watch?v=MPYFPvxfGZs"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "[Aider] changed my daily coding workflows. It's mind-blowing how ...(it)... can change your life.",
|
text: "[Aider] changed my daily coding workflows. It's mind-blowing how a single Python application can change your life.",
|
||||||
author: "maledorak on Discord",
|
author: "maledorak",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1258453375620747264"
|
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1258453375620747264"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Best agent for actual dev work in existing codebases.",
|
text: "Best agent for actual dev work in existing codebases.",
|
||||||
author: "Nick Dobos on X",
|
author: "Nick Dobos",
|
||||||
link: "https://twitter.com/NickADobos/status/1690408967963652097?s=20"
|
link: "https://twitter.com/NickADobos/status/1690408967963652097?s=20"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "One of my favorite pieces of software. Blazing trails on new paradigms!",
|
text: "One of my favorite pieces of software. Blazing trails on new paradigms!",
|
||||||
author: "Chris Wall on X",
|
author: "Chris Wall",
|
||||||
link: "https://x.com/chris65536/status/1905053299251798432"
|
link: "https://x.com/chris65536/status/1905053299251798432"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider has been revolutionary for me and my work.",
|
text: "Aider has been revolutionary for me and my work.",
|
||||||
author: "Starry Hope on X",
|
author: "Starry Hope",
|
||||||
link: "https://x.com/starryhopeblog/status/1904985812137132056"
|
link: "https://x.com/starryhopeblog/status/1904985812137132056"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Try aider! One of the best ways to vibe code.",
|
text: "Try aider! One of the best ways to vibe code.",
|
||||||
author: "Chris Wall on X",
|
author: "Chris Wall",
|
||||||
link: "https://x.com/Chris65536/status/1905053418961391929"
|
link: "https://x.com/Chris65536/status/1905053418961391929"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider is hands down the best. And it's free and opensource.",
|
text: "Aider is hands down the best. And it's free and opensource.",
|
||||||
author: "AriyaSavakaLurker on Reddit",
|
author: "AriyaSavakaLurker",
|
||||||
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1ik16y6/whats_your_take_on_aider/mbip39n/"
|
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1ik16y6/whats_your_take_on_aider/mbip39n/"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider is also my best friend.",
|
text: "Aider is also my best friend.",
|
||||||
author: "jzn21 on Reddit",
|
author: "jzn21",
|
||||||
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27dcnb/"
|
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27dcnb/"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Try Aider, it's worth it.",
|
text: "Try Aider, it's worth it.",
|
||||||
author: "jorgejhms on Reddit",
|
author: "jorgejhms",
|
||||||
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27cp99/"
|
link: "https://www.reddit.com/r/ChatGPTCoding/comments/1heuvuo/aider_vs_cline_vs_windsurf_vs_cursor/m27cp99/"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "I like aider :)",
|
text: "I like aider :)",
|
||||||
author: "Chenwei Cui on X",
|
author: "Chenwei Cui",
|
||||||
link: "https://x.com/ccui42/status/1904965344999145698"
|
link: "https://x.com/ccui42/status/1904965344999145698"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider is the precision tool of LLM code gen... Minimal, thoughtful and capable of surgical changes ... while keeping the developer in control.",
|
text: "Aider is the precision tool of LLM code gen... Minimal, thoughtful and capable of surgical changes to your codebase all while keeping the developer in control.",
|
||||||
author: "Reilly Sweetland on X",
|
author: "Reilly Sweetland",
|
||||||
link: "https://x.com/rsweetland/status/1904963807237259586"
|
link: "https://x.com/rsweetland/status/1904963807237259586"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Cannot believe aider vibe coded a 650 LOC feature across service and cli today in 1 shot.",
|
text: "Cannot believe aider vibe coded a 650 LOC feature across service and cli today in 1 shot.",
|
||||||
author: "autopoietist on Discord",
|
author: "autopoietist",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1355675042259796101"
|
link: "https://discord.com/channels/1131200896827654144/1131200896827654149/1355675042259796101"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Oh no the secret is out! Yes, Aider is the best coding tool around. I highly, highly recommend it to anyone.",
|
text: "Oh no the secret is out! Yes, Aider is the best coding tool around. I highly, highly recommend it to anyone.",
|
||||||
author: "Joshua D Vander Hook on X",
|
author: "Joshua D Vander Hook",
|
||||||
link: "https://x.com/jodavaho/status/1911154899057795218"
|
link: "https://x.com/jodavaho/status/1911154899057795218"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "thanks to aider, i have started and finished three personal projects within the last two days",
|
text: "thanks to aider, i have started and finished three personal projects within the last two days",
|
||||||
author: "joseph stalzyn on X",
|
author: "joseph stalzyn",
|
||||||
link: "https://x.com/anitaheeder/status/1908338609645904160"
|
link: "https://x.com/anitaheeder/status/1908338609645904160"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Been using aider as my daily driver for over a year ... I absolutely love the tool, like beyond words.",
|
text: "Been using aider as my daily driver for over a year ... I absolutely love the tool, like beyond words.",
|
||||||
author: "koleok on Discord",
|
author: "koleok",
|
||||||
link: "https://discord.com/channels/1131200896827654144/1273248471394291754/1356727448372252783"
|
link: "https://discord.com/channels/1131200896827654144/1273248471394291754/1356727448372252783"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "Aider ... is the tool to benchmark against.",
|
text: "Aider ... is the tool to benchmark against.",
|
||||||
author: "BeetleB on Hacker News",
|
author: "BeetleB",
|
||||||
link: "https://news.ycombinator.com/item?id=43930201"
|
link: "https://news.ycombinator.com/item?id=43930201"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
text: "aider is really cool",
|
text: "aider is really cool",
|
||||||
author: "kache on X",
|
author: "kache (@yacineMTB)",
|
||||||
link: "https://x.com/yacineMTB/status/1911224442430124387"
|
link: "https://x.com/yacineMTB/status/1911224442430124387"
|
||||||
}
|
}
|
||||||
];
|
];
|
@@ -642,7 +642,6 @@ const testimonials = [
 <li><a href="/docs/leaderboards/">LLM Leaderboards</a></li>
 <li><a href="https://github.com/Aider-AI/aider">GitHub Repository</a></li>
 <li><a href="https://discord.gg/Y7X7bhMQFV">Discord Community</a></li>
-<li><a href="https://aider.chat/HISTORY.html">Release notes</a></li>
 <li><a href="/blog/">Blog</a></li>
 </ul>
 </div>
@@ -425,7 +425,7 @@ function Invoke-Installer($artifacts, $platforms) {

 Write-Information ""
 Write-Information "Installing aider-chat..."
-& "$dest_dir\uv.exe" tool install --force --python python3.12 --with pip aider-chat@latest
+& "$dest_dir\uv.exe" tool install --force --python python3.12 aider-chat@latest

 if (-not $NoModifyPath) {
 Add-Ci-Path $dest_dir
@@ -1178,7 +1178,7 @@ install() {
 say "Installing aider..."
 say ""
 # Install aider-chat using the newly installed uv
-ensure "${_install_dir}/uv" tool install --force --python python3.12 --with pip aider-chat@latest
+ensure "${_install_dir}/uv" tool install --force --python python3.12 aider-chat@latest

 # Avoid modifying the users PATH if they are managing their PATH manually
 case :$PATH:
@@ -4,7 +4,7 @@ aiohappyeyeballs==2.6.1
-aiohttp==3.12.9
+aiohttp==3.11.18
@@ -56,11 +56,11 @@ charset-normalizer==3.4.2
-click==8.2.1
+click==8.1.8
-configargparse==1.7.1
+configargparse==1.7
@@ -85,12 +85,12 @@ flake8==7.2.0
-frozenlist==1.6.2
+frozenlist==1.6.0
-fsspec==2025.5.1
+fsspec==2025.3.2
@@ -106,17 +106,17 @@ google-ai-generativelanguage==0.6.15
-google-api-core[grpc]==2.25.0
+google-api-core[grpc]==2.24.2
-google-api-python-client==2.171.0
+google-api-python-client==2.169.0
-google-auth==2.40.3
+google-auth==2.40.1
@@ -141,7 +141,7 @@ grep-ast==0.9.0
-grpcio==1.72.1
+grpcio==1.71.0
@@ -154,7 +154,7 @@ h11==0.16.0
-hf-xet==1.1.3
+hf-xet==1.1.0
@@ -172,7 +172,7 @@ httpx==0.28.1
-huggingface-hub==0.32.4
+huggingface-hub==0.31.1
@@ -196,7 +196,7 @@ jinja2==3.1.6
-jiter==0.10.0
+jiter==0.9.0
@@ -204,7 +204,7 @@ json5==0.12.0
-jsonschema==4.24.0
+jsonschema==4.23.0
@@ -213,7 +213,7 @@ jsonschema-specifications==2025.4.1
-litellm==1.72.1
+litellm==1.68.1
@@ -241,7 +241,7 @@ mslex==1.3.0
-multidict==6.4.4
+multidict==6.4.3
@@ -255,7 +255,7 @@ numpy==1.26.4
-openai==1.84.0
+openai==1.75.0
@@ -281,7 +281,7 @@ pillow==11.2.1
-posthog==4.3.2
+posthog==4.0.1
@@ -299,7 +299,7 @@ proto-plus==1.26.1
-protobuf==5.29.5
+protobuf==5.29.4
@@ -333,7 +333,7 @@ pycparser==2.22
-pydantic==2.11.5
+pydantic==2.11.4
@@ -401,7 +401,7 @@ rich==14.0.0
-rpds-py==0.25.1
+rpds-py==0.24.0
@@ -437,7 +437,7 @@ socksio==1.0.0
-sounddevice==0.5.2
+sounddevice==0.5.1
@@ -478,11 +478,11 @@ tree-sitter-language-pack==0.7.3
-tree-sitter-yaml==0.7.1
+tree-sitter-yaml==0.7.0
-typing-extensions==4.14.0
+typing-extensions==4.13.2
@@ -494,11 +494,11 @@ typing-extensions==4.14.0
-typing-inspection==0.4.1
+typing-inspection==0.4.0
-uritemplate==4.2.0
+uritemplate==4.1.1
@@ -519,7 +519,7 @@ yarl==1.20.0
-zipp==3.22.0
+zipp==3.21.0
@@ -2,7 +2,7 @@
 # uv pip compile --no-strip-extras --output-file=requirements/common-constraints.txt requirements/requirements.in requirements/requirements-browser.in requirements/requirements-dev.in requirements/requirements-help.in requirements/requirements-playwright.in
 aiohappyeyeballs==2.6.1
 # via aiohttp
-aiohttp==3.12.9
+aiohttp==3.11.18
@@ -52,7 +52,7 @@ cfgv==3.4.0
-click==8.2.1
+click==8.1.8
@@ -61,11 +61,11 @@ click==8.2.1
-cogapp==3.5.0
+cogapp==3.4.1
-configargparse==1.7.1
+configargparse==1.7
@@ -103,13 +103,13 @@ filetype==1.2.0
-fonttools==4.58.2
+fonttools==4.57.0
-frozenlist==1.6.2
+frozenlist==1.6.0
-fsspec==2025.5.1
+fsspec==2025.3.2
@@ -122,16 +122,16 @@ gitpython==3.1.44
-google-api-core[grpc]==2.25.0
+google-api-core[grpc]==2.24.2
-google-api-python-client==2.171.0
+google-api-python-client==2.169.0
-google-auth==2.40.3
+google-auth==2.40.1
@@ -142,7 +142,7 @@ google-auth==2.40.3
-google-cloud-bigquery==3.34.0
+google-cloud-bigquery==3.31.0
@@ -156,7 +156,7 @@ googleapis-common-protos==1.70.0
-greenlet==3.2.3
+greenlet==3.2.2
@@ -164,7 +164,7 @@ grep-ast==0.9.0
-grpcio==1.72.1
+grpcio==1.71.0
@@ -172,7 +172,7 @@ grpcio-status==1.71.0
-hf-xet==1.1.3
+hf-xet==1.1.0
@@ -185,13 +185,13 @@ httpx==0.28.1
-huggingface-hub[inference]==0.32.4
+huggingface-hub[inference]==0.31.1
-identify==2.6.12
+identify==2.6.10
@@ -216,15 +216,15 @@ jinja2==3.1.6
-jiter==0.10.0
+jiter==0.9.0
-joblib==1.5.1
+joblib==1.5.0
-jsonschema==4.24.0
+jsonschema==4.23.0
@@ -233,13 +233,13 @@ jsonschema-specifications==2025.4.1
-litellm==1.72.1
+litellm==1.68.1
-llama-index-embeddings-huggingface==0.5.4
+llama-index-embeddings-huggingface==0.5.3
@@ -261,7 +261,7 @@ mpmath==1.3.0
-multidict==6.4.4
+multidict==6.4.3
@@ -269,7 +269,7 @@ multiprocess==0.70.18
-narwhals==1.41.1
+narwhals==1.38.2
@@ -295,7 +295,7 @@ numpy==1.26.4
-openai==1.84.0
+openai==1.75.0
@@ -311,7 +311,7 @@ packaging==24.2
-pandas==2.3.0
+pandas==2.2.3
@@ -340,9 +340,9 @@ platformdirs==4.3.8
-pluggy==1.6.0
+pluggy==1.5.0
-posthog==4.3.2
+posthog==4.0.1
@@ -360,7 +360,7 @@ proto-plus==1.26.1
-protobuf==5.29.5
+protobuf==5.29.4
@@ -385,7 +385,7 @@ pycodestyle==2.13.0
-pydantic==2.11.5
+pydantic==2.11.4
@@ -403,9 +403,7 @@ pyee==13.0.0
 pygments==2.19.1
-# via
-# pytest
-# rich
+# via rich
@@ -418,7 +416,7 @@ pyproject-hooks==1.2.0
-pytest==8.4.0
+pytest==8.3.5
@@ -465,7 +463,7 @@ rich==14.0.0
-rpds-py==0.25.1
+rpds-py==0.24.0
@@ -473,7 +471,7 @@ rsa==4.9.1
-scikit-learn==1.7.0
+scikit-learn==1.6.1
@@ -484,7 +482,7 @@ semver==3.0.4
-setuptools==80.9.0
+setuptools==80.3.1
@@ -503,15 +501,15 @@ sniffio==1.3.1
-sounddevice==0.5.2
+sounddevice==0.5.1
-sqlalchemy[asyncio]==2.0.41
+sqlalchemy[asyncio]==2.0.40
-streamlit==1.45.1
+streamlit==1.45.0
@@ -535,7 +533,7 @@ torch==2.2.2
-tornado==6.5.1
+tornado==6.4.2
@@ -546,7 +544,7 @@ tqdm==4.67.1
-transformers==4.52.4
+transformers==4.51.3
@@ -556,11 +554,11 @@ tree-sitter-embedded-template==0.23.2
-tree-sitter-yaml==0.7.1
+tree-sitter-yaml==0.7.0
-typer==0.16.0
+typer==0.15.3
-typing-extensions==4.14.0
+typing-extensions==4.13.2
@@ -584,17 +582,17 @@ typing-inspect==0.9.0
-typing-inspection==0.4.1
+typing-inspection==0.4.0
-uritemplate==4.2.0
+uritemplate==4.1.1
-uv==0.7.11
+uv==0.7.3
@@ -610,5 +608,5 @@ wrapt==1.17.2
-zipp==3.22.0
+zipp==3.21.0
@@ -25,7 +25,7 @@ charset-normalizer==3.4.2
-click==8.2.1
+click==8.1.8
@@ -46,7 +46,7 @@ jinja2==3.1.6
-jsonschema==4.24.0
+jsonschema==4.23.0
@@ -58,7 +58,7 @@ markupsafe==3.0.2
-narwhals==1.41.1
+narwhals==1.38.2
@@ -73,7 +73,7 @@ packaging==24.2
-pandas==2.3.0
+pandas==2.2.3
@@ -81,7 +81,7 @@ pillow==11.2.1
-protobuf==5.29.5
+protobuf==5.29.4
@@ -110,7 +110,7 @@ requests==2.32.3
-rpds-py==0.25.1
+rpds-py==0.24.0
@@ -123,7 +123,7 @@ smmap==5.0.2
-streamlit==1.45.1
+streamlit==1.45.0
@@ -135,11 +135,11 @@ toml==0.10.2
-tornado==6.5.1
+tornado==6.4.2
-typing-extensions==4.14.0
+typing-extensions==4.13.2
@@ -20,7 +20,7 @@ charset-normalizer==3.4.2
-click==8.2.1
+click==8.1.8
@@ -29,7 +29,7 @@ codespell==2.4.1
-cogapp==3.5.0
+cogapp==3.4.1
@@ -54,22 +54,22 @@ filelock==3.18.0
-fonttools==4.58.2
+fonttools==4.57.0
-google-api-core[grpc]==2.25.0
+google-api-core[grpc]==2.24.2
-google-auth==2.40.3
+google-auth==2.40.1
-google-cloud-bigquery==3.34.0
+google-cloud-bigquery==3.31.0
@@ -90,7 +90,7 @@ googleapis-common-protos==1.70.0
-grpcio==1.72.1
+grpcio==1.71.0
@@ -99,7 +99,7 @@ grpcio-status==1.71.0
-identify==2.6.12
+identify==2.6.10
@@ -156,7 +156,7 @@ packaging==24.2
-pandas==2.3.0
+pandas==2.2.3
@@ -180,7 +180,7 @@ platformdirs==4.3.8
-pluggy==1.6.0
+pluggy==1.5.0
@@ -200,7 +200,7 @@ proto-plus==1.26.1
-protobuf==5.29.5
+protobuf==5.29.4
@@ -219,7 +219,6 @@ pyasn1-modules==0.4.2
 pygments==2.19.1
 # via
 # -c requirements/common-constraints.txt
-# pytest
 # rich
@@ -230,7 +229,7 @@ pyproject-hooks==1.2.0
-pytest==8.4.0
+pytest==8.3.5
@@ -270,7 +269,7 @@ semver==3.0.4
-setuptools==80.9.0
+setuptools==80.3.1
@@ -282,11 +281,11 @@ six==1.17.0
-typer==0.16.0
+typer==0.15.3
-typing-extensions==4.14.0
+typing-extensions==4.13.2
@@ -298,7 +297,7 @@ urllib3==2.4.0
-uv==0.7.11
+uv==0.7.3
@@ -4,7 +4,7 @@ aiohappyeyeballs==2.6.1
-aiohttp==3.12.9
+aiohttp==3.11.18
@@ -39,7 +39,7 @@ charset-normalizer==3.4.2
-click==8.2.1
+click==8.1.8
@@ -70,18 +70,18 @@ filetype==1.2.0
-frozenlist==1.6.2
+frozenlist==1.6.0
-fsspec==2025.5.1
+fsspec==2025.3.2
-greenlet==3.2.3
+greenlet==3.2.2
@@ -93,7 +93,7 @@ h11==0.16.0
-hf-xet==1.1.3
+hf-xet==1.1.0
@@ -105,7 +105,7 @@ httpx==0.28.1
-huggingface-hub[inference]==0.32.4
+huggingface-hub[inference]==0.31.1
@@ -124,7 +124,7 @@ jinja2==3.1.6
-joblib==1.5.1
+joblib==1.5.0
@@ -134,7 +134,7 @@ llama-index-core==0.12.26
-llama-index-embeddings-huggingface==0.5.4
+llama-index-embeddings-huggingface==0.5.3
@@ -150,7 +150,7 @@ mpmath==1.3.0
-multidict==6.4.4
+multidict==6.4.3
@@ -200,7 +200,7 @@ propcache==0.3.1
-pydantic==2.11.5
+pydantic==2.11.4
@@ -232,7 +232,7 @@ safetensors==0.5.3
-scikit-learn==1.7.0
+scikit-learn==1.6.1
@@ -249,7 +249,7 @@ sniffio==1.3.1
-sqlalchemy[asyncio]==2.0.41
+sqlalchemy[asyncio]==2.0.40
@@ -286,11 +286,11 @@ tqdm==4.67.1
-transformers==4.52.4
+transformers==4.51.3
-typing-extensions==4.14.0
+typing-extensions==4.13.2
@@ -308,7 +308,7 @@ typing-inspect==0.9.0
-typing-inspection==0.4.1
+typing-inspection==0.4.0
@@ -1,6 +1,6 @@
 # This file was autogenerated by uv via the following command:
 # uv pip compile --no-strip-extras --constraint=requirements/common-constraints.txt --output-file=requirements/requirements-playwright.txt requirements/requirements-playwright.in
-greenlet==3.2.3
+greenlet==3.2.2
 # via
 # -c requirements/common-constraints.txt
 # playwright
@@ -12,7 +12,7 @@ pyee==13.0.0
 # via
 # -c requirements/common-constraints.txt
 # playwright
-typing-extensions==4.14.0
+typing-extensions==4.13.2
 # via
 # -c requirements/common-constraints.txt
 # pyee
||||||

@@ -35,9 +35,7 @@ google-generativeai
 # in matplotlib and a bunch of other deps
 # https://github.com/networkx/networkx/blob/d7132daa8588f653eacac7a5bae1ee85a183fa43/pyproject.toml#L57
 # We really only need networkx itself and scipy for the repomap.
-#
-# >3.5 seems to not be available for py3.10
-networkx<3.5
+networkx
 
 # This is the one networkx dependency that we need.
 # Including it here explicitly because we

@@ -5,7 +5,7 @@ FROM bretfisher/jekyll-serve
 WORKDIR /site
 
 # Copy the current directory contents into the container at /srv/jekyll
-COPY aider/website /site
+COPY website /site
 
 RUN apt-get update && apt-get install libcurl4
 

@@ -112,8 +112,6 @@ def main():
 
     cmd = [
         "aider",
-        "--model",
-        "sonnet",
         hist_path,
         "--read",
         log_path,

@@ -834,36 +834,6 @@ two
             self.assertNotIn(fname2, str(coder.abs_fnames))
             self.assertNotIn(fname3, str(coder.abs_fnames))
 
-    def test_skip_gitignored_files_on_init(self):
-        with GitTemporaryDirectory() as _:
-            repo_path = Path(".")
-            repo = git.Repo.init(repo_path)
-
-            ignored_file = repo_path / "ignored_by_git.txt"
-            ignored_file.write_text("This file should be ignored by git.")
-
-            regular_file = repo_path / "regular_file.txt"
-            regular_file.write_text("This is a regular file.")
-
-            gitignore_content = "ignored_by_git.txt\n"
-            (repo_path / ".gitignore").write_text(gitignore_content)
-
-            repo.index.add([str(regular_file), ".gitignore"])
-            repo.index.commit("Initial commit with gitignore and regular file")
-
-            mock_io = MagicMock()
-            mock_io.tool_warning = MagicMock()
-
-            fnames_to_add = [str(ignored_file), str(regular_file)]
-
-            coder = Coder.create(self.GPT35, None, mock_io, fnames=fnames_to_add)
-
-            self.assertNotIn(str(ignored_file.resolve()), coder.abs_fnames)
-            self.assertIn(str(regular_file.resolve()), coder.abs_fnames)
-            mock_io.tool_warning.assert_any_call(
-                f"Skipping {ignored_file.name} that matches gitignore spec."
-            )
-
     def test_check_for_urls(self):
         io = InputOutput(yes=True)
         coder = Coder.create(self.GPT35, None, io=io)
@@ -1211,122 +1181,6 @@ This command will print 'Hello, World!' to the console."""
         sanity_check_messages(coder.cur_messages)
         self.assertEqual(coder.cur_messages[-1]["role"], "assistant")
 
-    def test_normalize_language(self):
-        coder = Coder.create(self.GPT35, None, io=InputOutput())
-
-        # Test None and empty
-        self.assertIsNone(coder.normalize_language(None))
-        self.assertIsNone(coder.normalize_language(""))
-
-        # Test "C" and "POSIX"
-        self.assertIsNone(coder.normalize_language("C"))
-        self.assertIsNone(coder.normalize_language("POSIX"))
-
-        # Test already formatted names
-        self.assertEqual(coder.normalize_language("English"), "English")
-        self.assertEqual(coder.normalize_language("French"), "French")
-
-        # Test common locale codes (fallback map, assuming babel is not installed or fails)
-        with patch("aider.coders.base_coder.Locale", None):
-            self.assertEqual(coder.normalize_language("en_US"), "English")
-            self.assertEqual(coder.normalize_language("fr_FR"), "French")
-            self.assertEqual(coder.normalize_language("es"), "Spanish")
-            self.assertEqual(coder.normalize_language("de_DE.UTF-8"), "German")
-            self.assertEqual(
-                coder.normalize_language("zh-CN"), "Chinese"
-            )  # Test hyphen in fallback
-            self.assertEqual(coder.normalize_language("ja"), "Japanese")
-            self.assertEqual(
-                coder.normalize_language("unknown_code"), "unknown_code"
-            )  # Fallback to original
-
-        # Test with babel.Locale mocked (available)
-        mock_babel_locale = MagicMock()
-        mock_locale_instance = MagicMock()
-        mock_babel_locale.parse.return_value = mock_locale_instance
-
-        with patch("aider.coders.base_coder.Locale", mock_babel_locale):
-            mock_locale_instance.get_display_name.return_value = "english"  # For en_US
-            self.assertEqual(coder.normalize_language("en_US"), "English")
-            mock_babel_locale.parse.assert_called_with("en_US")
-            mock_locale_instance.get_display_name.assert_called_with("en")
-
-            mock_locale_instance.get_display_name.return_value = "french"  # For fr-FR
-            self.assertEqual(coder.normalize_language("fr-FR"), "French")  # Test with hyphen
-            mock_babel_locale.parse.assert_called_with("fr_FR")  # Hyphen replaced
-            mock_locale_instance.get_display_name.assert_called_with("en")
-
-        # Test with babel.Locale raising an exception (simulating parse failure)
-        mock_babel_locale_error = MagicMock()
-        mock_babel_locale_error.parse.side_effect = Exception("Babel parse error")
-        with patch("aider.coders.base_coder.Locale", mock_babel_locale_error):
-            self.assertEqual(coder.normalize_language("en_US"), "English")  # Falls back to map
-
-    def test_get_user_language(self):
-        io = InputOutput()
-        coder = Coder.create(self.GPT35, None, io=io)
-
-        # 1. Test with self.chat_language set
-        coder.chat_language = "fr_CA"
-        with patch.object(coder, "normalize_language", return_value="French Canadian") as mock_norm:
-            self.assertEqual(coder.get_user_language(), "French Canadian")
-            mock_norm.assert_called_once_with("fr_CA")
-        coder.chat_language = None  # Reset
-
-        # 2. Test with locale.getlocale()
-        with patch("locale.getlocale", return_value=("en_GB", "UTF-8")) as mock_getlocale:
-            with patch.object(
-                coder, "normalize_language", return_value="British English"
-            ) as mock_norm:
-                self.assertEqual(coder.get_user_language(), "British English")
-                mock_getlocale.assert_called_once()
-                mock_norm.assert_called_once_with("en_GB")
-
-        # Test with locale.getlocale() returning None or empty
-        with patch("locale.getlocale", return_value=(None, None)) as mock_getlocale:
-            with patch("os.environ.get") as mock_env_get:  # Ensure env vars are not used yet
-                mock_env_get.return_value = None
-                self.assertIsNone(coder.get_user_language())  # Should be None if nothing found
-
-        # 3. Test with environment variables: LANG
-        with patch(
-            "locale.getlocale", side_effect=Exception("locale error")
-        ):  # Mock locale to fail
-            with patch("os.environ.get") as mock_env_get:
-                mock_env_get.side_effect = lambda key: "de_DE.UTF-8" if key == "LANG" else None
-                with patch.object(coder, "normalize_language", return_value="German") as mock_norm:
-                    self.assertEqual(coder.get_user_language(), "German")
-                    mock_env_get.assert_any_call("LANG")
-                    mock_norm.assert_called_once_with("de_DE")
-
-        # Test LANGUAGE (takes precedence over LANG if both were hypothetically checked
-        # by os.environ.get, but our code checks in order, so we mock the first one it finds)
-        with patch("locale.getlocale", side_effect=Exception("locale error")):
-            with patch("os.environ.get") as mock_env_get:
-                mock_env_get.side_effect = lambda key: "es_ES" if key == "LANGUAGE" else None
-                with patch.object(coder, "normalize_language", return_value="Spanish") as mock_norm:
-                    self.assertEqual(coder.get_user_language(), "Spanish")
-                    mock_env_get.assert_any_call("LANGUAGE")  # LANG would be called first
-                    mock_norm.assert_called_once_with("es_ES")
-
-        # 4. Test priority: chat_language > locale > env
-        coder.chat_language = "it_IT"
-        with patch("locale.getlocale", return_value=("en_US", "UTF-8")) as mock_getlocale:
-            with patch("os.environ.get", return_value="de_DE") as mock_env_get:
-                with patch.object(
-                    coder, "normalize_language", side_effect=lambda x: x.upper()
-                ) as mock_norm:
-                    self.assertEqual(coder.get_user_language(), "IT_IT")  # From chat_language
-                    mock_norm.assert_called_once_with("it_IT")
-                    mock_getlocale.assert_not_called()
-                    mock_env_get.assert_not_called()
-        coder.chat_language = None
-
-        # 5. Test when no language is found
-        with patch("locale.getlocale", side_effect=Exception("locale error")):
-            with patch("os.environ.get", return_value=None) as mock_env_get:
-                self.assertIsNone(coder.get_user_language())
-
     def test_architect_coder_auto_accept_true(self):
         with GitTemporaryDirectory():
             io = InputOutput(yes=True)

@@ -949,19 +949,16 @@ class TestMain(TestCase):
 
     def test_invalid_edit_format(self):
         with GitTemporaryDirectory():
-            # Suppress stderr for this test as argparse prints an error message
-            with patch("sys.stderr", new_callable=StringIO) as mock_stderr:
-                with self.assertRaises(SystemExit) as cm:
-                    _ = main(
+            with patch("aider.io.InputOutput.offer_url") as mock_offer_url:
+                result = main(
                     ["--edit-format", "not-a-real-format", "--exit", "--yes"],
                     input=DummyInput(),
                     output=DummyOutput(),
                 )
-            # argparse.ArgumentParser.exit() is called with status 2 for invalid choice
-            self.assertEqual(cm.exception.code, 2)
-            stderr_output = mock_stderr.getvalue()
-            self.assertIn("invalid choice", stderr_output)
-            self.assertIn("not-a-real-format", stderr_output)
+                self.assertEqual(result, 1)  # main() should return 1 on error
+                mock_offer_url.assert_called_once()
+                args, _ = mock_offer_url.call_args
+                self.assertEqual(args[0], "https://aider.chat/docs/more/edit-formats.html")
 
     def test_default_model_selection(self):
         with GitTemporaryDirectory():
@@ -1035,16 +1032,6 @@ class TestMain(TestCase):
             system_info = coder.get_platform_info()
             self.assertIn("Spanish", system_info)
 
-    def test_commit_language_japanese(self):
-        with GitTemporaryDirectory():
-            coder = main(
-                ["--commit-language", "japanese", "--exit", "--yes"],
-                input=DummyInput(),
-                output=DummyOutput(),
-                return_coder=True,
-            )
-            self.assertIn("japanese", coder.commit_language)
-
     @patch("git.Repo.init")
     def test_main_exit_with_git_command_not_found(self, mock_git_init):
         mock_git_init.side_effect = git.exc.GitCommandNotFound("git", "Command 'git' not found")
@@ -1288,21 +1275,6 @@ class TestMain(TestCase):
             for call in mock_io_instance.tool_warning.call_args_list:
                 self.assertNotIn("Cost estimates may be inaccurate", call[0][0])
 
-    def test_argv_file_respects_git(self):
-        with GitTemporaryDirectory():
-            fname = Path("not_in_git.txt")
-            fname.touch()
-            with open(".gitignore", "w+") as f:
-                f.write("not_in_git.txt")
-            coder = main(
-                argv=["--file", "not_in_git.txt"],
-                input=DummyInput(),
-                output=DummyOutput(),
-                return_coder=True,
-            )
-            self.assertNotIn("not_in_git.txt", str(coder.abs_fnames))
-            self.assertFalse(coder.allowed_to_edit("not_in_git.txt"))
-
     def test_load_dotenv_files_override(self):
         with GitTemporaryDirectory() as git_dir:
             git_dir = Path(git_dir)

@@ -138,13 +138,13 @@ class TestModels(unittest.TestCase):
         self.assertEqual(model.name, "gpt-3.5-turbo")
 
         model = Model("sonnet")
-        self.assertEqual(model.name, "anthropic/claude-sonnet-4-20250514")
+        self.assertEqual(model.name, "anthropic/claude-3-7-sonnet-20250219")
 
         model = Model("haiku")
         self.assertEqual(model.name, "claude-3-5-haiku-20241022")
 
         model = Model("opus")
-        self.assertEqual(model.name, "claude-opus-4-20250514")
+        self.assertEqual(model.name, "claude-3-opus-20240229")
 
         # Test non-alias passes through unchanged
         model = Model("gpt-4")

@@ -93,14 +93,16 @@ class TestOnboarding(unittest.TestCase):
     @patch.dict(os.environ, {"OPENROUTER_API_KEY": "or_key"}, clear=True)
     def test_try_select_default_model_openrouter_free(self, mock_check_tier):
         """Test OpenRouter free model selection."""
-        self.assertEqual(try_to_select_default_model(), "openrouter/deepseek/deepseek-r1:free")
+        self.assertEqual(
+            try_to_select_default_model(), "openrouter/google/gemini-2.5-pro-exp-03-25:free"
+        )
         mock_check_tier.assert_called_once_with("or_key")
 
     @patch("aider.onboarding.check_openrouter_tier", return_value=False)  # Assume paid tier
     @patch.dict(os.environ, {"OPENROUTER_API_KEY": "or_key"}, clear=True)
     def test_try_select_default_model_openrouter_paid(self, mock_check_tier):
         """Test OpenRouter paid model selection."""
-        self.assertEqual(try_to_select_default_model(), "openrouter/anthropic/claude-sonnet-4")
+        self.assertEqual(try_to_select_default_model(), "openrouter/anthropic/claude-3.7-sonnet")
         mock_check_tier.assert_called_once_with("or_key")
 
     @patch("aider.onboarding.check_openrouter_tier")
@@ -144,7 +146,7 @@ class TestOnboarding(unittest.TestCase):
     )
     def test_try_select_default_model_priority_openrouter(self, mock_check_tier):
         """Test OpenRouter key takes priority."""
-        self.assertEqual(try_to_select_default_model(), "openrouter/anthropic/claude-sonnet-4")
+        self.assertEqual(try_to_select_default_model(), "openrouter/anthropic/claude-3.7-sonnet")
         mock_check_tier.assert_called_once_with("or_key")
 
     @patch("aider.onboarding.check_openrouter_tier")
@@ -344,7 +346,7 @@ class TestOnboarding(unittest.TestCase):
 
     @patch(
         "aider.onboarding.try_to_select_default_model",
-        side_effect=[None, "openrouter/deepseek/deepseek-r1:free"],
+        side_effect=[None, "openrouter/google/gemini-2.5-pro-exp-03-25:free"],
     )  # Fails first, succeeds after oauth
     @patch(
         "aider.onboarding.offer_openrouter_oauth", return_value=True
@@ -358,7 +360,7 @@ class TestOnboarding(unittest.TestCase):
 
         selected_model = select_default_model(args, io_mock, analytics_mock)
 
-        self.assertEqual(selected_model, "openrouter/deepseek/deepseek-r1:free")
+        self.assertEqual(selected_model, "openrouter/google/gemini-2.5-pro-exp-03-25:free")
         self.assertEqual(mock_try_select.call_count, 2)  # Called before and after oauth
         mock_offer_oauth.assert_called_once_with(io_mock, analytics_mock)
         # Only one warning is expected: "No LLM model..."

@@ -1,73 +0,0 @@
-from pathlib import Path
-
-from aider.models import ModelInfoManager
-from aider.openrouter import OpenRouterModelManager
-
-
-class DummyResponse:
-    """Minimal stand-in for requests.Response used in tests."""
-
-    def __init__(self, json_data):
-        self.status_code = 200
-        self._json_data = json_data
-
-    def json(self):
-        return self._json_data
-
-
-def test_openrouter_get_model_info_from_cache(monkeypatch, tmp_path):
-    """
-    OpenRouterModelManager should return correct metadata taken from the
-    downloaded (and locally cached) models JSON payload.
-    """
-    payload = {
-        "data": [
-            {
-                "id": "mistralai/mistral-medium-3",
-                "context_length": 32768,
-                "pricing": {"prompt": "100", "completion": "200"},
-                "top_provider": {"context_length": 32768},
-            }
-        ]
-    }
-
-    # Fake out the network call and the HOME directory used for the cache file
-    monkeypatch.setattr("requests.get", lambda *a, **k: DummyResponse(payload))
-    monkeypatch.setattr(Path, "home", staticmethod(lambda: tmp_path))
-
-    manager = OpenRouterModelManager()
-    info = manager.get_model_info("openrouter/mistralai/mistral-medium-3")
-
-    assert info["max_input_tokens"] == 32768
-    assert info["input_cost_per_token"] == 100.0
-    assert info["output_cost_per_token"] == 200.0
-    assert info["litellm_provider"] == "openrouter"
-
-
-def test_model_info_manager_uses_openrouter_manager(monkeypatch):
-    """
-    ModelInfoManager should delegate to OpenRouterModelManager when litellm
-    provides no data for an OpenRouter-prefixed model.
-    """
-    # Ensure litellm path returns no info so that fallback logic triggers
-    monkeypatch.setattr("aider.models.litellm.get_model_info", lambda *a, **k: {})
-
-    stub_info = {
-        "max_input_tokens": 512,
-        "max_tokens": 512,
-        "max_output_tokens": 512,
-        "input_cost_per_token": 100.0,
-        "output_cost_per_token": 200.0,
-        "litellm_provider": "openrouter",
-    }
-
-    # Force OpenRouterModelManager to return our stub info
-    monkeypatch.setattr(
-        "aider.models.OpenRouterModelManager.get_model_info",
-        lambda self, model: stub_info,
-    )
-
-    mim = ModelInfoManager()
-    info = mim.get_model_info("openrouter/fake/model")
-
-    assert info == stub_info

@@ -59,28 +59,6 @@ class TestRepo(unittest.TestCase):
             self.assertIn("index", diffs)
             self.assertIn("workingdir", diffs)
 
-    def test_diffs_with_single_byte_encoding(self):
-        with GitTemporaryDirectory():
-            encoding = "cp1251"
-
-            repo = git.Repo()
-
-            fname = Path("foo.txt")
-            fname.write_text("index\n", encoding=encoding)
-            repo.git.add(str(fname))
-
-            # Make a change with non-ASCII symbols in the working dir
-            fname.write_text("АБВ\n", encoding=encoding)
-
-            git_repo = GitRepo(InputOutput(encoding=encoding), None, ".")
-            diffs = git_repo.get_diffs()
-
-            # check that all diff output can be converted to utf-8 for sending to model
-            diffs.encode("utf-8")
-
-            self.assertIn("index", diffs)
-            self.assertIn("АБВ", diffs)
-
     def test_diffs_detached_head(self):
         with GitTemporaryDirectory():
             repo = git.Repo()
@@ -683,34 +661,3 @@ class TestRepo(unittest.TestCase):
             # Verify the commit was actually made
             latest_commit_msg = raw_repo.head.commit.message
             self.assertEqual(latest_commit_msg.strip(), "Should succeed")
-
-    @patch("aider.models.Model.simple_send_with_retries")
-    def test_get_commit_message_uses_system_prompt_prefix(self, mock_send):
-        """
-        Verify that GitRepo.get_commit_message() prepends the model.system_prompt_prefix
-        to the system prompt sent to the LLM.
-        """
-        mock_send.return_value = "good commit message"
-
-        prefix = "MY-CUSTOM-PREFIX"
-        model = Model("gpt-3.5-turbo")
-        model.system_prompt_prefix = prefix
-
-        with GitTemporaryDirectory():
-            repo = GitRepo(InputOutput(), None, None, models=[model])
-
-            # Call the function under test
-            repo.get_commit_message("dummy diff", "dummy context")
-
-            # Ensure the LLM was invoked once
-            mock_send.assert_called_once()
-
-            # Grab the system message sent to the model
-            messages = mock_send.call_args[0][0]
-            system_msg_content = messages[0]["content"]
-
-            # Verify the prefix is at the start of the system message
-            self.assertTrue(
-                system_msg_content.startswith(prefix),
-                "system_prompt_prefix should be prepended to the system prompt",
-            )