Document new model support

Paul Gauthier 2024-04-19 13:18:12 -07:00
parent 99f43b4088
commit 6cecbd02d6
6 changed files with 182 additions and 100 deletions

View file

@@ -9,7 +9,7 @@ with sensible commit messages.
You can start a new project or work with an existing git repo.
Aider is unique in that it lets you ask for changes to [pre-existing, larger codebases](https://aider.chat/docs/repomap.html).
Aider works well with GPT 3.5, GPT-4, GPT-4 Turbo with Vision,
Claude 3 Opus and has support for connecting to almost any LLM.
and Claude 3 Opus; it also has support for [connecting to almost any LLM](https://aider.chat/docs/connect.html).
<p align="center">
<img src="assets/screencast.svg" alt="aider screencast">
@@ -44,13 +44,14 @@ get started quickly like this:
```
$ pip install aider-chat
# To work with GPT-4 Turbo:
$ export OPENAI_API_KEY=your-key-goes-here
$ aider hello.js
$ aider
Using git repo: .git
Added hello.js to the chat.
hello.js> write a js script that prints hello world
# To work with Claude 3 Opus:
$ export ANTHROPIC_API_KEY=your-key-goes-here
$ aider --opus
```
## Example chat transcripts
@@ -72,6 +73,8 @@ You can find more chat transcripts on the [examples page](https://aider.chat/exa
* Chat with aider about your code by launching `aider` from the command line with a set of source files to discuss and edit together. Aider lets the LLM see and edit the content of those files.
* Aider can write and edit code in most popular languages: python, javascript, typescript, php, html, css, etc.
* Aider works well with GPT 3.5, GPT-4, GPT-4 Turbo with Vision,
and Claude 3 Opus; it also has support for [connecting to almost any LLM](https://aider.chat/docs/connect.html).
* Request new features, changes, improvements, or bug fixes to your code. Ask for new test cases, updated documentation or code refactors.
* Aider will apply the edits suggested by the LLM directly to your source files.
* Aider will [automatically commit each changeset to your local git repo](https://aider.chat/docs/faq.html#how-does-aider-use-git) with a descriptive commit message. These frequent, automatic commits provide a safety net. It's easy to undo changes or use standard git workflows to manage longer sequences of changes.

View file

@@ -371,8 +371,9 @@ class Commands:
else:
if is_image_file(matched_file) and not self.coder.main_model.accepts_images:
self.io.tool_error(
f"Cannot add image file {matched_file} as the model does not support image"
" files"
f"Cannot add image file {matched_file} as the"
f" {self.coder.main_model.name} does not support images.\nYou can run `aider"
" --4turbo` to use GPT-4 Turbo with Vision."
)
continue
content = self.io.read_text(abs_file_path)

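The hunk above is a capability check: before an image file is added to the chat, aider asks whether the active model can accept images, and if not it prints the new, more specific error message. A minimal standalone sketch of that pattern (the `Model` class and helpers below are illustrative stand-ins, not aider's actual code):

```python
from dataclasses import dataclass

IMAGE_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".webp"}


@dataclass
class Model:
    name: str
    accepts_images: bool = False


def is_image_file(path: str) -> bool:
    # Crude extension check, standing in for aider's image-file helper.
    return any(path.lower().endswith(ext) for ext in IMAGE_EXTENSIONS)


def try_add_file(path: str, model: Model) -> bool:
    """Return True if the file may be added to the chat with this model."""
    if is_image_file(path) and not model.accepts_images:
        print(
            f"Cannot add image file {path} as the {model.name} model does not"
            " support images.\nYou can run `aider --4turbo` to use GPT-4 Turbo"
            " with Vision."
        )
        return False
    return True


# A text-only model rejects the image; a vision-capable model accepts it.
print(try_add_file("diagram.png", Model("gpt-4-0613")))         # False
print(try_add_file("diagram.png", Model("gpt-4-turbo", True)))  # True
```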
View file

@@ -163,6 +163,12 @@ def main(argv=None, input=None, output=None, force_git_root=None):
env_var="OPENAI_API_KEY",
help="Specify the OpenAI API key",
)
core_group.add_argument(
"--anthropic-api-key",
metavar="ANTHROPIC_API_KEY",
env_var="ANTHROPIC_API_KEY",
help="Specify the Anthropic API key",
)
default_model = models.DEFAULT_MODEL_NAME
core_group.add_argument(
"--model",
@@ -178,6 +184,14 @@ def main(argv=None, input=None, output=None, force_git_root=None):
const=opus_model,
help=f"Use {opus_model} model for the main chat",
)
sonnet_model = "claude-3-sonnet-20240229"
core_group.add_argument(
"--sonnet",
action="store_const",
dest="model",
const=sonnet_model,
help=f"Use {sonnet_model} model for the main chat",
)
default_4_model = "gpt-4-0613"
core_group.add_argument(
"--4",
@@ -187,7 +201,7 @@ def main(argv=None, input=None, output=None, force_git_root=None):
const=default_4_model,
help=f"Use {default_4_model} model for the main chat",
)
default_4_turbo_model = "gpt-4-1106-preview"
default_4_turbo_model = "gpt-4-turbo"
core_group.add_argument(
"--4turbo",
"--4-turbo",
@@ -553,7 +567,9 @@ def main(argv=None, input=None, output=None, force_git_root=None):
def scrub_sensitive_info(text):
# Replace sensitive information with placeholder
if text and args.openai_api_key:
return text.replace(args.openai_api_key, "***")
text = text.replace(args.openai_api_key, "***")
if text and args.anthropic_api_key:
text = text.replace(args.anthropic_api_key, "***")
return text
if args.verbose:
@@ -567,6 +583,9 @@ def main(argv=None, input=None, output=None, force_git_root=None):
io.tool_output(*map(scrub_sensitive_info, sys.argv), log_only=True)
if args.anthropic_api_key:
os.environ["ANTHROPIC_API_KEY"] = args.anthropic_api_key
if args.openai_api_key:
os.environ["OPENAI_API_KEY"] = args.openai_api_key
if args.openai_api_base:

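Taken together, the main.py hunks above follow one pattern: keys passed as command line switches are exported into the environment so the underlying libraries can find them, and every known key is masked before the argv line is written to the verbose log. A rough standalone sketch of that pattern, reconstructed from the diff rather than copied verbatim (the hard-coded key values are placeholders):

```python
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--openai-api-key")
parser.add_argument("--anthropic-api-key")
args = parser.parse_args(
    ["--openai-api-key", "sk-openai-example", "--anthropic-api-key", "sk-ant-example"]
)


def scrub_sensitive_info(text):
    # Replace sensitive information with a placeholder before logging.
    if text and args.openai_api_key:
        text = text.replace(args.openai_api_key, "***")
    if text and args.anthropic_api_key:
        text = text.replace(args.anthropic_api_key, "***")
    return text


# Keys given as switches are exported for the libraries that read them.
if args.anthropic_api_key:
    os.environ["ANTHROPIC_API_KEY"] = args.anthropic_api_key
if args.openai_api_key:
    os.environ["OPENAI_API_KEY"] = args.openai_api_key

print(scrub_sensitive_info("aider --openai-api-key sk-openai-example"))
# -> aider --openai-api-key ***
```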
docs/connect.md (new file, 121 lines)
View file

@@ -0,0 +1,121 @@
# Connecting aider to LLMs
Aider works well with OpenAI's GPT 3.5, GPT-4, GPT-4 Turbo with Vision and
Anthropic's Claude 3 Opus and Sonnet.
Aider also has support for connecting to almost any LLM, but it may be less effective
with such models because of their reduced capabilities.
For comparison, GPT-3.5 is just barely capable of *editing code* to provide aider's
interactive "pair programming" style workflow.
So models that are less capable than GPT-3.5 may struggle to perform well with aider.
## OpenAI
To work with OpenAI's models, you need to provide your
[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key)
either in the `OPENAI_API_KEY` environment variable or
via the `--openai-api-key` command line switch.
Aider has some built-in shortcuts for the most popular OpenAI models and
has been tested and benchmarked to work well with them:
- OpenAI's GPT-4 Turbo: `aider` with no args uses GPT-4 Turbo by default.
- OpenAI's GPT-4 Turbo with Vision: `aider --4turbo` will use this vision capable model, allowing you to share images with GPT by adding them to the chat with `/add` or by naming them on the command line.
- OpenAI's GPT-3.5 Turbo: `aider --35turbo`
You can use `aider --model <model-name>` to use any other OpenAI model.
## Anthropic
To work with Anthropic's models, you need to provide your
[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api)
either in the `ANTHROPIC_API_KEY` environment variable or
via the `--anthropic-api-key` command line switch.
Aider has some built-in shortcuts for the most popular Anthropic models and
has been tested and benchmarked to work well with them:
- Anthropic's Claude 3 Opus: `aider --opus`
- Anthropic's Claude 3 Sonnet: `aider --sonnet`
You can use `aider --model <model-name>` to use any other Anthropic model.
## Azure
Aider can be configured to connect to the OpenAI models on Azure.
You can run aider with the following arguments to connect to Azure:
```
$ aider \
--openai-api-type azure \
--openai-api-key your-key-goes-here \
--openai-api-base https://example-endpoint.openai.azure.com \
--openai-api-version 2023-05-15 \
--openai-api-deployment-id deployment-name \
...
```
You could also store those values in an `.aider.conf.yml` file in your home directory:
```
openai-api-type: azure
openai-api-key: your-key-goes-here
openai-api-base: https://example-endpoint.openai.azure.com
openai-api-version: 2023-05-15
openai-api-deployment-id: deployment-name
```
Or you can populate the following environment variables instead:
```
OPENAI_API_TYPE=azure
OPENAI_API_KEY=your-key-goes-here
OPENAI_API_BASE=https://example-endpoint.openai.azure.com
OPENAI_API_VERSION=2023-05-15
OPENAI_API_DEPLOYMENT_ID=deployment-name
```
See the
[official Azure documentation on using OpenAI models](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-python)
for more information on how to populate the above configuration values.
## OpenAI compatible APIs
If you can make an LLM accessible via an OpenAI compatible API,
you can use `--openai-api-base` to connect to a different API endpoint.
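To make that concrete: any server that speaks the OpenAI chat-completions protocol can sit behind such a base URL. A hedged sketch using the `openai` Python client, where the localhost URL, key, and model name are placeholders for whatever your server exposes (this illustrates the protocol, not aider's internals):

```python
from openai import OpenAI

# Point the standard OpenAI client at an OpenAI compatible endpoint,
# much as `aider --openai-api-base <url>` points aider at one.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # placeholder: your server's URL
    api_key="sk-placeholder",             # placeholder: whatever key your server expects
)

response = client.chat.completions.create(
    model="local-model-name",  # placeholder: a model your server serves
    messages=[{"role": "user", "content": "write a js script that prints hello world"}],
)
print(response.choices[0].message.content)
```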
## Other LLMs
Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
to provide connections to hundreds of other models.
You can use `aider --model <provider-name>/<model-name>` to use any supported model.
Depending on which model you access, you may need to provide an API key
or other configuration parameters by setting certain environment variables.
If any required variables are not set, aider will print a brief
error message listing which parameters are needed.
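Under the hood, reaching one of these models boils down to a litellm completion call. A minimal sketch of what that looks like, with the key value and prompt as placeholders (aider's real integration wraps this in its own chat and editing logic):

```python
import os

import litellm

# litellm reads provider keys from the environment, e.g. ANTHROPIC_API_KEY
# for Claude models or OPENAI_API_KEY for GPT models.
os.environ.setdefault("ANTHROPIC_API_KEY", "sk-ant-example")  # placeholder

response = litellm.completion(
    model="claude-3-opus-20240229",  # or any <provider-name>/<model-name> litellm supports
    messages=[{"role": "user", "content": "write a js script that prints hello world"}],
)
print(response.choices[0].message.content)
```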
To explore the list of supported models you can run `aider --model <name>`.
If it's not an exact match for a model, aider will
return a list of possible matching models.
For example `aider --model 3.5` will return the following list of models:
- gpt-3.5-turbo
- gpt-3.5-turbo-0301
- gpt-3.5-turbo-0613
- gpt-3.5-turbo-1106
- gpt-3.5-turbo-0125
- gpt-3.5-turbo-16k
- gpt-3.5-turbo-16k-0613
- ft:gpt-3.5-turbo
- azure/gpt-3.5-turbo-instruct-0914
- gpt-3.5-turbo-instruct
- gpt-3.5-turbo-instruct-0914
- openrouter/openai/gpt-3.5-turbo
- openrouter/openai/gpt-3.5-turbo-16k
- deepinfra/openchat/openchat_3.5
Or, see the [list of providers supported by litellm](https://docs.litellm.ai/docs/providers)
for more details.

View file

@@ -2,7 +2,6 @@
# Frequently asked questions
- [How does aider use git?](#how-does-aider-use-git)
- [GPT-4 vs GPT-3.5](#gpt-4-vs-gpt-35)
- [Can I use aider with other LLMs, local LLMs, etc?](#can-i-use-aider-with-other-llms-local-llms-etc)
- [Accessing other LLMs with OpenRouter](#accessing-other-llms-with-openrouter)
- [Aider isn't editing my files?](#aider-isnt-editing-my-files)
@@ -40,42 +39,6 @@ While it is not recommended, you can disable aider's use of git in a few ways:
- `--no-dirty-commits` will stop aider from committing dirty files before applying GPT's edits.
- `--no-git` will completely stop aider from using git on your files. You should ensure you are keeping sensible backups of the files you are working with.
## GPT-4 vs GPT-3.5
Aider supports all of OpenAI's chat models,
and uses GPT-4 Turbo by default.
It has a large context window, good coding skills and
generally obeys the instructions in the system prompt.
You can choose another model with the `--model` command line argument
or one of these shortcuts:
```
aider -4 # to use gpt-4-0613
aider -3 # to use gpt-3.5-turbo-0125
```
The older `gpt-4-0613` model is a great choice if GPT-4 Turbo is having
trouble with your coding task, although it has a smaller context window
which can be a real limitation.
All the GPT-4 models are able to structure code edits as "diffs"
and use a
[repository map](https://aider.chat/docs/repomap.html)
to improve its ability to make changes in larger codebases.
GPT-3.5 is
limited to editing somewhat smaller codebases.
It is less able to follow instructions and
so can't reliably return code edits as "diffs".
Aider disables the
repository map
when using GPT-3.5.
For detailed quantitative comparisons of the various models, please see the
[aider blog](https://aider.chat/blog/)
which contains many benchmarking articles.
## Can I use aider with other LLMs, local LLMs, etc?
Aider provides experimental support for LLMs other than OpenAI's GPT-3.5 and GPT-4. The support is currently only experimental for two reasons:
@@ -105,54 +68,13 @@ are relevant tools to serve local models via an OpenAI compatible API.
### Azure
Aider can be configured to connect to the OpenAI models on Azure.
Aider supports the configuration changes specified in the
[official openai python library docs](https://github.com/openai/openai-python#microsoft-azure-endpoints).
You should be able to run aider with the following arguments to connect to Azure:
```
$ aider \
--openai-api-type azure \
--openai-api-key your-key-goes-here \
--openai-api-base https://example-endpoint.openai.azure.com \
--openai-api-version 2023-05-15 \
--openai-api-deployment-id deployment-name \
...
```
You could also store those values in an `.aider.conf.yml` file in your home directory:
```
openai-api-type: azure
openai-api-key: your-key-goes-here
openai-api-base: https://example-endpoint.openai.azure.com
openai-api-version: 2023-05-15
openai-api-deployment-id: deployment-name
```
See the
[official Azure documentation on using OpenAI models](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-python)
for more information on how to populate the above configuration values.
See the documentation on connecting to LLMs for details on
[connecting aider to Azure](https://aider.chat/docs/connect.html#azure).
## Accessing other LLMs with OpenRouter
[OpenRouter](https://openrouter.ai) provides an interface to [many models](https://openrouter.ai/models) which are not widely accessible, in particular Claude 3 Opus.
To access the OpenRouter models, simply:
```
# Install aider
pip install aider-chat
# Setup OpenRouter access
export OPENAI_API_KEY=<your-openrouter-key>
export OPENAI_API_BASE=https://openrouter.ai/api/v1
# For example, run aider with Claude 3 Opus using the diff editing format
aider --model anthropic/claude-3-opus --edit-format diff
```
See the documentation on connecting to LLMs for details on
[connecting aider to OpenRouter](https://aider.chat/docs/connect.html).
## Aider isn't editing my files?

View file

@@ -2,9 +2,10 @@
# Installing aider
- [Install git](#install-git)
- [Get your OpenAI API key](#get-your-openai-api-key)
- [Get your API key](#get-your-api-key)
- [Windows install](#windows-install)
- [Mac/Linux install](#maclinux-install)
- [Working with other LLMs](https://aider.chat/docs/connect.html)
- [Tutorial videos](#tutorial-videos)
## Install git
@@ -13,33 +14,48 @@ Make sure you have git installed.
Here are
[instructions for installing git in various environments](https://github.com/git-guides/install-git).
## Get your OpenAI API key
## Get your API key
You need a paid
To work with OpenAI's GPT 3.5 or GPT-4 models you need a paid
[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key).
Note that this is different than being a "ChatGPT Plus" subscriber.
To work with Anthropic's models like Claude 3 Opus you need a paid
[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api).
## Windows install
```
# Install aider
py -m pip install aider-chat
# Launch aider
aider --openai-api-key sk-xxxxxxxxxxxxxxx
# To work with GPT-4 Turbo:
$ aider --openai-api-key sk-xxx... --4turbo
# To work with Claude 3 Opus:
$ aider --anthropic-api-key sk-xxx... --opus
```
## Mac/Linux install
```
# Install aider
python -m pip install aider-chat
# Launch aider
aider --openai-api-key sk-xxxxxxxxxxxxxxx
# To work with GPT-4 Turbo:
$ aider --openai-api-key sk-xxx... --4turbo
# To work with Claude 3 Opus:
$ aider --anthropic-api-key sk-xxx... --opus
```
## Working with other LLMs
Aider works well with GPT 3.5, GPT-4, GPT-4 Turbo with Vision,
and Claude 3 Opus.
It also has support for [connecting to almost any LLM](https://aider.chat/docs/connect.html).
## Tutorial videos
Here are a few tutorial videos: