mirror of
https://github.com/Aider-AI/aider.git
synced 2025-05-30 17:24:59 +00:00
Broke apart llms
This commit is contained in:
parent 65f80e965f
commit c5e127d6fa
15 changed files with 444 additions and 384 deletions
@ -44,8 +44,8 @@ $ aider --opus
- New features, changes, improvements, or bug fixes to your code.
- New test cases, updated documentation or code refactors.
- Paste in a GitHub issue url that needs to be solved.
- Aider will edit your files.
- Aider [automatically git commits changes](https://aider.chat/docs/faq.html#how-does-aider-use-git) with a sensible commit message.
- Aider will edit your files to complete your request.
- Aider [automatically git commits](https://aider.chat/docs/faq.html#how-does-aider-use-git) changes with a sensible commit message.
- Aider works with [most popular languages](https://aider.chat/docs/languages.html): python, javascript, typescript, php, html, css, and more...
- Aider works well with GPT-4o, Claude 3 Opus, GPT-3.5 and supports [connecting to many LLMs](https://aider.chat/docs/llms.html).
- Aider can make coordinated changes across multiple files at once.
@ -55,6 +55,7 @@ Aider will notice and always use the latest version.
So you can bounce back and forth between aider and your editor, to collaboratively code with AI.
- Images can be added to the chat (GPT-4o, GPT-4 Turbo, etc).
- URLs can be added to the chat and aider will read their content.
- [Code with your voice](https://aider.chat/docs/voice.html) using speech recognition.

## Documentation

@ -1,6 +1,7 @@
---
title: Connecting to LLMs
nav_order: 70
has_children: true
---

# Aider can connect to most LLMs

@ -47,388 +48,6 @@ this is usually because the model isn't capable of properly
returning "code edits".
Models weaker than GPT-3.5 may have problems working well with aider.

## Configuring models
{: .no_toc }

- TOC
{:toc}

Aider uses the LiteLLM package to connect to LLM providers.
The [LiteLLM provider docs](https://docs.litellm.ai/docs/providers)
contain more detail on all the supported providers,
their models and any required environment variables.

## OpenAI

To work with OpenAI's models, you need to provide your
[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key)
either in the `OPENAI_API_KEY` environment variable or
via the `--openai-api-key` command line switch.

Aider has some built in shortcuts for the most popular OpenAI models and
has been tested and benchmarked to work well with them:

```
pip install aider-chat

export OPENAI_API_KEY=<key> # Mac/Linux
setx   OPENAI_API_KEY <key> # Windows

# GPT-4o is the best model, used by default
aider

# GPT-4 Turbo (1106)
aider --4-turbo

# GPT-3.5 Turbo
aider --35-turbo

# List models available from OpenAI
aider --models openai/
```

You can use `aider --model <model-name>` to use any other OpenAI model.
For example, if you want to use a specific version of GPT-4 Turbo
you could do `aider --model gpt-4-0125-preview`.

## Anthropic

To work with Anthropic's models, you need to provide your
[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api)
either in the `ANTHROPIC_API_KEY` environment variable or
via the `--anthropic-api-key` command line switch.

Aider has some built in shortcuts for the most popular Anthropic models and
has been tested and benchmarked to work well with them:

```
pip install aider-chat

export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx   ANTHROPIC_API_KEY <key> # Windows

# Claude 3 Opus
aider --opus

# Claude 3 Sonnet
aider --sonnet

# List models available from Anthropic
aider --models anthropic/
```

You can use `aider --model <model-name>` to use any other Anthropic model.
For example, if you want to use a specific version of Opus
you could do `aider --model claude-3-opus-20240229`.

## Gemini

Google currently offers
[*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing).
This is the most capable free model to use with aider,
with code editing capability that's comparable to GPT-3.5.
You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey).

```
pip install aider-chat

export GEMINI_API_KEY=<key> # Mac/Linux
setx   GEMINI_API_KEY <key> # Windows

aider --model gemini/gemini-1.5-pro-latest

# List models available from Gemini
aider --models gemini/
```

## GROQ

Groq currently offers *free* API access to the models they host.
The Llama 3 70B model works
well with aider and is comparable to GPT-3.5 in code editing performance.
You'll need a [Groq API key](https://console.groq.com/keys).

To use **Llama3 70B**:

```
pip install aider-chat

export GROQ_API_KEY=<key> # Mac/Linux
setx   GROQ_API_KEY <key> # Windows

aider --model groq/llama3-70b-8192

# List models available from Groq
aider --models groq/
```

## Cohere

Cohere offers *free* API access to their models.
Their Command-R+ model works well with aider
as a *very basic* coding assistant.
You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login).

To use **Command-R+**:

```
pip install aider-chat

export COHERE_API_KEY=<key> # Mac/Linux
setx   COHERE_API_KEY <key> # Windows

aider --model command-r-plus

# List models available from Cohere
aider --models cohere_chat/
```

## Azure

Aider can connect to the OpenAI models on Azure.

```
pip install aider-chat

# Mac/Linux:
export AZURE_API_KEY=<key>
export AZURE_API_VERSION=2023-05-15
export AZURE_API_BASE=https://myendpt.openai.azure.com

# Windows:
setx AZURE_API_KEY <key>
setx AZURE_API_VERSION 2023-05-15
setx AZURE_API_BASE https://myendpt.openai.azure.com

aider --model azure/<your_deployment_name>

# List models available from Azure
aider --models azure/
```

## OpenRouter

Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly).
You'll need an [OpenRouter API key](https://openrouter.ai/keys).

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

# Or any other OpenRouter model
aider --model openrouter/<provider>/<model>

# List models available from OpenRouter
aider --models openrouter/
```

In particular, Llama3 70B works well with aider, at low cost:

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

aider --model openrouter/meta-llama/llama-3-70b-instruct
```

## Ollama

Aider can connect to local Ollama models.

```
# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>
```

In particular, `llama3:70b` works very well with aider:

```
ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b
```

Also see the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.

## Deepseek

Aider can connect to the Deepseek.com API.
Deepseek appears to grant 5M tokens of free API usage to new accounts.

```
pip install aider-chat

export DEEPSEEK_API_KEY=<key> # Mac/Linux
setx   DEEPSEEK_API_KEY <key> # Windows

# Use Deepseek Chat v2
aider --model deepseek/deepseek-chat
```

See the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.

## OpenAI compatible APIs

Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.

```
pip install aider-chat

# Mac/Linux:
export OPENAI_API_BASE=<endpoint>
export OPENAI_API_KEY=<key>

# Windows:
setx OPENAI_API_BASE <endpoint>
setx OPENAI_API_KEY <key>

# Prefix the model name with openai/
aider --model openai/<model-name>
```

See the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.

## Other LLMs

Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
to connect to hundreds of other models.
You can use `aider --model <model-name>` to use any supported model.

To explore the list of supported models you can run `aider --models <model-name>`
with a partial model name.
If the supplied name is not an exact match for a known model, aider will
return a list of possible matching models.
For example:

```
$ aider --models turbo

Aider v0.29.3-dev
Models which match "turbo":
- gpt-4-turbo-preview (openai/gpt-4-turbo-preview)
- gpt-4-turbo (openai/gpt-4-turbo)
- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09)
- gpt-3.5-turbo (openai/gpt-3.5-turbo)
- ...
```

See the [list of providers supported by litellm](https://docs.litellm.ai/docs/providers)
for more details.

## Model warnings

Aider supports connecting to almost any LLM,
but it may not work well with less capable models.
If you see the model returning code, but aider isn't able to edit your files
and commit the changes...
this is usually because the model isn't capable of properly
returning "code edits".
Models weaker than GPT-3.5 may have problems working well with aider.

Aider tries to sanity check that it is configured correctly
to work with the specified model:

- It checks to see that all required environment variables are set for the model. These variables are required to configure things like API keys, API base URLs, etc.
- It checks a metadata database to look up the context window size and token costs for the model.

Sometimes one or both of these checks will fail, so aider will issue
some of the following warnings.

#### Missing environment variables

```
Model azure/gpt-4-turbo: Missing these environment variables:
- AZURE_API_BASE
- AZURE_API_VERSION
- AZURE_API_KEY
```

You need to set the listed environment variables.
Otherwise you will get error messages when you start chatting with the model.

#### Unknown which environment variables are required

```
Model gpt-5: Unknown which environment variables are required.
```

Aider is unable to verify the environment because it doesn't know
which variables are required for the model.
If required variables are missing,
you may get errors when you attempt to chat with the model.
You can look in the
[litellm provider documentation](https://docs.litellm.ai/docs/providers)
to see if the required variables are listed there.

#### Unknown model, did you mean?

```
Model gpt-5: Unknown model, context window size and token costs unavailable.
Did you mean one of these?
- gpt-4
```

If you specify a model that aider has never heard of, you will get an
"unknown model" warning.
This means aider doesn't know the context window size and token costs
for that model.
Some minor functionality will be limited when using such models, but
it's not really a significant problem.

Aider will also try to suggest similarly named models,
in case you made a typo or mistake when specifying the model name.

## Editing format

Aider uses different "edit formats" to collect code edits from different LLMs.
The "whole" format is the easiest for an LLM to use, but it uses a lot of tokens
and may limit how large a file can be edited.
Models which can use one of the diff formats are much more efficient,
using far fewer tokens.
Models that use a diff-like format are able to
edit larger files with less cost and without hitting token limits.

Aider is configured to use the best edit format for the popular OpenAI and Anthropic models
and the [other models recommended on the LLM page](https://aider.chat/docs/llms.html).
For lesser known models aider will default to using the "whole" editing format
since it is the easiest format for an LLM to use.

If you would like to experiment with the more advanced formats, you can
use these switches: `--edit-format diff` or `--edit-format udiff`.

# Using a .env file

Aider will read environment variables from a `.env` file in
@ -452,3 +71,8 @@ AZURE_API_BASE=https://example-endpoint.openai.azure.com

OLLAMA_API_BASE=http://127.0.0.1:11434
```
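As an illustration, a minimal `.env` file for aider might look like this (a sketch with placeholder values, using the variable names shown in the provider sections above):

```shell
OPENAI_API_KEY=placeholder-openai-key
ANTHROPIC_API_KEY=placeholder-anthropic-key
OLLAMA_API_BASE=http://127.0.0.1:11434
```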
34 website/docs/llms/anthropic.md Normal file
@ -0,0 +1,34 @@
---
parent: Connecting to LLMs
nav_order: 200
---

# Anthropic

To work with Anthropic's models, you need to provide your
[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api)
either in the `ANTHROPIC_API_KEY` environment variable or
via the `--anthropic-api-key` command line switch.

Aider has some built in shortcuts for the most popular Anthropic models and
has been tested and benchmarked to work well with them:

```
pip install aider-chat

export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx   ANTHROPIC_API_KEY <key> # Windows

# Claude 3 Opus
aider --opus

# Claude 3 Sonnet
aider --sonnet

# List models available from Anthropic
aider --models anthropic/
```

You can use `aider --model <model-name>` to use any other Anthropic model.
For example, if you want to use a specific version of Opus
you could do `aider --model claude-3-opus-20240229`.
27 website/docs/llms/azure.md Normal file
@ -0,0 +1,27 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# Azure

Aider can connect to the OpenAI models on Azure.

```
pip install aider-chat

# Mac/Linux:
export AZURE_API_KEY=<key>
export AZURE_API_VERSION=2023-05-15
export AZURE_API_BASE=https://myendpt.openai.azure.com

# Windows:
setx AZURE_API_KEY <key>
setx AZURE_API_VERSION 2023-05-15
setx AZURE_API_BASE https://myendpt.openai.azure.com

aider --model azure/<your_deployment_name>

# List models available from Azure
aider --models azure/
```
25 website/docs/llms/cohere.md Normal file
@ -0,0 +1,25 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# Cohere

Cohere offers *free* API access to their models.
Their Command-R+ model works well with aider
as a *very basic* coding assistant.
You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login).

To use **Command-R+**:

```
pip install aider-chat

export COHERE_API_KEY=<key> # Mac/Linux
setx   COHERE_API_KEY <key> # Windows

aider --model command-r-plus

# List models available from Cohere
aider --models cohere_chat/
```
24 website/docs/llms/deepseek.md Normal file
@ -0,0 +1,24 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# Deepseek

Aider can connect to the Deepseek.com API.
Deepseek appears to grant 5M tokens of free API usage to new accounts.

```
pip install aider-chat

export DEEPSEEK_API_KEY=<key> # Mac/Linux
setx   DEEPSEEK_API_KEY <key> # Windows

# Use Deepseek Chat v2
aider --model deepseek/deepseek-chat
```

See the [model warnings](warnings.html)
section for information on warnings which will occur
when working with models that aider is not familiar with.
22 website/docs/llms/editing-format.md Normal file
@ -0,0 +1,22 @@
---
parent: Connecting to LLMs
nav_order: 850
---

# Editing format

Aider uses different "edit formats" to collect code edits from different LLMs.
The "whole" format is the easiest for an LLM to use, but it uses a lot of tokens
and may limit how large a file can be edited.
Models which can use one of the diff formats are much more efficient,
using far fewer tokens.
Models that use a diff-like format are able to
edit larger files with less cost and without hitting token limits.

Aider is configured to use the best edit format for the popular OpenAI and Anthropic models
and the [other models recommended on the LLM page](https://aider.chat/docs/llms.html).
For lesser known models aider will default to using the "whole" editing format
since it is the easiest format for an LLM to use.

If you would like to experiment with the more advanced formats, you can
use these switches: `--edit-format diff` or `--edit-format udiff`.
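For example, to try the diff format with a model that would otherwise default to "whole" (a sketch; the model name is illustrative and the switch comes from the paragraph above):

```shell
pip install aider-chat

# Ask aider to request edits in the diff format
aider --edit-format diff --model openai/<model-name>

# Or try the unified diff format
aider --edit-format udiff --model openai/<model-name>
```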
25 website/docs/llms/gemini.md Normal file
@ -0,0 +1,25 @@
---
parent: Connecting to LLMs
nav_order: 300
---

# Gemini

Google currently offers
[*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing).
This is the most capable free model to use with aider,
with code editing capability that's comparable to GPT-3.5.
You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey).

```
pip install aider-chat

export GEMINI_API_KEY=<key> # Mac/Linux
setx   GEMINI_API_KEY <key> # Windows

aider --model gemini/gemini-1.5-pro-latest

# List models available from Gemini
aider --models gemini/
```
27 website/docs/llms/groq.md Normal file
@ -0,0 +1,27 @@
---
parent: Connecting to LLMs
nav_order: 400
---

# GROQ

Groq currently offers *free* API access to the models they host.
The Llama 3 70B model works
well with aider and is comparable to GPT-3.5 in code editing performance.
You'll need a [Groq API key](https://console.groq.com/keys).

To use **Llama3 70B**:

```
pip install aider-chat

export GROQ_API_KEY=<key> # Mac/Linux
setx   GROQ_API_KEY <key> # Windows

aider --model groq/llama3-70b-8192

# List models available from Groq
aider --models groq/
```
43 website/docs/llms/ollama.md Normal file
@ -0,0 +1,43 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# Ollama

Aider can connect to local Ollama models.

```
# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>
```

In particular, `llama3:70b` works well with aider:

```
ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b
```

See the [model warnings](warnings.html)
section for information on warnings which will occur
when working with models that aider is not familiar with.
27 website/docs/llms/openai-compat.md Normal file
@ -0,0 +1,27 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# OpenAI compatible APIs

Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.

```
pip install aider-chat

# Mac/Linux:
export OPENAI_API_BASE=<endpoint>
export OPENAI_API_KEY=<key>

# Windows:
setx OPENAI_API_BASE <endpoint>
setx OPENAI_API_KEY <key>

# Prefix the model name with openai/
aider --model openai/<model-name>
```

See the [model warnings](warnings.html)
section for information on warnings which will occur
when working with models that aider is not familiar with.
37 website/docs/llms/openai.md Normal file
@ -0,0 +1,37 @@
---
parent: Connecting to LLMs
nav_order: 100
---

# OpenAI

To work with OpenAI's models, you need to provide your
[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key)
either in the `OPENAI_API_KEY` environment variable or
via the `--openai-api-key` command line switch.

Aider has some built in shortcuts for the most popular OpenAI models and
has been tested and benchmarked to work well with them:

```
pip install aider-chat

export OPENAI_API_KEY=<key> # Mac/Linux
setx   OPENAI_API_KEY <key> # Windows

# GPT-4o is the best model, used by default
aider

# GPT-4 Turbo (1106)
aider --4-turbo

# GPT-3.5 Turbo
aider --35-turbo

# List models available from OpenAI
aider --models openai/
```

You can use `aider --model <model-name>` to use any other OpenAI model.
For example, if you want to use a specific version of GPT-4 Turbo
you could do `aider --model gpt-4-0125-preview`.
35 website/docs/llms/openrouter.md Normal file
@ -0,0 +1,35 @@
---
parent: Connecting to LLMs
nav_order: 500
---

# OpenRouter

Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly).
You'll need an [OpenRouter API key](https://openrouter.ai/keys).

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

# Or any other OpenRouter model
aider --model openrouter/<provider>/<model>

# List models available from OpenRouter
aider --models openrouter/
```

In particular, Llama3 70B works well with aider, at low cost:

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

aider --model openrouter/meta-llama/llama-3-70b-instruct
```
39 website/docs/llms/other.md Normal file
@ -0,0 +1,39 @@
---
parent: Connecting to LLMs
nav_order: 800
---

# Other LLMs

Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
to connect to hundreds of other models.
You can use `aider --model <model-name>` to use any supported model.

To explore the list of supported models you can run `aider --models <model-name>`
with a partial model name.
If the supplied name is not an exact match for a known model, aider will
return a list of possible matching models.
For example:

```
$ aider --models turbo

Aider v0.29.3-dev
Models which match "turbo":
- gpt-4-turbo-preview (openai/gpt-4-turbo-preview)
- gpt-4-turbo (openai/gpt-4-turbo)
- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09)
- gpt-3.5-turbo (openai/gpt-3.5-turbo)
- ...
```

See the [model warnings](warnings.html)
section for information on warnings which will occur
when working with models that aider is not familiar with.

## LiteLLM

Aider uses the LiteLLM package to connect to LLM providers.
The [LiteLLM provider docs](https://docs.litellm.ai/docs/providers)
contain more detail on all the supported providers,
their models and any required environment variables.
70 website/docs/llms/warnings.md Normal file
@ -0,0 +1,70 @@
---
parent: Connecting to LLMs
nav_order: 900
---

# Model warnings

Aider supports connecting to almost any LLM,
but it may not work well with less capable models.
If you see the model returning code, but aider isn't able to edit your files
and commit the changes...
this is usually because the model isn't capable of properly
returning "code edits".
Models weaker than GPT-3.5 may have problems working well with aider.

Aider tries to sanity check that it is configured correctly
to work with the specified model:

- It checks to see that all required environment variables are set for the model. These variables are required to configure things like API keys, API base URLs, etc.
- It checks a metadata database to look up the context window size and token costs for the model.

Sometimes one or both of these checks will fail, so aider will issue
some of the following warnings.

## Missing environment variables

```
Model azure/gpt-4-turbo: Missing these environment variables:
- AZURE_API_BASE
- AZURE_API_VERSION
- AZURE_API_KEY
```

You need to set the listed environment variables.
Otherwise you will get error messages when you start chatting with the model.
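For example, the Azure warning above would be resolved by exporting the three listed variables before launching aider (a sketch with placeholder values; the endpoint is the same illustrative one used in the Azure section):

```shell
# Set the variables named in the warning, then start aider
export AZURE_API_KEY=placeholder-key
export AZURE_API_VERSION=2023-05-15
export AZURE_API_BASE=https://myendpt.openai.azure.com
```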
## Unknown which environment variables are required

```
Model gpt-5: Unknown which environment variables are required.
```

Aider is unable to verify the environment because it doesn't know
which variables are required for the model.
If required variables are missing,
you may get errors when you attempt to chat with the model.
You can look in the
[litellm provider documentation](https://docs.litellm.ai/docs/providers)
to see if the required variables are listed there.

## Unknown model, did you mean?

```
Model gpt-5: Unknown model, context window size and token costs unavailable.
Did you mean one of these?
- gpt-4
```

If you specify a model that aider has never heard of, you will get an
"unknown model" warning.
This means aider doesn't know the context window size and token costs
for that model.
Some minor functionality will be limited when using such models, but
it's not really a significant problem.

Aider will also try to suggest similarly named models,
in case you made a typo or mistake when specifying the model name.