Refactor: Update LLM docs to match gemini.md structure

Paul Gauthier (aider) 2025-04-20 10:28:01 -07:00
parent ec7ac60cfc
commit 21fa54d792
13 changed files with 173 additions and 65 deletions

View file

@@ -10,21 +10,26 @@ To work with Anthropic's models, you need to provide your
either in the `ANTHROPIC_API_KEY` environment variable or
via the `--anthropic-api-key` command line switch.
Aider has some built-in shortcuts for the most popular Anthropic models and
has been tested and benchmarked to work well with them:
First, install aider:
{% include install.md %}
Then configure your API keys:
```
python -m pip install -U aider-chat
export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx ANTHROPIC_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and Anthropic on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# Aider uses Claude 3.7 Sonnet by default
aider
# Claude 3 Opus
aider --model claude-3-opus-20240229
# List models available from Anthropic
aider --list-models anthropic/
```

View file

@@ -7,9 +7,13 @@ nav_order: 500
Aider can connect to the OpenAI models on Azure.
```
python -m pip install -U aider-chat
First, install aider:
{% include install.md %}
Then configure your API keys and endpoint:
```
# Mac/Linux:
export AZURE_API_KEY=<key>
export AZURE_API_VERSION=2024-12-01-preview
@@ -20,6 +24,13 @@ setx AZURE_API_KEY <key>
setx AZURE_API_VERSION 2024-12-01-preview
setx AZURE_API_BASE https://myendpt.openai.azure.com
# ... restart your shell after setx commands
```
Start working with aider and Azure on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
aider --model azure/<your_model_deployment_name>
```

View file

@@ -6,10 +6,7 @@ nav_order: 560
# Amazon Bedrock
Aider can connect to models provided by Amazon Bedrock.
You will need to have an AWS account with access to the Bedrock service.
To configure Aider to use the Amazon Bedrock API, you need to set up your AWS credentials.
This can be done using the AWS CLI or by setting environment variables.
You will need to have an AWS account with access to the Bedrock service and the specific models you wish to use.
## Select a Model from Amazon Bedrock
@@ -37,6 +34,14 @@ feature, you will receive an error message like the following:
anthropic.claude-3-7-sonnet-20250219-v1:0 with on-demand throughput isn't supported. Retry your
request with the ID or ARN of an inference profile that contains this model."}'
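If you hit this error, one common fix (a sketch, assuming your AWS account has a matching cross-region inference profile enabled) is to reference the model through an inference profile ID, which typically adds a region prefix such as `us.`:

```bash
# Hypothetical example: the "us." prefix selects a cross-region inference
# profile rather than on-demand throughput; substitute the profile ID or ARN
# that actually exists in your account.
aider --model bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
```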
## Installation and Configuration
First, install aider:
{% include install.md %}
Next, configure your AWS credentials. This can be done using the AWS CLI or by setting environment variables.
## AWS CLI Configuration
If you haven't already, install the [AWS CLI](https://aws.amazon.com/cli/) and configure it with your credentials:
@@ -49,7 +54,7 @@ This will prompt you to enter your AWS Access Key ID, Secret Access Key, and def
## Environment Variables
Alternatively, you can set the following environment variables:
You can set the following environment variables:
```bash
export AWS_REGION=your_preferred_region
@@ -63,7 +68,7 @@ export AWS_PROFILE=your-profile
```
You can add these to your
[.env file](/docs/config/dotenv.html).
`.env` file.
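For example, a `.env` file in your project root might look like this (illustrative placeholder values):

```
AWS_REGION=us-west-2
AWS_ACCESS_KEY_ID=<your-access-key-id>
AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
```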
### Set Environment Variables with PowerShell
@@ -77,6 +82,8 @@ $env:AWS_REGION = 'us-west-2' # Put whichever AWS region that you'd like, that
## Install boto3
Aider needs the `boto3` library to connect to Bedrock.
The AWS Bedrock provider requires the `boto3` package in order to function correctly:
```bash
@@ -95,12 +102,14 @@ You must install the `boto3` dependency into aider's virtual environment installed via
uv tool run --from aider-chat pip install boto3
```
## Running Aider with Bedrock
## Get Started
Once your AWS credentials are set up, you can run Aider with the `--model` command line switch, specifying the Bedrock model you want to use:
```bash
# Change directory into your codebase
cd /to/your/project
aider --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
```

View file

@@ -10,13 +10,22 @@ Their Command-R+ model works well with aider
as a *very basic* coding assistant.
You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login).
To use **Command-R+**:
First, install aider:
{% include install.md %}
Then configure your API keys:
```
python -m pip install -U aider-chat
export COHERE_API_KEY=<key> # Mac/Linux
setx COHERE_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and Cohere on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
aider --model command-r-plus-08-2024
```

View file

@@ -9,11 +9,22 @@ Aider can connect to the DeepSeek.com API.
To work with DeepSeek's models, you need to set the `DEEPSEEK_API_KEY` environment variable with your [DeepSeek API key](https://platform.deepseek.com/api_keys).
The DeepSeek Chat V3 model has a top score on aider's code editing benchmark.
```
python -m pip install -U aider-chat
First, install aider:
{% include install.md %}
Then configure your API keys:
```
export DEEPSEEK_API_KEY=<key> # Mac/Linux
setx DEEPSEEK_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and DeepSeek on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# Use DeepSeek Chat v3
aider --model deepseek/deepseek-chat
```

View file

@@ -10,13 +10,22 @@ The Llama 3 70B model works
well with aider and is comparable to GPT-3.5 in code editing performance.
You'll need a [Groq API key](https://console.groq.com/keys).
To use **Llama3 70B**:
First, install aider:
{% include install.md %}
Then configure your API keys:
```
python -m pip install -U aider-chat
export GROQ_API_KEY=<key> # Mac/Linux
setx GROQ_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and Groq on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
aider --model groq/llama3-70b-8192
```

View file

@@ -5,11 +5,15 @@ nav_order: 400
# LM Studio
To use LM Studio:
Aider can connect to models served by LM Studio.
First, install aider:
{% include install.md %}
Then configure your API key and endpoint:
```
python -m pip install -U aider-chat
# Must set a value here even if it's a dummy value
export LM_STUDIO_API_KEY=dummy-api-key # Mac/Linux
setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx
@@ -17,12 +21,19 @@ setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx
# LM Studio default server URL is http://localhost:1234/v1
export LM_STUDIO_API_BASE=http://localhost:1234/v1 # Mac/Linux
setx LM_STUDIO_API_BASE http://localhost:1234/v1 # Windows, restart shell after setx
aider --model lm_studio/<your-model-name>
```
**Note:** Even though LM Studio doesn't require an API key out of the box, the `LM_STUDIO_API_KEY` must be set to a dummy value like `dummy-api-key`, or the client request will fail when trying to send an empty `Bearer` token.
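As a quick sanity check (assuming LM Studio's default server URL), you can hit its OpenAI-compatible models endpoint; any non-empty token satisfies the `Bearer` requirement:

```bash
# Lists the models LM Studio is serving; the token just needs to be non-empty
curl http://localhost:1234/v1/models \
  -H "Authorization: Bearer dummy-api-key"
```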
Start working with aider and LM Studio on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
aider --model lm_studio/<your-model-name>
```
See the [model warnings](warnings.html)
section for information on warnings which will occur
when working with models that aider is not familiar with.

View file

@@ -7,6 +7,19 @@ nav_order: 500
Aider can connect to local Ollama models.
First, install aider:
{% include install.md %}
Then configure your Ollama API endpoint (usually the default):
```bash
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx
```
Start working with aider and Ollama on your codebase:
```
# Pull the model
ollama pull <model>
@@ -14,11 +27,8 @@ ollama pull <model>
# Start your ollama server, increasing the context window to 8k tokens
OLLAMA_CONTEXT_LENGTH=8192 ollama serve
# In another terminal window...
python -m pip install -U aider-chat
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx
# In another terminal window, change directory into your codebase
cd /to/your/project
aider --model ollama_chat/<model>
```
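If aider can't connect, a quick way to confirm the Ollama server is up (assuming the default address) is to list the models it has pulled:

```bash
# Ollama's REST API returns the locally available models as JSON
curl http://127.0.0.1:11434/api/tags
```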

View file

@@ -7,10 +7,13 @@ nav_order: 500
Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
```
python -m pip install aider-install
aider-install
First, install aider:
{% include install.md %}
Then configure your API key and endpoint:
```
# Mac/Linux:
export OPENAI_API_BASE=<endpoint>
export OPENAI_API_KEY=<key>
@@ -19,6 +22,13 @@ export OPENAI_API_KEY=<key>
setx OPENAI_API_BASE <endpoint>
setx OPENAI_API_KEY <key>
# ... restart shell after setx commands
```
Start working with aider and your OpenAI compatible API on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# Prefix the model name with openai/
aider --model openai/<model-name>
```

View file

@@ -10,27 +10,34 @@ To work with OpenAI's models, you need to provide your
either in the `OPENAI_API_KEY` environment variable or
via the `--api-key openai=<key>` command line switch.
Aider has some built-in shortcuts for the most popular OpenAI models and
has been tested and benchmarked to work well with them:
First, install aider:
{% include install.md %}
Then configure your API keys:
```
python -m pip install -U aider-chat
export OPENAI_API_KEY=<key> # Mac/Linux
setx OPENAI_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and OpenAI on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# o3-mini
aider --model o3-mini --api-key openai=<key>
aider --model o3-mini
# o1-mini
aider --model o1-mini --api-key openai=<key>
aider --model o1-mini
# GPT-4o
aider --model gpt-4o --api-key openai=<key>
aider --model gpt-4o
# List models available from OpenAI
aider --list-models openai/
# You can also store your API key in environment variables (or .env)
export OPENAI_API_KEY=<key> # Mac/Linux
setx OPENAI_API_KEY <key> # Windows, restart shell after setx
```
You can use `aider --model <model-name>` to use any other OpenAI model.

View file

@@ -8,11 +8,22 @@ nav_order: 500
Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly):
You'll need an [OpenRouter API key](https://openrouter.ai/keys).
```
python -m pip install -U aider-chat
First, install aider:
{% include install.md %}
Then configure your API keys:
```
export OPENROUTER_API_KEY=<key> # Mac/Linux
setx OPENROUTER_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and OpenRouter on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# Or any other OpenRouter model
aider --model openrouter/<provider>/<model>
```
@@ -23,16 +34,6 @@ aider --list-models openrouter/
In particular, many aider users access Sonnet via OpenRouter:
```
python -m pip install -U aider-chat
export OPENROUTER_API_KEY=<key> # Mac/Linux
setx OPENROUTER_API_KEY <key> # Windows, restart shell after setx
aider --model openrouter/anthropic/claude-3.7-sonnet
```
{: .tip }
If you get errors, check your
[OpenRouter privacy settings](https://openrouter.ai/settings/privacy).

View file

@@ -13,6 +13,10 @@ or service account with permission to use the Vertex AI API.
With your chosen login method, the gcloud CLI should automatically set the
`GOOGLE_APPLICATION_CREDENTIALS` environment variable which points to the credentials file.
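For example, one common login method (a sketch, assuming you're using your own user credentials rather than a service account) is:

```bash
# Writes application default credentials that the Vertex AI clients pick up
gcloud auth application-default login
```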
First, install aider:
{% include install.md %}
To configure Aider to use the Vertex AI API, you need to set `VERTEXAI_PROJECT` (the GCP project ID)
and `VERTEXAI_LOCATION` (the GCP region) [environment variables for Aider](/docs/config/dotenv.html).
@@ -27,9 +31,12 @@ VERTEXAI_PROJECT=my-project
VERTEXAI_LOCATION=us-east5
```
Then you can run aider with the `--model` command line switch, like this:
Start working with aider and Vertex AI on your codebase:
```
# Change directory into your codebase
cd /to/your/project
aider --model vertex_ai/claude-3-5-sonnet@20240620
```

View file

@@ -7,14 +7,22 @@ nav_order: 400
You'll need an [xAI API key](https://console.x.ai).
To use xAI:
First, install aider:
{% include install.md %}
Then configure your API keys:
```bash
python -m pip install aider-install
aider-install
export XAI_API_KEY=<key> # Mac/Linux
setx XAI_API_KEY <key> # Windows, restart shell after setx
```
Start working with aider and xAI on your codebase:
```bash
# Change directory into your codebase
cd /to/your/project
# Grok 3
aider --model xai/grok-3-beta
```