Refactor: Update LLM docs to match gemini.md structure

Paul Gauthier (aider) 2025-04-20 10:28:01 -07:00
parent ec7ac60cfc
commit 21fa54d792
13 changed files with 173 additions and 65 deletions

View file

@@ -10,21 +10,26 @@ To work with Anthropic's models, you need to provide your
 either in the `ANTHROPIC_API_KEY` environment variable or
 via the `--anthropic-api-key` command line switch.
 
-Aider has some built in shortcuts for the most popular Anthropic models and
-has been tested and benchmarked to work well with them:
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
 ```
-python -m pip install -U aider-chat
-
 export ANTHROPIC_API_KEY=<key> # Mac/Linux
 setx ANTHROPIC_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and Anthropic on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
 
 # Aider uses Claude 3.7 Sonnet by default
 aider
 
-# Claude 3 Opus
-aider --model claude-3-opus-20240229
-
 # List models available from Anthropic
 aider --list-models anthropic/
 ```

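The restructured Anthropic page hangs the whole flow off `ANTHROPIC_API_KEY`. As a sketch (the key value below is a placeholder, and this guard is not part of aider), a POSIX shell one-liner can fail fast before launching:

```shell
# Placeholder key purely for illustration -- substitute your real key.
export ANTHROPIC_API_KEY="sk-placeholder"

# POSIX ${VAR:?message} aborts with the message if VAR is unset or empty.
: "${ANTHROPIC_API_KEY:?set ANTHROPIC_API_KEY before running aider}"
echo "ANTHROPIC_API_KEY is set"
```

`${VAR:?message}` is a standard POSIX parameter expansion, so the guard works in any `/bin/sh`.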
View file

@@ -7,9 +7,13 @@ nav_order: 500
 Aider can connect to the OpenAI models on Azure.
 
-```
-python -m pip install -U aider-chat
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys and endpoint:
 
+```
 # Mac/Linux:
 export AZURE_API_KEY=<key>
 export AZURE_API_VERSION=2024-12-01-preview

@@ -20,6 +24,13 @@ setx AZURE_API_KEY <key>
 setx AZURE_API_VERSION 2024-12-01-preview
 setx AZURE_API_BASE https://myendpt.openai.azure.com
 # ... restart your shell after setx commands
+```
+
+Start working with aider and Azure on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
 
 aider --model azure/<your_model_deployment_name>

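The Azure hunk parameterizes the model as `azure/<your_model_deployment_name>`. A sketch of composing that argument from a variable (the deployment name below is hypothetical):

```shell
# Hypothetical deployment name; use the name of your Azure OpenAI deployment.
AZURE_DEPLOYMENT="my-gpt-4o"

# aider expects the model argument in the form azure/<deployment-name>.
MODEL="azure/$AZURE_DEPLOYMENT"
echo "$MODEL"   # prints azure/my-gpt-4o
```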
View file

@@ -6,10 +6,7 @@ nav_order: 560
 # Amazon Bedrock
 
 Aider can connect to models provided by Amazon Bedrock.
-You will need to have an AWS account with access to the Bedrock service.
-
-To configure Aider to use the Amazon Bedrock API, you need to set up your AWS credentials.
-This can be done using the AWS CLI or by setting environment variables.
+You will need to have an AWS account with access to the Bedrock service and the specific models you wish to use.
 
 ## Select a Model from Amazon Bedrock

@@ -37,6 +34,14 @@ feature, you will receive an error message like the following:
 anthropic.claude-3-7-sonnet-20250219-v1:0 with on-demand throughput isn't supported. Retry your
 request with the ID or ARN of an inference profile that contains this model."}'
 
+## Installation and Configuration
+
+First, install aider:
+
+{% include install.md %}
+
+Next, configure your AWS credentials. This can be done using the AWS CLI or by setting environment variables.
+
 ## AWS CLI Configuration
 
 If you haven't already, install the [AWS CLI](https://aws.amazon.com/cli/) and configure it with your credentials:

@@ -49,7 +54,7 @@ This will prompt you to enter your AWS Access Key ID, Secret Access Key, and def
 ## Environment Variables
 
-Alternatively, you can set the following environment variables:
+You can set the following environment variables:
 
 ```bash
 export AWS_REGION=your_preferred_region

@@ -63,7 +68,7 @@ export AWS_PROFILE=your-profile
 ```
 
 You can add these to your
-[.env file](/docs/config/dotenv.html).
+`.env` file.
 
 ### Set Environment Variables with PowerShell

@@ -77,6 +82,8 @@ $env:AWS_REGION = 'us-west-2' # Put whichever AWS region that you'd like, that
 ## Install boto3
 
+Aider needs the `boto3` library to connect to Bedrock.
 The AWS Bedrock provider requires the `boto3` package in order to function correctly:
 
 ```bash

@@ -95,12 +102,14 @@ You must install `boto3` dependency to aider's virtual environment installed via
 uv tool run --from aider-chat pip install boto3
 ```
 
-## Running Aider with Bedrock
+## Get Started
 
 Once your AWS credentials are set up, you can run Aider with the `--model` command line switch, specifying the Bedrock model you want to use:
 
 ```bash
+# Change directory into your codebase
+cd /to/your/project
+
 aider --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
 ```

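The Bedrock page offers two credential paths, AWS CLI profiles or environment variables. One small sketch, assuming a POSIX shell: fill in a profile name only when the caller has not already chosen one.

```shell
# Start from a clean slate for the demonstration.
unset AWS_PROFILE

# ${VAR:-fallback} substitutes "default" without clobbering a caller's choice.
AWS_PROFILE="${AWS_PROFILE:-default}"
export AWS_PROFILE
echo "using AWS profile: $AWS_PROFILE"
```

The same pattern applies to `AWS_REGION` or any of the other variables the hunk lists.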
View file

@@ -10,13 +10,22 @@ Their Command-R+ model works well with aider
 as a *very basic* coding assistant.
 
 You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login).
 
-To use **Command-R+**:
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
 ```
-python -m pip install -U aider-chat
-
 export COHERE_API_KEY=<key> # Mac/Linux
 setx COHERE_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and Cohere on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 aider --model command-r-plus-08-2024

View file

@@ -9,11 +9,22 @@ Aider can connect to the DeepSeek.com API.
 To work with DeepSeek's models, you need to set the `DEEPSEEK_API_KEY` environment variable with your [DeepSeek API key](https://platform.deepseek.com/api_keys).
 
 The DeepSeek Chat V3 model has a top score on aider's code editing benchmark.
 
-```
-python -m pip install -U aider-chat
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
+```
 export DEEPSEEK_API_KEY=<key> # Mac/Linux
 setx DEEPSEEK_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and DeepSeek on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 # Use DeepSeek Chat v3
 aider --model deepseek/deepseek-chat

View file

@@ -10,13 +10,22 @@ The Llama 3 70B model works
 well with aider and is comparable to GPT-3.5 in code editing performance.
 
 You'll need a [Groq API key](https://console.groq.com/keys).
 
-To use **Llama3 70B**:
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
 ```
-python -m pip install -U aider-chat
-
 export GROQ_API_KEY=<key> # Mac/Linux
 setx GROQ_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and Groq on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 aider --model groq/llama3-70b-8192

View file

@@ -5,11 +5,15 @@ nav_order: 400
 # LM Studio
 
-To use LM Studio:
+Aider can connect to models served by LM Studio.
+
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API key and endpoint:
 
 ```
-python -m pip install -U aider-chat
-
+# Must set a value here even if its a dummy value
 export LM_STUDIO_API_KEY=dummy-api-key # Mac/Linux
 setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx

@@ -17,12 +21,19 @@ setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx
 # LM Studio default server URL is http://localhost:1234/v1
 export LM_STUDIO_API_BASE=http://localhost:1234/v1 # Mac/Linux
 setx LM_STUDIO_API_BASE http://localhost:1234/v1 # Windows, restart shell after setx
-
-aider --model lm_studio/<your-model-name>
 ```
 
 **Note:** Even though LM Studio doesn't require an API Key out of the box the `LM_STUDIO_API_KEY` must have a dummy value like `dummy-api-key` set or the client request will fail trying to send an empty `Bearer` token.
 
+Start working with aider and LM Studio on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
+aider --model lm_studio/<your-model-name>
+```
+
 See the [model warnings](warnings.html)
 section for information on warnings which will occur
 when working with models that aider is not familiar with.

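The **Note** in this hunk says an unset key makes the client send an empty `Bearer` token. A sketch of the resulting header string makes that concrete (the echoed header is illustrative, not aider's actual request code):

```shell
# Unset key: the Authorization header would end in an empty token.
unset LM_STUDIO_API_KEY
echo "Authorization: Bearer $LM_STUDIO_API_KEY"

# With the documented dummy value the header is well-formed.
export LM_STUDIO_API_KEY=dummy-api-key
echo "Authorization: Bearer $LM_STUDIO_API_KEY"
```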
View file

@@ -7,6 +7,19 @@ nav_order: 500
 Aider can connect to local Ollama models.
 
+First, install aider:
+
+{% include install.md %}
+
+Then configure your Ollama API endpoint (usually the default):
+
+```bash
+export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
+setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx
+```
+
+Start working with aider and Ollama on your codebase:
+
 ```
 # Pull the model
 ollama pull <model>

@@ -14,11 +27,8 @@ ollama pull <model>
 # Start your ollama server, increasing the context window to 8k tokens
 OLLAMA_CONTEXT_LENGTH=8192 ollama serve
 
-# In another terminal window...
-python -m pip install -U aider-chat
-
-export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
-setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx
+# In another terminal window, change directory into your codebase
+cd /to/your/project
 
 aider --model ollama_chat/<model>
 ```

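The new Ollama hunk pins `OLLAMA_API_BASE` to the server's default address. A small sanity-check sketch (not part of aider) that the configured value at least looks like a URL:

```shell
# Default endpoint from the docs above.
OLLAMA_API_BASE="http://127.0.0.1:11434"

# A case glob is a cheap shape check before handing the value to aider.
case "$OLLAMA_API_BASE" in
    http://*|https://*) echo "endpoint looks ok: $OLLAMA_API_BASE" ;;
    *) echo "OLLAMA_API_BASE should be an http(s) URL" >&2 ;;
esac
```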
View file

@@ -7,10 +7,13 @@ nav_order: 500
 Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
 
-```
-python -m pip install aider-install
-aider-install
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API key and endpoint:
 
+```
 # Mac/Linux:
 export OPENAI_API_BASE=<endpoint>
 export OPENAI_API_KEY=<key>

@@ -19,6 +22,13 @@ export OPENAI_API_KEY=<key>
 setx OPENAI_API_BASE <endpoint>
 setx OPENAI_API_KEY <key>
 # ... restart shell after setx commands
+```
+
+Start working with aider and your OpenAI compatible API on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 # Prefix the model name with openai/
 aider --model openai/<model-name>

View file

@@ -10,27 +10,34 @@ To work with OpenAI's models, you need to provide your
 either in the `OPENAI_API_KEY` environment variable or
 via the `--api-key openai=<key>` command line switch.
 
-Aider has some built in shortcuts for the most popular OpenAI models and
-has been tested and benchmarked to work well with them:
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
 ```
-python -m pip install -U aider-chat
+export OPENAI_API_KEY=<key> # Mac/Linux
+setx OPENAI_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and OpenAI on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
 
 # o3-mini
-aider --model o3-mini --api-key openai=<key>
+aider --model o3-mini
 
 # o1-mini
-aider --model o1-mini --api-key openai=<key>
+aider --model o1-mini
 
 # GPT-4o
-aider --model gpt-4o --api-key openai=<key>
+aider --model gpt-4o
 
 # List models available from OpenAI
 aider --list-models openai/
-
-# You can also store you API key in environment variables (or .env)
-export OPENAI_API_KEY=<key> # Mac/Linux
-setx OPENAI_API_KEY <key> # Windows, restart shell after setx
 ```
 
 You can use `aider --model <model-name>` to use any other OpenAI model.

View file

@@ -8,11 +8,22 @@ nav_order: 500
 Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly):
 
 You'll need an [OpenRouter API key](https://openrouter.ai/keys).
 
-```
-python -m pip install -U aider-chat
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
+```
 export OPENROUTER_API_KEY=<key> # Mac/Linux
 setx OPENROUTER_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and OpenRouter on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 # Or any other open router model
 aider --model openrouter/<provider>/<model>

@@ -23,16 +34,6 @@ aider --list-models openrouter/
 In particular, many aider users access Sonnet via OpenRouter:
 
-```
-python -m pip install -U aider-chat
-
-export OPENROUTER_API_KEY=<key> # Mac/Linux
-setx OPENROUTER_API_KEY <key> # Windows, restart shell after setx
-
-aider --model openrouter/anthropic/claude-3.7-sonnet
-```
-
 {: .tip }
 If you get errors, check your
 [OpenRouter privacy settings](https://openrouter.ai/settings/privacy).

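The OpenRouter hunks address models as `openrouter/<provider>/<model>`. A sketch of composing that path from its two segments (both segment values below come from the removed Sonnet example):

```shell
PROVIDER="anthropic"            # provider segment
MODEL_NAME="claude-3.7-sonnet"  # model segment
MODEL="openrouter/$PROVIDER/$MODEL_NAME"
echo "$MODEL"   # prints openrouter/anthropic/claude-3.7-sonnet
```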
View file

@@ -13,6 +13,10 @@ or service account with permission to use the Vertex AI API.
 With your chosen login method, the gcloud CLI should automatically set the
 `GOOGLE_APPLICATION_CREDENTIALS` environment variable which points to the credentials file.
 
+First, install aider:
+
+{% include install.md %}
+
 To configure Aider to use the Vertex AI API, you need to set `VERTEXAI_PROJECT` (the GCP project ID)
 and `VERTEXAI_LOCATION` (the GCP region) [environment variables for Aider](/docs/config/dotenv.html).

@@ -27,9 +31,12 @@ VERTEXAI_PROJECT=my-project
 VERTEXAI_LOCATION=us-east5
 ```
 
-Then you can run aider with the `--model` command line switch, like this:
+Start working with aider and Vertex AI on your codebase:
 
 ```
+# Change directory into your codebase
+cd /to/your/project
+
 aider --model vertex_ai/claude-3-5-sonnet@20240620
 ```

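Vertex AI is the one provider here that needs two variables at once. A sketch of a guard over both (the values are the placeholders from the hunk above):

```shell
export VERTEXAI_PROJECT="my-project"   # placeholder GCP project ID
export VERTEXAI_LOCATION="us-east5"    # placeholder GCP region

# Report every missing variable instead of stopping at the first one.
missing=0
for var in VERTEXAI_PROJECT VERTEXAI_LOCATION; do
    eval "val=\${$var}"
    if [ -z "$val" ]; then
        echo "missing $var" >&2
        missing=1
    fi
done
[ "$missing" -eq 0 ] && echo "vertex config ok"
```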
View file

@@ -7,14 +7,22 @@ nav_order: 400
 You'll need a [xAI API key](https://console.x.ai.).
 
-To use xAI:
+First, install aider:
+
+{% include install.md %}
+
+Then configure your API keys:
 
 ```bash
-python -m pip install aider-install
-aider-install
-
 export XAI_API_KEY=<key> # Mac/Linux
 setx XAI_API_KEY <key> # Windows, restart shell after setx
+```
+
+Start working with aider and xAI on your codebase:
+
+```bash
+# Change directory into your codebase
+cd /to/your/project
+
 # Grok 3
 aider --model xai/grok-3-beta