From 21fa54d792a5ba25e6d93d3e93e7a6004c3c5083 Mon Sep 17 00:00:00 2001 From: "Paul Gauthier (aider)" Date: Sun, 20 Apr 2025 10:28:01 -0700 Subject: [PATCH] Refactor: Update LLM docs to match gemini.md structure --- aider/website/docs/llms/anthropic.md | 19 +++++++++++------ aider/website/docs/llms/azure.md | 15 +++++++++++-- aider/website/docs/llms/bedrock.md | 25 +++++++++++++++------- aider/website/docs/llms/cohere.md | 15 ++++++++++--- aider/website/docs/llms/deepseek.md | 15 +++++++++++-- aider/website/docs/llms/groq.md | 15 ++++++++++--- aider/website/docs/llms/lm-studio.md | 21 +++++++++++++----- aider/website/docs/llms/ollama.md | 20 +++++++++++++----- aider/website/docs/llms/openai-compat.md | 16 +++++++++++--- aider/website/docs/llms/openai.md | 27 +++++++++++++++--------- aider/website/docs/llms/openrouter.md | 25 +++++++++++----------- aider/website/docs/llms/vertex.md | 9 +++++++- aider/website/docs/llms/xai.md | 16 ++++++++++---- 13 files changed, 173 insertions(+), 65 deletions(-) diff --git a/aider/website/docs/llms/anthropic.md b/aider/website/docs/llms/anthropic.md index cf69ab610..26748b101 100644 --- a/aider/website/docs/llms/anthropic.md +++ b/aider/website/docs/llms/anthropic.md @@ -10,21 +10,26 @@ To work with Anthropic's models, you need to provide your Anthropic API key either in the `ANTHROPIC_API_KEY` environment variable or via the `--anthropic-api-key` command line switch.
-Aider has some built in shortcuts for the most popular Anthropic models and -has been tested and benchmarked to work well with them: +First, install aider: + +{% include install.md %} + +Then configure your API keys: ``` -python -m pip install -U aider-chat - export ANTHROPIC_API_KEY= # Mac/Linux setx ANTHROPIC_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and Anthropic on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project # Aider uses Claude 3.7 Sonnet by default aider -# Claude 3 Opus -aider --model claude-3-opus-20240229 - # List models available from Anthropic aider --list-models anthropic/ ``` diff --git a/aider/website/docs/llms/azure.md b/aider/website/docs/llms/azure.md index c342ec700..7e20fc83d 100644 --- a/aider/website/docs/llms/azure.md +++ b/aider/website/docs/llms/azure.md @@ -7,9 +7,13 @@ nav_order: 500 Aider can connect to the OpenAI models on Azure. -``` -python -m pip install -U aider-chat +First, install aider: +{% include install.md %} + +Then configure your API keys and endpoint: + +``` # Mac/Linux: export AZURE_API_KEY= export AZURE_API_VERSION=2024-12-01-preview @@ -20,6 +24,13 @@ setx AZURE_API_KEY setx AZURE_API_VERSION 2024-12-01-preview setx AZURE_API_BASE https://myendpt.openai.azure.com # ... restart your shell after setx commands +``` + +Start working with aider and Azure on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project aider --model azure/ diff --git a/aider/website/docs/llms/bedrock.md b/aider/website/docs/llms/bedrock.md index c7705918f..f3e2131c7 100644 --- a/aider/website/docs/llms/bedrock.md +++ b/aider/website/docs/llms/bedrock.md @@ -6,10 +6,7 @@ nav_order: 560 # Amazon Bedrock Aider can connect to models provided by Amazon Bedrock. -You will need to have an AWS account with access to the Bedrock service. - -To configure Aider to use the Amazon Bedrock API, you need to set up your AWS credentials. 
-This can be done using the AWS CLI or by setting environment variables. +You will need to have an AWS account with access to the Bedrock service and the specific models you wish to use. ## Select a Model from Amazon Bedrock @@ -37,6 +34,14 @@ feature, you will receive an error message like the following: anthropic.claude-3-7-sonnet-20250219-v1:0 with on-demand throughput isn\xe2\x80\x99t supported. Retry your request with the ID or ARN of an inference profile that contains this model."}' +## Installation and Configuration + +First, install aider: + +{% include install.md %} + +Next, configure your AWS credentials. This can be done using the AWS CLI or by setting environment variables. + ## AWS CLI Configuration If you haven't already, install the [AWS CLI](https://aws.amazon.com/cli/) and configure it with your credentials: @@ -49,7 +54,7 @@ This will prompt you to enter your AWS Access Key ID, Secret Access Key, and def ## Environment Variables -Alternatively, you can set the following environment variables: +You can set the following environment variables: ```bash export AWS_REGION=your_preferred_region @@ -63,7 +68,7 @@ export AWS_PROFILE=your-profile ``` You can add these to your -[.env file](/docs/config/dotenv.html). +`.env` file. ### Set Environment Variables with PowerShell @@ -77,6 +82,8 @@ $env:AWS_REGION = 'us-west-2' # Put whichever AWS region that you'd like, that ## Install boto3 +Aider needs the `boto3` library to connect to Bedrock. 
+ The AWS Bedrock provider requires the `boto3` package in order to function correctly: ```bash @@ -95,12 +102,14 @@ You must install the `boto3` dependency into aider's virtual environment installed via uv tool run --from aider-chat pip install boto3 ``` - -## Running Aider with Bedrock +## Get Started Once your AWS credentials are set up, you can run Aider with the `--model` command line switch, specifying the Bedrock model you want to use: ```bash +# Change directory into your codebase +cd /to/your/project + aider --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0 ``` diff --git a/aider/website/docs/llms/cohere.md b/aider/website/docs/llms/cohere.md index 66ab3c842..ce3e1a795 100644 --- a/aider/website/docs/llms/cohere.md +++ b/aider/website/docs/llms/cohere.md @@ -10,13 +10,22 @@ Their Command-R+ model works well with aider as a *very basic* coding assistant. You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login). -To use **Command-R+**: +First, install aider: + +{% include install.md %} + +Then configure your API keys: ``` -python -m pip install -U aider-chat - export COHERE_API_KEY= # Mac/Linux setx COHERE_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and Cohere on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project aider --model command-r-plus-08-2024 diff --git a/aider/website/docs/llms/deepseek.md b/aider/website/docs/llms/deepseek.md index 72073c1df..0abbf51a9 100644 --- a/aider/website/docs/llms/deepseek.md +++ b/aider/website/docs/llms/deepseek.md @@ -9,11 +9,22 @@ Aider can connect to the DeepSeek.com API. To work with DeepSeek's models, you need to set the `DEEPSEEK_API_KEY` environment variable with your [DeepSeek API key](https://platform.deepseek.com/api_keys). The DeepSeek Chat V3 model has a top score on aider's code editing benchmark.
-``` -python -m pip install -U aider-chat +First, install aider: +{% include install.md %} + +Then configure your API keys: + +``` export DEEPSEEK_API_KEY= # Mac/Linux setx DEEPSEEK_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and DeepSeek on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project # Use DeepSeek Chat v3 aider --model deepseek/deepseek-chat diff --git a/aider/website/docs/llms/groq.md b/aider/website/docs/llms/groq.md index f258e6848..b8e60e719 100644 --- a/aider/website/docs/llms/groq.md +++ b/aider/website/docs/llms/groq.md @@ -10,13 +10,22 @@ The Llama 3 70B model works well with aider and is comparable to GPT-3.5 in code editing performance. You'll need a [Groq API key](https://console.groq.com/keys). -To use **Llama3 70B**: +First, install aider: + +{% include install.md %} + +Then configure your API keys: ``` -python -m pip install -U aider-chat - export GROQ_API_KEY= # Mac/Linux setx GROQ_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and Groq on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project aider --model groq/llama3-70b-8192 diff --git a/aider/website/docs/llms/lm-studio.md b/aider/website/docs/llms/lm-studio.md index 909d3afe1..be9e53845 100644 --- a/aider/website/docs/llms/lm-studio.md +++ b/aider/website/docs/llms/lm-studio.md @@ -5,11 +5,15 @@ nav_order: 400 # LM Studio -To use LM Studio: +Aider can connect to models served by LM Studio. 
+ +First, install aider: + +{% include install.md %} + +Then configure your API key and endpoint: ``` -python -m pip install -U aider-chat - # Must set a value here even if it's a dummy value export LM_STUDIO_API_KEY=dummy-api-key # Mac/Linux setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx @@ -17,12 +21,19 @@ setx LM_STUDIO_API_KEY dummy-api-key # Windows, restart shell after setx # LM Studio default server URL is http://localhost:1234/v1 export LM_STUDIO_API_BASE=http://localhost:1234/v1 # Mac/Linux setx LM_STUDIO_API_BASE http://localhost:1234/v1 # Windows, restart shell after setx - -aider --model lm_studio/ ``` **Note:** Even though LM Studio doesn't require an API key out of the box, the `LM_STUDIO_API_KEY` must have a dummy value like `dummy-api-key` set, or the client request will fail trying to send an empty `Bearer` token. +Start working with aider and LM Studio on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project + +aider --model lm_studio/ +``` + See the [model warnings](warnings.html) section for information on warnings which will occur when working with models that aider is not familiar with. diff --git a/aider/website/docs/llms/ollama.md b/aider/website/docs/llms/ollama.md index 463dc4a3e..a9dbf6c07 100644 --- a/aider/website/docs/llms/ollama.md +++ b/aider/website/docs/llms/ollama.md @@ -7,6 +7,19 @@ nav_order: 500 Aider can connect to local Ollama models. +First, install aider: + +{% include install.md %} + +Then configure your Ollama API endpoint (usually the default): + +```bash +export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux +setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx +``` + +Start working with aider and Ollama on your codebase: + ``` # Pull the model ollama pull @@ -14,11 +27,8 @@ ollama pull # Start your ollama server, increasing the context window to 8k tokens OLLAMA_CONTEXT_LENGTH=8192 ollama serve -# In another terminal window...
-python -m pip install -U aider-chat - -export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux -setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx +# In another terminal window, change directory into your codebase +cd /to/your/project aider --model ollama_chat/ ``` diff --git a/aider/website/docs/llms/openai-compat.md b/aider/website/docs/llms/openai-compat.md index e1b2a73f2..ea45a574f 100644 --- a/aider/website/docs/llms/openai-compat.md +++ b/aider/website/docs/llms/openai-compat.md @@ -7,10 +7,13 @@ Aider can connect to any LLM which is accessible via an OpenAI-compatible API endpoint. -``` -python -m pip install aider-install -aider-install +First, install aider: +{% include install.md %} + +Then configure your API key and endpoint: + +``` # Mac/Linux: export OPENAI_API_BASE= export OPENAI_API_KEY= @@ -19,6 +22,13 @@ export OPENAI_API_KEY= setx OPENAI_API_BASE setx OPENAI_API_KEY # ... restart shell after setx commands +``` + +Start working with aider and your OpenAI-compatible API on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project # Prefix the model name with openai/ aider --model openai/ diff --git a/aider/website/docs/llms/openai.md b/aider/website/docs/llms/openai.md index a9d907afb..e88944644 100644 --- a/aider/website/docs/llms/openai.md +++ b/aider/website/docs/llms/openai.md @@ -10,27 +10,34 @@ To work with OpenAI's models, you need to provide your OpenAI API key either in the `OPENAI_API_KEY` environment variable or via the `--api-key openai=` command line switch.
-Aider has some built in shortcuts for the most popular OpenAI models and -has been tested and benchmarked to work well with them: +First, install aider: + +{% include install.md %} + +Then configure your API keys: ``` -python -m pip install -U aider-chat +export OPENAI_API_KEY= # Mac/Linux +setx OPENAI_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and OpenAI on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project # o3-mini -aider --model o3-mini --api-key openai= +aider --model o3-mini # o1-mini -aider --model o1-mini --api-key openai= +aider --model o1-mini # GPT-4o -aider --model gpt-4o --api-key openai= +aider --model gpt-4o # List models available from OpenAI aider --list-models openai/ - -# You can also store you API key in environment variables (or .env) -export OPENAI_API_KEY= # Mac/Linux -setx OPENAI_API_KEY # Windows, restart shell after setx ``` You can use `aider --model ` to use any other OpenAI model. diff --git a/aider/website/docs/llms/openrouter.md b/aider/website/docs/llms/openrouter.md index f9ec3ea0d..e5e8a48cc 100644 --- a/aider/website/docs/llms/openrouter.md +++ b/aider/website/docs/llms/openrouter.md @@ -8,11 +8,22 @@ nav_order: 500 Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly). You'll need an [OpenRouter API key](https://openrouter.ai/keys).
-``` -python -m pip install -U aider-chat +First, install aider: +{% include install.md %} + +Then configure your API keys: + +``` export OPENROUTER_API_KEY= # Mac/Linux setx OPENROUTER_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and OpenRouter on your codebase: + +```bash +# Change directory into your codebase +cd /to/your/project # Or any other OpenRouter model aider --model openrouter// @@ -23,16 +34,6 @@ aider --list-models openrouter/ In particular, many aider users access Sonnet via OpenRouter. -``` -python -m pip install -U aider-chat - -export OPENROUTER_API_KEY= # Mac/Linux -setx OPENROUTER_API_KEY # Windows, restart shell after setx - -aider --model openrouter/anthropic/claude-3.7-sonnet ``` - - {: .tip } If you get errors, check your [OpenRouter privacy settings](https://openrouter.ai/settings/privacy). diff --git a/aider/website/docs/llms/vertex.md b/aider/website/docs/llms/vertex.md index b7afee42f..9dc82ea38 100644 --- a/aider/website/docs/llms/vertex.md +++ b/aider/website/docs/llms/vertex.md @@ -13,6 +13,10 @@ or service account with permission to use the Vertex AI API. With your chosen login method, the gcloud CLI should automatically set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable which points to the credentials file. +First, install aider: + +{% include install.md %} + To configure Aider to use the Vertex AI API, you need to set `VERTEXAI_PROJECT` (the GCP project ID) and `VERTEXAI_LOCATION` (the GCP region) [environment variables for Aider](/docs/config/dotenv.html).
@@ -27,9 +31,12 @@ VERTEXAI_PROJECT=my-project VERTEXAI_LOCATION=us-east5 ``` -Then you can run aider with the `--model` command line switch, like this: +Start working with aider and Vertex AI on your codebase: ``` +# Change directory into your codebase +cd /to/your/project + aider --model vertex_ai/claude-3-5-sonnet@20240620 ``` diff --git a/aider/website/docs/llms/xai.md b/aider/website/docs/llms/xai.md index 3374cf487..c2334fa3c 100644 --- a/aider/website/docs/llms/xai.md +++ b/aider/website/docs/llms/xai.md @@ -7,14 +7,22 @@ nav_order: 400 You'll need an [xAI API key](https://console.x.ai). -To use xAI: +First, install aider: + +{% include install.md %} + +Then configure your API keys: ```bash -python -m pip install aider-install -aider-install - export XAI_API_KEY= # Mac/Linux setx XAI_API_KEY # Windows, restart shell after setx +``` + +Start working with aider and xAI on your codebase: + +```bash +# Change directory into your codebase
cd /to/your/project # Grok 3 aider --model xai/grok-3-beta
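Every provider page in this patch follows the same three-step pattern: export the provider's API key, `cd` into the codebase, then run `aider --model …`. As a hypothetical sketch (not part of this patch, and the helper name `require_env` is invented for illustration), a small POSIX-shell function could check that the required variable is actually set before launching aider, turning an opaque auth failure into an immediate error:

```shell
#!/bin/sh
# Hypothetical pre-flight check, not part of the patch above:
# fail fast if a required provider key is missing.
require_env() {
    name="$1"
    # Indirect expansion via eval; portable to plain POSIX sh.
    eval "val=\${$name:-}"
    if [ -z "$val" ]; then
        echo "error: $name is not set" >&2
        return 1
    fi
    echo "ok: $name"
}

# Example usage, mirroring the xAI section:
#   require_env XAI_API_KEY && aider --model xai/grok-3-beta
```

The same check works for any of the variables above (`ANTHROPIC_API_KEY`, `OPENROUTER_API_KEY`, `LM_STUDIO_API_BASE`, and so on) by changing the argument.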