# Aider can connect to most LLMs

[![connecting to many LLMs](/assets/llms.jpg)](https://aider.chat/assets/llms.jpg)

## Best models

**Aider works best with [GPT-4 Turbo](#openai) and [Claude 3 Opus](#anthropic),** as they are the very best models for editing code.

## Free models

Aider works with a number of **free** API providers:

- Google's [Gemini 1.5 Pro](#gemini) is the most capable free model to use with aider, with code editing capabilities similar to GPT-3.5.
- You can use [Llama 3 70B on Groq](#groq), which is comparable to GPT-3.5 in code editing performance.
- The [Deepseek Coder](#deepseek) model works well with aider, comparable to GPT-3.5. Deepseek.com currently offers 5M free tokens of API usage.
- Cohere also offers free API access to their [Command-R+ model](#cohere), which works with aider as a *very basic* coding assistant.

## Local models

Aider can also work with local models, for example using [Ollama](#ollama). It can also access local models that provide an [OpenAI compatible API](#openai-compatible-apis).

## Use a capable model

Be aware that aider may not work well with less capable models. If you see the model returning code, but aider isn't able to edit your files and commit the changes... this is usually because the model isn't capable of properly returning "code edits". Models weaker than GPT-3.5 may have problems working well with aider.

## Configuring models

- [OpenAI](#openai)
- [Anthropic](#anthropic)
- [Gemini](#gemini)
- [Groq & Llama3](#groq)
- [Cohere](#cohere)
- [Azure](#azure)
- [OpenRouter](#openrouter)
- [Ollama](#ollama)
- [Deepseek](#deepseek)
- [OpenAI compatible APIs](#openai-compatible-apis)
- [Other LLMs](#other-llms)
- [Model warnings](#model-warnings)
- [Editing format](#editing-format)
- [Using a .env file](#using-a-env-file)

Aider uses the LiteLLM package to connect to LLM providers. The [LiteLLM provider docs](https://docs.litellm.ai/docs/providers) contain more detail on all the supported providers, their models and any required environment variables.

## OpenAI

To work with OpenAI's models, you need to provide your [OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key) either in the `OPENAI_API_KEY` environment variable or via the `--openai-api-key` command line switch.

Aider has some built-in shortcuts for the most popular OpenAI models and has been tested and benchmarked to work well with them:

```
pip install aider-chat

export OPENAI_API_KEY=<key> # Mac/Linux
setx   OPENAI_API_KEY <key> # Windows

# GPT-4 Turbo is used by default
aider

# GPT-4 Turbo with Vision
aider --4-turbo-vision

# GPT-3.5 Turbo
aider --35-turbo

# List models available from OpenAI
aider --models openai/
```

You can use `aider --model <model-name>` to use any other OpenAI model. For example, if you want to use a specific version of GPT-4 Turbo you could do `aider --model gpt-4-0125-preview`.
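If you'd rather not set an environment variable, the key can also be supplied with the command line switch mentioned above; a minimal sketch:

```
# Pass the OpenAI API key directly on the command line
# (GPT-4 Turbo is still used by default)
aider --openai-api-key <key>
```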
## Anthropic

To work with Anthropic's models, you need to provide your [Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api) either in the `ANTHROPIC_API_KEY` environment variable or via the `--anthropic-api-key` command line switch.

Aider has some built-in shortcuts for the most popular Anthropic models and has been tested and benchmarked to work well with them:

```
pip install aider-chat

export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx   ANTHROPIC_API_KEY <key> # Windows

# Claude 3 Opus
aider --opus

# Claude 3 Sonnet
aider --sonnet

# List models available from Anthropic
aider --models anthropic/
```

You can use `aider --model <model-name>` to use any other Anthropic model. For example, if you want to use a specific version of Opus you could do `aider --model claude-3-opus-20240229`.

## Gemini

Google currently offers [*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing). This is the most capable free model to use with aider, with code editing capability that's comparable to GPT-3.5. You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey).

```
pip install aider-chat

export GEMINI_API_KEY=<key> # Mac/Linux
setx   GEMINI_API_KEY <key> # Windows

aider --model gemini/gemini-1.5-pro-latest

# List models available from Gemini
aider --models gemini/
```

## Groq

Groq currently offers *free* API access to the models they host. The Llama 3 70B model works well with aider and is comparable to GPT-3.5 in code editing performance. You'll need a [Groq API key](https://console.groq.com/keys).

To use **Llama3 70B**:

```
pip install aider-chat

export GROQ_API_KEY=<key> # Mac/Linux
setx   GROQ_API_KEY <key> # Windows

aider --model groq/llama3-70b-8192

# List models available from Groq
aider --models groq/
```

## Cohere

Cohere offers *free* API access to their models. Their Command-R+ model works well with aider as a *very basic* coding assistant. You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login).

To use **Command-R+**:

```
pip install aider-chat

export COHERE_API_KEY=<key> # Mac/Linux
setx   COHERE_API_KEY <key> # Windows

aider --model command-r-plus

# List models available from Cohere
aider --models cohere_chat/
```

## Azure

Aider can connect to the OpenAI models on Azure.

```
pip install aider-chat

# Mac/Linux:
export AZURE_API_KEY=<key>
export AZURE_API_VERSION=2023-05-15
export AZURE_API_BASE=https://myendpt.openai.azure.com

# Windows:
setx AZURE_API_KEY <key>
setx AZURE_API_VERSION 2023-05-15
setx AZURE_API_BASE https://myendpt.openai.azure.com

aider --model azure/<your_deployment_name>

# List models available from Azure
aider --models azure/
```

## OpenRouter

Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly). You'll need an [OpenRouter API key](https://openrouter.ai/keys).

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

# Or any other OpenRouter model
aider --model openrouter/<provider>/<model>

# List models available from OpenRouter
aider --models openrouter/
```

In particular, Llama3 70B works well with aider, at low cost:

```
pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

aider --model openrouter/meta-llama/llama-3-70b-instruct
```

## Ollama

Aider can connect to local Ollama models.

```
# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>
```

In particular, `llama3:70b` works very well with aider:

```
ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b
```

Also see the [model warnings](#model-warnings) section for information on warnings which will occur when working with models that aider is not familiar with.

## Deepseek

Aider can connect to the Deepseek API, which is OpenAI compatible. They appear to grant 5M tokens of free API usage to new accounts.

```
pip install aider-chat

# Mac/Linux:
export OPENAI_API_KEY=<key>
export OPENAI_API_BASE=https://api.deepseek.com/v1

# Windows:
setx OPENAI_API_KEY <key>
setx OPENAI_API_BASE https://api.deepseek.com/v1

aider --model openai/deepseek-coder
```

See the [model warnings](#model-warnings) section for information on warnings which will occur when working with models that aider is not familiar with.

## OpenAI compatible APIs

Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.

```
pip install aider-chat

# Mac/Linux:
export OPENAI_API_BASE=<endpoint>
export OPENAI_API_KEY=<key>

# Windows:
setx OPENAI_API_BASE <endpoint>
setx OPENAI_API_KEY <key>

# Prefix the model name with openai/
aider --model openai/<model-name>
```

See the [model warnings](#model-warnings) section for information on warnings which will occur when working with models that aider is not familiar with.

## Other LLMs

Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package to connect to hundreds of other models. You can use `aider --model <model-name>` to use any supported model.

To explore the list of supported models you can run `aider --models <model-name>` with a partial model name. If the supplied name is not an exact match for a known model, aider will return a list of possible matching models. For example:

```
$ aider --models turbo

Aider v0.29.3-dev
Models which match "turbo":
- gpt-4-turbo-preview (openai/gpt-4-turbo-preview)
- gpt-4-turbo (openai/gpt-4-turbo)
- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09)
- gpt-3.5-turbo (openai/gpt-3.5-turbo)
- ...
```

See the [list of providers supported by litellm](https://docs.litellm.ai/docs/providers) for more details.

## Model warnings

Aider supports connecting to almost any LLM, but it may not work well with less capable models. If you see the model returning code, but aider isn't able to edit your files and commit the changes... this is usually because the model isn't capable of properly returning "code edits". Models weaker than GPT-3.5 may have problems working well with aider.

Aider tries to sanity check that it is configured correctly to work with the specified model:

- It checks to see that all required environment variables are set for the model. These variables are required to configure things like API keys, API base URLs, etc.
- It checks a metadata database to look up the context window size and token costs for the model.

Sometimes one or both of these checks will fail, so aider will issue some of the following warnings.

#### Missing environment variables

```
Model azure/gpt-4-turbo: Missing these environment variables:
- AZURE_API_BASE
- AZURE_API_VERSION
- AZURE_API_KEY
```

You need to set the listed environment variables. Otherwise you will get error messages when you start chatting with the model.

#### Unknown which environment variables are required

```
Model gpt-5: Unknown which environment variables are required.
```

Aider is unable to verify the environment because it doesn't know which variables are required for the model. If required variables are missing, you may get errors when you attempt to chat with the model. You can look in the [litellm provider documentation](https://docs.litellm.ai/docs/providers) to see if the required variables are listed there.
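If the litellm docs do list the required variables, you can set them yourself before launching aider. A minimal sketch, using a hypothetical variable name and placeholder model:

```
# SOME_PROVIDER_API_KEY is a hypothetical name -- substitute the variable(s)
# listed in the litellm docs for your actual provider
export SOME_PROVIDER_API_KEY=<key> # Mac/Linux
setx   SOME_PROVIDER_API_KEY <key> # Windows

aider --model <provider>/<model>
```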
#### Unknown model, did you mean?

```
Model gpt-5: Unknown model, context window size and token costs unavailable.
Did you mean one of these?
- gpt-4
```

If you specify a model that aider has never heard of, you will get an "unknown model" warning. This means aider doesn't know the context window size and token costs for that model. Some minor functionality will be limited when using such models, but it's not really a significant problem.

Aider will also try to suggest similarly named models, in case you made a typo or mistake when specifying the model name.

## Editing format

Aider uses 3 different "edit formats" to collect code edits from different LLMs:

- `whole` is a "whole file" editing format, where the model edits a file by returning a full new copy of the file with any changes included.
- `diff` is a more efficient diff style format, where the model specifies blocks of code to search and replace in order to make changes to files.
- `udiff` is the most efficient editing format, where the model returns unified diffs to apply changes to the file.

Different models work best with different editing formats. Aider is configured to use the best edit format for the popular OpenAI and Anthropic models and the other models recommended on this page.

For lesser known models aider will default to using the "whole" editing format. If you would like to experiment with the more advanced formats, you can use these switches: `--edit-format diff` or `--edit-format udiff`.

## Using a .env file

Aider will read environment variables from a `.env` file in the root of your git repo or in the current directory. You can give it an explicit file to load with the `--env-file <filename>` parameter.

You can use a `.env` file to store various keys and other settings for the models you use with aider.

Here is an example `.env` file:

```
OPENAI_API_KEY=<key>
ANTHROPIC_API_KEY=<key>
GROQ_API_KEY=<key>
OPENROUTER_API_KEY=<key>

AZURE_API_KEY=<key>
AZURE_API_VERSION=2023-05-15
AZURE_API_BASE=https://example-endpoint.openai.azure.com

OLLAMA_API_BASE=http://127.0.0.1:11434
```
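If the file lives somewhere other than the repo root or the current directory, point aider at it explicitly; a minimal sketch with a hypothetical path:

```
# Load keys and settings from a .env file at a non-default location
aider --env-file /path/to/my/.env
```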