Mirror of https://github.com/Aider-AI/aider.git
Synced 2025-05-25 14:55:00 +00:00
Commit f8b51ea2df (parent 821dd1e18a): copy

1 changed file with 42 additions and 34 deletions
docs/llms.md | 76
@@ -42,8 +42,8 @@ So you should expect that models which are less capable than GPT-3.5 may struggl
 - [Cohere](#cohere)
 - [Azure](#azure)
 - [OpenRouter](#openrouter)
-- [OpenAI compatible APIs](#openai-compatible-apis)
 - [Ollama](#ollama)
+- [OpenAI compatible APIs](#openai-compatible-apis)
 - [Other LLMs](#other-llms)
 - [Model warnings](#model-warnings)
 - [Editing format](#editing-format)
@@ -190,9 +190,6 @@ You'll need an [OpenRouter API key](https://openrouter.ai/keys).
 pip install aider-chat
 export OPENROUTER_API_KEY=<your-key-goes-here>
 
-# Llama3 70B instruct
-aider --model openrouter/meta-llama/llama-3-70b-instruct
-
 # Or any other open router model
 aider --model openrouter/<provider>/<model>
 
@@ -200,6 +197,47 @@ aider --model openrouter/<provider>/<model>
 aider --models openrouter/
 ```
 
+In particular, Llama3 70B works well with aider:
+
+```
+# Llama3 70B instruct
+aider --model openrouter/meta-llama/llama-3-70b-instruct
+```
+
+
+## Ollama
+
+Aider can connect to local Ollama models.
+
+```
+# Pull the model
+ollama pull <MODEL>
+
+# Start your ollama server
+ollama serve
+
+# In another terminal window
+export OLLAMA_API_BASE=http://127.0.0.1:11434
+aider --model ollama/<MODEL>
+```
+
+In particular, `llama3:70b` works very well with aider:
+
+
+```
+ollama pull llama3:70b
+ollama serve
+
+# ...in another terminal window...
+export OLLAMA_API_BASE=http://127.0.0.1:11434
+aider --model ollama/llama3:70b
+```
+
+Also see the [model warnings](#model-warnings)
+section for information on warnings which will occur
+when working with models that aider is not familiar with.
+
+
 ## OpenAI compatible APIs
 
 Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
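The Ollama instructions above export `OLLAMA_API_BASE` and then start aider; a small hedged sketch can confirm the server is actually answering first. `ollama_tags_url` and `ollama_ready` are hypothetical helper names, not aider or Ollama commands; `/api/tags` is Ollama's model-listing endpoint.

```shell
# Sketch: build the Ollama model-list URL from OLLAMA_API_BASE (falling back
# to the same default address the docs above export) and probe it before use.
# ollama_tags_url and ollama_ready are hypothetical helpers, not aider features.
ollama_tags_url() {
  printf '%s/api/tags' "${OLLAMA_API_BASE:-http://127.0.0.1:11434}"
}

ollama_ready() {
  # /api/tags lists locally pulled models; any successful response means the
  # server started by `ollama serve` is reachable.
  curl -fsS "$(ollama_tags_url)" >/dev/null
}

# Usage:
# ollama_ready && aider --model ollama/llama3:70b
```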
@@ -219,36 +257,6 @@ See the [model warnings](#model-warnings)
 section for information on warnings which will occur
 when working with models that aider is not familiar with.
 
-## Ollama
-
-Aider can connect to local Ollama models.
-
-```
-# Pull the model
-ollama pull <MODEL>
-
-# Start your ollama server
-ollama serve
-
-# In another terminal window:
-export OLLAMA_API_BASE=http://127.0.0.1:11434
-aider --model ollama/<MODEL>
-
-###
-#
-# llama3:70b works very well with aider
-#
-ollama pull llama3:70b
-ollama serve
-# ...in another terminal window:
-export OLLAMA_API_BASE=http://127.0.0.1:11434
-aider --model ollama/llama3:70b
-```
-
-Also see the [model warnings](#model-warnings)
-section for information on warnings which will occur
-when working with models that aider is not familiar with.
-
 ## Other LLMs
 
 Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
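The surrounding context mentions OpenAI compatible API endpoints. A minimal sketch of that setup, assuming a local OpenAI compatible server (the address `http://localhost:8000/v1` and `<model-name>` below are illustrative placeholders, not aider defaults):

```shell
# Sketch only: point aider at a generic OpenAI compatible endpoint.
# http://localhost:8000/v1 and <model-name> are placeholders; check your
# server's docs for the real address and model identifier.
export OPENAI_API_BASE=http://localhost:8000/v1
export OPENAI_API_KEY=dummy-key   # many local servers accept any non-empty key
aider --model openai/<model-name>
```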
|
Loading…
Add table
Add a link
Reference in a new issue