From f8b51ea2df6f8b8908dec7f4eeffffd6294e0738 Mon Sep 17 00:00:00 2001
From: Paul Gauthier
Date: Sun, 28 Apr 2024 17:12:23 -0700
Subject: [PATCH] copy

---
 docs/llms.md | 76 +++++++++++++++++++++++++++++-----------------------
 1 file changed, 42 insertions(+), 34 deletions(-)

diff --git a/docs/llms.md b/docs/llms.md
index 861e39bf9..e531c7a1d 100644
--- a/docs/llms.md
+++ b/docs/llms.md
@@ -42,8 +42,8 @@ So you should expect that models which are less capable than GPT-3.5 may struggl
 - [Cohere](#cohere)
 - [Azure](#azure)
 - [OpenRouter](#openrouter)
-- [OpenAI compatible APIs](#openai-compatible-apis)
 - [Ollama](#ollama)
+- [OpenAI compatible APIs](#openai-compatible-apis)
 - [Other LLMs](#other-llms)
 - [Model warnings](#model-warnings)
 - [Editing format](#editing-format)
@@ -190,9 +190,6 @@ You'll need an [OpenRouter API key](https://openrouter.ai/keys).
 pip install aider-chat
 export OPENROUTER_API_KEY=<key>
 
-# Llama3 70B instruct
-aider --model openrouter/meta-llama/llama-3-70b-instruct
-
 # Or any other open router model
 aider --model openrouter/<provider>/<model>
 
@@ -200,6 +197,47 @@ aider --models openrouter/
 ```
 
+In particular, Llama3 70B works well with aider:
+
+```
+# Llama3 70B instruct
+aider --model openrouter/meta-llama/llama-3-70b-instruct
+```
+
+
+## Ollama
+
+Aider can connect to local Ollama models.
+
+```
+# Pull the model
+ollama pull <model>
+
+# Start your ollama server
+ollama serve
+
+# In another terminal window:
+export OLLAMA_API_BASE=http://127.0.0.1:11434
+aider --model ollama/<model>
+```
+
+In particular, `llama3:70b` works very well with aider.
+
+
+```
+ollama pull llama3:70b
+ollama serve
+
+# ...in another terminal window...
+export OLLAMA_API_BASE=http://127.0.0.1:11434
+aider --model ollama/llama3:70b
+```
+
+Also see the [model warnings](#model-warnings)
+section for information on warnings which will occur
+when working with models that aider is not familiar with.
+
+
 ## OpenAI compatible APIs
 
 Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
 
@@ -219,36 +257,6 @@ See the [model warnings](#model-warnings)
 section for information on warnings which will occur
 when working with models that aider is not familiar with.
 
-## Ollama
-
-Aider can connect to local Ollama models.
-
-```
-# Pull the model
-ollama pull <model>
-
-# Start your ollama server
-ollama serve
-
-# In another terminal window:
-export OLLAMA_API_BASE=http://127.0.0.1:11434
-aider --model ollama/<model>
-
-###
-#
-# llama3:70b works very well with aider
-#
-ollama pull llama3:70b
-ollama serve
-# ...in another terminal window:
-export OLLAMA_API_BASE=http://127.0.0.1:11434
-aider --model ollama/llama3:70b
-```
-
-Also see the [model warnings](#model-warnings)
-section for information on warnings which will occur
-when working with models that aider is not familiar with.
-
 ## Other LLMs
 
 Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package