mirror of https://github.com/Aider-AI/aider.git
synced 2025-05-29 08:44:59 +00:00

commit 9afa6e8435 (parent 4461c7c4b2)
Added gemini 1.5 pro

4 changed files with 40 additions and 2 deletions
The MODEL_SETTINGS list gains two Gemini entries, both using the "udiff" edit format with the repo map and undo replies enabled:

@@ -132,6 +132,19 @@ MODEL_SETTINGS = [
         use_repo_map=True,
         send_undo_reply=True,
     ),
+    # Gemini
+    ModelSettings(
+        "gemini/gemini-1.5-pro",
+        "udiff",
+        use_repo_map=True,
+        send_undo_reply=True,
+    ),
+    ModelSettings(
+        "gemini/gemini-1.5-pro-latest",
+        "udiff",
+        use_repo_map=True,
+        send_undo_reply=True,
+    ),
 ]
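For orientation, each entry follows the positional pattern of the surrounding list: a model name, an edit format ("udiff"), and keyword flags. Below is a minimal sketch of what such a settings record and lookup could look like; the dataclass fields and the settings_for helper are illustrative assumptions, not aider's actual definitions.

```python
from dataclasses import dataclass


@dataclass
class ModelSettings:
    name: str                      # model identifier passed to litellm, e.g. "gemini/gemini-1.5-pro"
    edit_format: str = "whole"     # how the model is asked to express edits ("udiff", "diff", ...)
    use_repo_map: bool = False     # include a condensed map of the repository in the chat context
    send_undo_reply: bool = False  # report the result of /undo back to the model


MODEL_SETTINGS = [
    ModelSettings(
        "gemini/gemini-1.5-pro",
        "udiff",
        use_repo_map=True,
        send_undo_reply=True,
    ),
]


def settings_for(model_name: str) -> ModelSettings:
    """Return the matching entry, or conservative defaults for unknown models."""
    for ms in MODEL_SETTINGS:
        if ms.name == model_name:
            return ms
    return ModelSettings(model_name)


print(settings_for("gemini/gemini-1.5-pro").edit_format)  # -> udiff
```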
In the retry logic, litellm.exceptions.BadRequestError joins the exceptions that trigger a retry rather than an immediate failure:

@@ -23,6 +23,7 @@ CACHE = None
         RateLimitError,
         APIConnectionError,
         httpx.ConnectError,
+        litellm.exceptions.BadRequestError,
     ),
     max_tries=10,
     on_backoff=lambda details: print(
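The surrounding code is a backoff-style retry decorator: each listed exception triggers a retry for up to ten attempts (max_tries=10), with a message printed before each retry (on_backoff). A hedged sketch of that pattern with the newly added exception included; the function name, imports, wait strategy, and message text are illustrative, not the file's actual code:

```python
import backoff
import httpx
import litellm


@backoff.on_exception(
    backoff.expo,
    (
        litellm.exceptions.RateLimitError,
        litellm.exceptions.APIConnectionError,
        httpx.ConnectError,
        litellm.exceptions.BadRequestError,  # the exception type added by this commit
    ),
    max_tries=10,
    on_backoff=lambda details: print(
        f"Retrying in {details['wait']:.1f} seconds (attempt {details['tries']})..."
    ),
)
def send_completion(model, messages):
    # Any exception listed above causes a retry instead of an immediate failure.
    return litellm.completion(model=model, messages=messages)
```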
docs/llms.md (25 lines changed): the free-provider intro now leads with Gemini, the provider list gains a Gemini entry, and a new Gemini section documents setup:

@@ -7,7 +7,11 @@
 as they are the very best models for editing code.
 Aider also works quite well with [GPT-3.5](#openai).
 
-To use aider with a *free* API provider, you can use [Llama 3 70B on Groq](#llama3)
+**Aider works with a number of free API providers.**
+Google's [Gemini 1.5 Pro](#gemini) is
+the most capable free model to use with aider, with
+code editing capabilities between GPT-3.5 and GPT-4.
+You can use [Llama 3 70B on Groq](#llama3)
 which is comparable to GPT-3.5 in code editing performance.
 Cohere also offers free API access to their [Command-R+ model](#cohere),
 which works with aider

@@ -27,6 +31,7 @@ contain more detail on the supported providers, their models and any required environment variables.
 
 - [OpenAI](#openai)
 - [Anthropic](#anthropic)
+- [Gemini](#gemini)
 - [Groq & Llama3](#groq)
 - [Cohere](#cohere)
 - [Azure](#azure)

@@ -95,6 +100,24 @@ You can use `aider --model <model-name>` to use any other Anthropic model.
 For example, if you want to use a specific version of Opus
 you could do `aider --model claude-3-opus-20240229`.
 
+## Gemini
+
+Google currently offers
+[*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing).
+This is the most capable free model to use with aider,
+with code editing capability that's better than GPT-3.5
+and competitive with GPT-4 Turbo and Opus.
+You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey).
+
+```
+pip install aider-chat
+export GEMINI_API_KEY=<your-key-goes-here>
+aider --model gemini/gemini-1.5-pro-latest
+
+# List models available from Gemini
+aider --models gemini/
+```
+
 ## GROQ
 
 Groq currently offers *free* API access to the models they host.
|
@ -24,4 +24,5 @@ Pillow
|
||||||
diff-match-patch
|
diff-match-patch
|
||||||
playwright
|
playwright
|
||||||
pypandoc
|
pypandoc
|
||||||
litellm
|
litellm
|
||||||
|
google-generativeai
|
||||||
|
|
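The new dependency lines up with the docs change: litellm handles "gemini/..." model names, presumably via Google's generative AI SDK (the google-generativeai package), authenticated through GEMINI_API_KEY. A hedged sketch of the call aider ultimately issues for these models; the prompt and response handling are illustrative only:

```python
import os

import litellm

# Assumes GEMINI_API_KEY is already exported, as shown in docs/llms.md above.
os.environ.setdefault("GEMINI_API_KEY", "<your-key-goes-here>")  # placeholder, not a real key

response = litellm.completion(
    model="gemini/gemini-1.5-pro-latest",
    messages=[{"role": "user", "content": "Reply with a one-line greeting."}],
)
print(response.choices[0].message.content)
```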