Mirror of https://github.com/Aider-AI/aider.git

commit a6a53d8861
parent 0b29ffaa65

    copy

3 changed files with 20 additions and 12 deletions

HISTORY.md (10 changed lines)

@@ -1,8 +1,16 @@
 # Release history
 
-### main
+### v0.29.0
 
+- Added support for [directly connecting to Anthropic, Cohere, Gemini and many other LLM providers](https://aider.chat/docs/llms.html).
+- Added `--weak-model <model-name>` which allows you to specify which model to use for commit messages and chat history summarization.
+- New command line switches for working with popular models:
+  - `--4-turbo-vision`
+  - `--opus`
+  - `--sonnet`
+  - `--anthropic-api-key`
+- Improved "whole" and "diff" backends to better support [Cohere's free to use Command-R+ model](https://aider.chat/docs/llms.html#cohere).
 - Allow `/add` of images from anywhere in the filesystem.
 - Fixed crash when operating in a repo in a detached HEAD state.
 - Fix: Use the same default model in CLI and python scripting.
 

Third changed file (name not shown):

@@ -121,14 +121,13 @@ import os
-import openai
 from aider.coders import Coder
 
-# Make an openai client
-client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])
 
 # This is a list of files to add to the chat
 fnames = ["foo.py"]
 
+model = models.Model("gpt-4-turbo", weak_model="gpt-3.5-turbo")
+
 # Create a coder object
-coder = Coder.create(client=client, fnames=fnames)
+coder = Coder.create(main_model=model, fnames=fnames)
 
 # This will execute one instruction on those files and then return
 coder.run("make a script that prints hello world")
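
For reference, here is the scripting snippet as it reads after this change, assembled into one runnable piece. The hunk does not show the import that goes with `models.Model(...)`, so the `from aider import models` line below is an assumption; the rest is taken directly from the new lines of the diff.

```python
import os

from aider import models  # assumed import; the hunk only shows models.Model(...) in use
from aider.coders import Coder

# The GPT models below still expect an OpenAI key in the environment,
# even though the explicit openai client object is gone.
assert "OPENAI_API_KEY" in os.environ

# This is a list of files to add to the chat
fnames = ["foo.py"]

# Main model for edits; weak model for commit messages and chat history summarization
model = models.Model("gpt-4-turbo", weak_model="gpt-3.5-turbo")

# Create a coder object and run one instruction on those files
coder = Coder.create(main_model=model, fnames=fnames)
coder.run("make a script that prints hello world")
```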

docs/llms.md (15 changed lines)

@@ -3,11 +3,10 @@
 
 [](https://aider.chat/assets/llms.jpg)
 
-Aider works well with OpenAI's GPT 3.5, GPT-4, GPT-4 Turbo with Vision and
-Anthropic's Claude 3 Opus and Sonnet.
-
-GPT-4 Turbo and Claude 3 Opus are recommended, as they are the very best coding assistants.
-Cohere offers *free* API access to their Command-R+ model, which works well with aider
+Aider works best with GPT-4 Turbo and Claude 3 Opus,
+as they are the very best models for editing code.
+Aider also works quite well with GPT-3.5.
+Cohere offers *free* API access to their Command-R+ model, which works with aider
 as a *very basic* coding assistant.
 
 Aider supports connecting to almost any LLM,

@@ -61,7 +60,8 @@ you could do `aider --model claude-3-opus-20240229`.
 
 ## Cohere
 
-Cohere offers *free* API access to their Command-R+ model, which works well with aider
+Cohere offers *free* API access to their Command-R+ model with reasonably
+low rate limits. Command-R+ works well with aider
 as a *very basic* coding assistant.
 
 To work with Cohere's models, you need to provide your
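
The hunk above is cut off mid-sentence, but the point is that Cohere access is configured with an API key. As a rough sketch only (not taken from the diff), and assuming litellm's usual Cohere conventions of a `COHERE_API_KEY` environment variable and the `command-r-plus` model id, a minimal direct check of that connection through the litellm package (which this same file names as aider's backend) could look like this:

```python
import os

import litellm  # the provider-abstraction package this file says aider uses

# Assumptions: litellm's standard Cohere setup, i.e. COHERE_API_KEY in the
# environment and "command-r-plus" as the model id.
assert "COHERE_API_KEY" in os.environ

response = litellm.completion(
    model="command-r-plus",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)
```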

@@ -112,7 +112,8 @@ Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
 to connect to hundreds of other models.
 You can use `aider --model <model-name>` to use any supported model.
 
-To explore the list of supported models you can run `aider --model <name>`.
+To explore the list of supported models you can run `aider --model <model-name>`
+with a partial model name.
 If the supplied name is not an exact match for a known model, aider will
 return a list of possible matching models.
 For example:
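
To illustrate the matching behavior described above, here is an illustrative sketch (not aider's actual code, and the model names are example data): a partial name is compared against the known model names and every candidate that contains it is returned.

```python
# Illustrative sketch of partial model-name matching -- not aider's real implementation.
# The model names below are example data.
KNOWN_MODELS = [
    "gpt-4-turbo",
    "gpt-3.5-turbo",
    "claude-3-opus-20240229",
    "command-r-plus",
]

def matching_models(partial_name: str) -> list[str]:
    """Return every known model whose name contains the given partial name."""
    partial = partial_name.lower()
    return [name for name in KNOWN_MODELS if partial in name.lower()]

print(matching_models("turbo"))   # ['gpt-4-turbo', 'gpt-3.5-turbo']
print(matching_models("claude"))  # ['claude-3-opus-20240229']
```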