Merge branch 'main' into gui

This commit is contained in:
Paul Gauthier 2024-04-29 05:54:03 -07:00
commit 28cd2c8580
4 changed files with 66 additions and 48 deletions

.github/ISSUE_TEMPLATE/issue.yml

@@ -0,0 +1,21 @@
name: Question or bug report
description: Submit a question or bug report to help us improve aider
labels: []
body:
  - type: textarea
    attributes:
      label: Issue
      description: Please describe your problem or question.
    validations:
      required: true
  - type: textarea
    attributes:
      label: Version and model info
      description: Please include aider version, model being used (`gpt-4-xxx`, etc) and any other switches or config settings that are active.
      placeholder: |
        Aider v0.21.2-dev
        Model: gpt-4-0613 using diff edit format
        Git repo: .git with 134 files
        Repo-map: using 1024 tokens
    validations:
      required: false


@@ -1,24 +0,0 @@
---
name: New issue
about: Ask a question or report a bug
title: ''
labels: ''
assignees: ''
---
When asking questions or reporting issues, it is very helpful if you can include:
- Aider version
- Model being used (`gpt-4-xxx`, etc)
- Other switches or config settings that are active
The easiest way to do this is to just copy & paste the announcement lines that aider prints when you launch it, like these:
```
Aider v0.21.2-dev
Model: gpt-4-0613 using diff edit format
Git repo: .git with 134 files
Repo-map: using 1024 tokens
Use /help to see in-chat commands, run with --help to see cmd line args
```


@@ -11,6 +11,7 @@
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="theme-color" content="#157878">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
<link rel="icon" type="image/png" sizes="32x32" href="{{ '/assets/favicon-32x32.png' | relative_url }}">
<link rel="stylesheet" href="{{ '/assets/css/style.css?v=' | append: site.github.build_revision | relative_url }}">
{% include head-custom.html %}
</head>


@@ -42,8 +42,8 @@ So you should expect that models which are less capable than GPT-3.5 may struggl
- [Cohere](#cohere)
- [Azure](#azure)
- [OpenRouter](#openrouter)
- [OpenAI compatible APIs](#openai-compatible-apis)
- [Ollama](#ollama)
- [OpenAI compatible APIs](#openai-compatible-apis)
- [Other LLMs](#other-llms)
- [Model warnings](#model-warnings)
- [Editing format](#editing-format)
@@ -190,9 +190,6 @@ You'll need an [OpenRouter API key](https://openrouter.ai/keys).
pip install aider-chat
export OPENROUTER_API_KEY=<your-key-goes-here>
# Llama3 70B instruct
aider --model openrouter/meta-llama/llama-3-70b-instruct
# Or any other open router model
aider --model openrouter/<provider>/<model>
@@ -200,6 +197,49 @@ aider --model openrouter/<provider>/<model>
aider --models openrouter/
```
In particular, Llama3 70B works well with aider, at low cost:
```
pip install aider-chat
export OPENROUTER_API_KEY=<your-key-goes-here>
aider --model openrouter/meta-llama/llama-3-70b-instruct
```
## Ollama
Aider can connect to local Ollama models.
```
# Pull the model
ollama pull <MODEL>
# Start your ollama server
ollama serve
# In another terminal window
pip install aider-chat
export OLLAMA_API_BASE=http://127.0.0.1:11434
aider --model ollama/<MODEL>
```
In particular, `llama3:70b` works very well with aider:
```
ollama pull llama3:70b
ollama serve
# ...in another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434
aider --model ollama/llama3:70b
```
Also see the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.
## OpenAI compatible APIs
Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
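For example, a minimal sketch of pointing aider at such an endpoint (assuming litellm's `openai/` model-name prefix and the `OPENAI_API_BASE` / `OPENAI_API_KEY` environment variables; check the docs for your endpoint's exact settings):
```
# Sketch: connect aider to an OpenAI compatible endpoint
pip install aider-chat

# Tell aider (via litellm) where the endpoint lives and how to authenticate
export OPENAI_API_BASE=<your-endpoint-goes-here>
export OPENAI_API_KEY=<your-key-goes-here-if-required>

# Prefix the model name with openai/ so requests are routed to the custom endpoint
aider --model openai/<your-model-name>
```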
@@ -219,26 +259,6 @@ See the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.
## Ollama
Aider can connect to local Ollama models.
```
# Start your ollama server
ollama serve
# In another terminal window:
export OLLAMA_API_BASE=http://127.0.0.1:11434
aider --model ollama/<MODEL>
```
The Llama3 70B model works well with aider.
Give aider the `--edit-format diff` switch if you're working with it.
Also see the [model warnings](#model-warnings)
section for information on warnings which will occur
when working with models that aider is not familiar with.
## Other LLMs
Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package