Merge branch 'main' into issue-73
commit 1f83a89192

5 changed files with 44 additions and 13 deletions
@@ -3,6 +3,10 @@
 ### Next release

 - Added `--dark-mode` to select colors suitable for a dark terminal background
+- Reorganized the `--help` output
+- Bugfix so that aider throws an exception when OpenAI returns InvalidRequest
+- Bugfix/improvement to /add and /drop to recurse selected directories
+- Bugfix for live diff output when using "whole" edit format

 ### v0.8.2
14 README.md

@@ -3,7 +3,7 @@
 `aider` is a command-line chat tool that allows you to write and edit
 code with OpenAI's GPT models. You can ask GPT to help you start
 a new project, or modify code in your existing git repo.
 Aider makes it easy to git commit, diff & undo changes proposed by GPT without copy/pasting.
 It also has features that [help GPT-4 understand and modify larger codebases](https://aider.chat/docs/ctags.html).

@@ -33,7 +33,7 @@ myapp.py> change the fibonacci function from recursion to iteration

 ## Example chat transcripts

 Here are some example transcripts that show how you can chat with `aider` to write and edit code with GPT-4.

 * [**Hello World Flask App**](https://aider.chat/examples/hello-world-flask.html): Start from scratch and have GPT create a simple Flask app with various endpoints, such as adding two numbers and calculating the Fibonacci sequence.

@@ -62,7 +62,7 @@ You can find more chat transcripts on the [examples page](https://aider.chat/exa
 1. Install the package with pip:
     * PyPI: `python -m pip install aider-chat`
     * GitHub: `python -m pip install git+https://github.com/paul-gauthier/aider.git`
     * Local clone: `python -m pip install -e .`

 2. Set up your OpenAI API key:
     * As an environment variable:

@@ -120,7 +120,7 @@ Aider supports commands from within the chat, which all start with `/`. Here are
 ## Tips

 * Think about which files need to be edited to make your change and add them to the chat.
   Aider has some ability to help GPT figure out which files to edit all by itself, but the most effective approach is to explicitly add the needed files to the chat yourself.
 * Large changes are best performed as a sequence of thoughtful bite sized steps, where you plan out the approach and overall design. Walk GPT through changes like you might with a junior dev. Ask for a refactor to prepare, then ask for the actual change. Spend the time to ask for code quality/structure improvements.
 * Use Control-C to safely interrupt GPT if it isn't providing a useful response. The partial response remains in the conversation, so you can refer to it when you reply to GPT with more information or direction.
 * Use the `/run` command to run tests, linters, etc and show the output to GPT so it can fix any issues.

@@ -167,7 +167,7 @@ This minimizes your use of the context window, as well as costs.
 | ----------------- | -- | -- | ----- | -- | -- |
 | gpt-3.5-turbo | 4k tokens | whole file | 2k tokens | ~8k bytes | no |
 | gpt-3.5-turbo-16k | 16k tokens | whole file | 8k tokens | ~32k bytes | no |
 | gpt-4 | 8k tokens | diffs | 8k tokens | ~32k bytes | yes |
 | gpt-4-32k | 32k tokens | diffs | 32k tokens | ~128k bytes | yes |

 ## Kind words from users

@@ -176,4 +176,8 @@ This minimizes your use of the context window, as well as costs.
 * "Aider ... has easily quadrupled my coding productivity." -- [SOLAR_FIELDS](https://news.ycombinator.com/item?id=36212100)
 * "What an amazing tool. It's incredible." -- [valyagolev](https://github.com/paul-gauthier/aider/issues/6#issue-1722897858)
 * "It was WAY faster than I would be getting off the ground and making the first few working versions." -- [Daniel Feldman](https://twitter.com/d_feldman/status/1662295077387923456)
+* "Amazing project, definitely the best AI coding assistant I've used." -- [joshuavial](https://github.com/paul-gauthier/aider/issues/84)
+
+## FAQ
+
+For more information, see the [FAQ](https://aider.chat/docs/faq.html).
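The install hunk above stops right at the "As an environment variable" step, so here is a minimal, hedged sketch of that step. `OPENAI_API_KEY` is the variable aider reads; the key value is a placeholder; the shell equivalent is `export OPENAI_API_KEY=...` before running `aider`.

```python
# Minimal sketch: make the API key available in the environment. Setting it here
# only affects this Python process and any child process launched from it; in a
# shell you would instead export OPENAI_API_KEY before starting aider.
import os

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key
```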
@@ -58,7 +58,7 @@ class WholeFileCoder(Coder):
             full_path = (Path(self.root) / fname).absolute()

             if mode == "diff":
-                output += self.do_live_diff(full_path, new_lines)
+                output += self.do_live_diff(full_path, new_lines, True)
             else:
                 edits.append((fname, fname_source, new_lines))

@@ -105,7 +105,7 @@ class WholeFileCoder(Coder):
             if fname is not None:
                 # ending an existing block
                 full_path = (Path(self.root) / fname).absolute()
-                output += self.do_live_diff(full_path, new_lines)
+                output += self.do_live_diff(full_path, new_lines, False)
                 return "\n".join(output)

             if fname:

@@ -128,14 +128,14 @@ class WholeFileCoder(Coder):

         return edited

-    def do_live_diff(self, full_path, new_lines):
+    def do_live_diff(self, full_path, new_lines, final):
         if full_path.exists():
             orig_lines = self.io.read_text(full_path).splitlines(keepends=True)

             show_diff = diffs.diff_partial_update(
                 orig_lines,
                 new_lines,
-                final=True,
+                final=final,
             ).splitlines()
             output = show_diff
         else:
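The change above threads a new `final` flag from the two call sites into `diffs.diff_partial_update`, so a diff rendered mid-stream is no longer treated as the final one. As a rough, self-contained illustration of why that matters (this is not aider's actual `diffs` module, just a stand-in with placeholder logic):

```python
# Illustration only: a stand-in for the idea behind diffs.diff_partial_update.
# aider's real implementation differs; this just shows how a final=False call
# can keep the live diff concise while the model is still streaming output.
import difflib


def fake_partial_update(orig_lines, new_lines, final=False):
    if not final:
        # Mid-stream: only compare against the part of the original file that
        # the streamed content has reached so far.
        orig_lines = orig_lines[: len(new_lines)]
    return list(difflib.unified_diff(orig_lines, new_lines, lineterm=""))


orig = [f"{i}\n" for i in range(100)]
streamed = ["0\n", "1\n", "2\n"]  # first few lines, unchanged so far

print(len(fake_partial_update(orig, streamed, final=False)))  # 0: nothing changed yet
print(len(fake_partial_update(orig, streamed, final=True)))   # many lines: the unsent tail looks deleted
```

Read this way, the call sites above plausibly split as: the `mode == "diff"` path already has the whole response in hand and passes `True`, while the block that is still being streamed passes `False`. The new test added below exercises exactly the concise, mid-stream case.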
12 docs/faq.md

@@ -6,8 +6,8 @@
 Aider does not officially support use with LLMs other than OpenAI's gpt-3.5-turbo and gpt-4
 and their variants.

-It generally requires some model-specific tuning to get prompts and
-editing formats working well. For example, GPT-3.5 and GPT-4 use very
+It seems to require model-specific tuning to get prompts and
+editing formats working well with a new model. For example, GPT-3.5 and GPT-4 use very
 different prompts and editing formats in aider right now.
 Adopting new LLMs will probably require a similar effort to tailor the
 prompting and edit formats.

@@ -16,11 +16,13 @@ That said, aider does provide some features to experiment with other models.
 If you can make the model accessible via an OpenAI compatible API,
 you can use `--openai-api-base` to connect to a different API endpoint.

-Here is are some
+Here are some
 [GitHub issues which may contain relevant information](https://github.com/paul-gauthier/aider/issues?q=is%3Aissue+%22openai-api-base%22+).

 [LocalAI](https://github.com/go-skynet/LocalAI)
-looks like a relevant tool to serve many local models via a compatible API:
+and
+[SimpleAI](https://github.com/lhenault/simpleAI)
+look like relevant tools to serve local models via a compatible API:


 ## Can I change the system prompts that aider uses?

@@ -49,6 +51,8 @@ has provided this
 ## How do I get ctags working?

 First, be aware that ctags is completely optional and not required to use aider.
+Aider only attempts to use ctags with GPT-4,
+and currently doesn't use ctags with GPT-3.5.

 If you wish to use ctags, you should consult the
 [universal ctags repo](https://github.com/universal-ctags/ctags)
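For readers of the FAQ hunk above who want to see what `--openai-api-base` looks like in practice, here is a hedged sketch: the endpoint URL is a placeholder for wherever a LocalAI or SimpleAI style server happens to be listening, and `myapp.py` is just an example file name.

```python
# Hypothetical sketch: launching aider against an OpenAI-compatible endpoint.
# The base URL is a placeholder; whether a given local model works well with
# aider's prompts and edit formats is exactly the open question the FAQ describes.
import subprocess

subprocess.run([
    "aider",
    "--openai-api-base", "http://localhost:8080/v1",  # placeholder local endpoint
    "myapp.py",
])
```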
@@ -8,6 +8,7 @@ from unittest.mock import MagicMock, patch
 from aider import models
 from aider.coders import Coder
 from aider.coders.wholefile_coder import WholeFileCoder
+from aider.dump import dump  # noqa: F401
 from aider.io import InputOutput


@@ -76,6 +77,24 @@ class TestWholeFileCoder(unittest.TestCase):
             updated_content = f.read()
         self.assertEqual(updated_content, "Updated content\n")

+    def test_update_files_live_diff(self):
+        # Create a sample file in the temporary directory
+        sample_file = "sample.txt"
+        with open(sample_file, "w") as f:
+            f.write("\n".join(map(str, range(0, 100))))
+
+        # Initialize WholeFileCoder with the temporary directory
+        io = InputOutput(yes=True)
+        coder = WholeFileCoder(main_model=models.GPT35, io=io, fnames=[sample_file])
+
+        # Set the partial response content with the updated content
+        coder.partial_response_content = f"{sample_file}\n```\n0\n1\n2\n"
+
+        lines = coder.update_files(mode="diff").splitlines()
+
+        # the live diff should be concise, since we haven't changed anything yet
+        self.assertLess(len(lines), 20)
+
     def test_update_files_with_existing_fence(self):
         # Create a sample file in the temporary directory
         sample_file = "sample.txt"