From 1af38e943d3052088ce7735423eb9128ed41770e Mon Sep 17 00:00:00 2001
From: Paul Gauthier
Date: Thu, 3 Aug 2023 11:45:51 -0300
Subject: [PATCH] copy

---
 docs/faq.md | 50 +++++++++++++++++++++++++++++++-------------------
 1 file changed, 31 insertions(+), 19 deletions(-)

diff --git a/docs/faq.md b/docs/faq.md
index 0bacf5dd3..67c52b9ae 100644
--- a/docs/faq.md
+++ b/docs/faq.md
@@ -109,14 +109,41 @@ In these cases, here are some things you might try:
 
 Aider does not officially support use with LLMs other than OpenAI's
 gpt-3.5-turbo and gpt-4 and their variants.
+This is because while it's "easy" to connect aider to a new LLM, it's "hard"
+to actually teach new LLMs to *edit* code.
 
-It seems to require model-specific tuning to get prompts and
-editing formats working well with a new model. For example, GPT-3.5 and GPT-4 use very
-different prompts and editing formats in aider right now.
+GPT-3.5 is just barely able to understand how to modify existing source code files,
+and GPT-4 is quite good at it.
+Getting them working that well was a significant undertaking, involving
+[specific code editing prompts and backends for each model and extensive benchmarking](https://aider.chat/docs/benchmarks.html).
 Adopting new LLMs will probably require a similar effort to tailor the
-prompting and edit formats.
+prompts and editing backends.
 
 That said, aider does provide some features to experiment with other models.
+Numerous users have already experimented with a variety of other models.
+So far, no one has reported much success in working with them the way aider
+can work with GPT-3.5 and GPT-4.
+
+Once we see signs that a *particular* model is capable of code editing,
+it would be reasonable for aider to attempt to officially support such a model.
+Until then, aider will simply maintain experimental support for using alternative models
+as described below.
+
+### OpenAI API compatible LLMs
+
+If you can make the model accessible via an OpenAI compatible API,
+you can use `--openai-api-base` to connect to a different API endpoint.
+
+Here are some
+[GitHub issues which may contain relevant information](https://github.com/paul-gauthier/aider/issues?q=is%3Aissue+%22openai-api-base%22+).
+
+### Local LLMs
+
+[LocalAI](https://github.com/go-skynet/LocalAI)
+and
+[SimpleAI](https://github.com/lhenault/simpleAI)
+look like relevant tools to serve local models via a compatible API.
+
 
 ### Azure
 
@@ -149,21 +176,6 @@ See the
 [official Azure documentation on using OpenAI models](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-python)
 for more information on how to populate the above configuration values.
 
-### Other LLMs
-
-If you can make the model accessible via an OpenAI compatible API,
-you can use `--openai-api-base` to connect to a different API endpoint.
-
-Here are some
-[GitHub issues which may contain relevant information](https://github.com/paul-gauthier/aider/issues?q=is%3Aissue+%22openai-api-base%22+).
-
-### Local LLMs
-
-[LocalAI](https://github.com/go-skynet/LocalAI)
-and
-[SimpleAI](https://github.com/lhenault/simpleAI)
-look like relevant tools to serve local models via a compatible API:
-
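
To illustrate the workflow the new "OpenAI API compatible LLMs" section points at, here is a minimal sketch of pointing aider at a non-OpenAI endpoint with the `--openai-api-base` switch the patch documents. The URL, port, `/v1` path, file name, and API key value are placeholders, not defaults from the docs (many self-hosted servers ignore the key entirely), and whether any given backend edits code well is exactly the open question the FAQ text describes.

```bash
# Hypothetical sketch: assumes an OpenAI compatible server is already
# running at http://localhost:8080 and exposing the usual /v1 routes.
export OPENAI_API_KEY=dummy-key   # placeholder; many local servers ignore it

# Point aider's OpenAI client at the alternative endpoint.
aider --openai-api-base http://localhost:8080/v1 myfile.py
```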
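Before wiring aider up to a tool like LocalAI or SimpleAI, it can help to confirm the server really speaks the OpenAI wire format. This sketch reuses the same placeholder endpoint as above and exercises the standard `/v1/models` and `/v1/chat/completions` routes defined by the OpenAI API; the model name is whatever your local server reports, not something aider or this FAQ defines.

```bash
# List the models the local server advertises.
curl http://localhost:8080/v1/models

# Send a one-off chat completion to verify the endpoint responds.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "your-local-model",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```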