docs: Initial import from localai-website (#1312)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
parent 763f94ca80
commit c5c77d2b0d

66 changed files with 6111 additions and 0 deletions

docs/content/features/constrained_grammars.md (30 lines, new file)

+++
disableToc = false
title = "✍️ Constrained grammars"
weight = 6
+++

The chat endpoint accepts an additional `grammar` parameter which takes a [BNF-defined grammar](https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_form).

This constrains the model's output to a user-defined schema, making it possible to generate `JSON`, `YAML`, or anything else that can be described with a BNF grammar.

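As an illustration (this grammar is not from the LocalAI docs), here is a minimal sketch in the GBNF format that llama.cpp accepts, restricting the reply to a flat JSON object with a single hypothetical `answer` string field:

```
# Hypothetical example: output must be {"answer": "<letters, digits, spaces>"}
root   ::= "{" ws "\"answer\"" ws ":" ws string ws "}"
string ::= "\"" [a-zA-Z0-9 ]* "\""
ws     ::= [ \t\n]*
```

Anything the grammar cannot derive is never sampled, so the model is structurally prevented from emitting malformed output.
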
{{% notice note %}}
This feature works only with models compatible with the [llama.cpp](https://github.com/ggerganov/llama.cpp) backend (see also [Model compatibility]({{%relref "model-compatibility" %}})). For details on how it works, see the upstream PRs: https://github.com/ggerganov/llama.cpp/pull/1773 and https://github.com/ggerganov/llama.cpp/pull/1887.
{{% /notice %}}

## Setup

Follow the setup instructions from the [LocalAI functions]({{%relref "features/openai-functions" %}}) page.

## 💡 Usage example

For example, to constrain the output to either `yes` or `no`:

```bash
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "gpt-4",
  "messages": [{"role": "user", "content": "Do you like apples?"}],
  "grammar": "root ::= (\"yes\" | \"no\")"
}'
```
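
Multi-line grammars are awkward to escape inline in a shell string. A minimal sketch of sending one from a file, assuming the grammar above is saved as `grammar.gbnf` (a hypothetical file name) and `jq` 1.6+ is installed:

```bash
# Build the request body with jq, embedding the grammar file as a JSON string
# via --rawfile, then pipe it to curl (which reads the body from stdin with -d @-).
jq -n --rawfile g grammar.gbnf '{
  "model": "gpt-4",
  "messages": [{"role": "user", "content": "Answer with a JSON object."}],
  "grammar": $g
}' | curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" -d @-
```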