docs: add distributed inferencing docs
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
parent 785c54e7b0
commit bb3ec56de3

5 changed files with 109 additions and 4 deletions
@@ -370,6 +370,8 @@ there are additional environment variables available that modify the behavior of

| Environment variable | Default | Description |
|----------------------|---------|-------------|
| `GO_TAGS` | | Go tags. Available: `stablediffusion` |
| `HUGGINGFACEHUB_API_TOKEN` | | Special token for interacting with the HuggingFace Inference API, required only when using the `langchain-huggingface` backend |
| `EXTRA_BACKENDS` | | A space-separated list of backends to prepare. For example, `EXTRA_BACKENDS="backend/python/diffusers backend/python/transformers"` prepares the conda environments on start |
| `DISABLE_AUTODETECT` | `false` | Disable autodetection of the CPU flagset on start |
| `LLAMACPP_GRPC_SERVERS` | | A list of llama.cpp workers to distribute the workload across. For example, `LLAMACPP_GRPC_SERVERS="address1:port,address2:port"` |
Here is how to configure these variables:
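A minimal sketch, assuming LocalAI is run from the container image: the variables can be passed with `-e` flags or collected in an env file. The `localai/localai` image tag, the port, and the values shown are illustrative placeholders, not part of this commit.

```bash
# Pass variables directly when starting the container
docker run -p 8080:8080 \
  -e DISABLE_AUTODETECT=true \
  -e EXTRA_BACKENDS="backend/python/diffusers backend/python/transformers" \
  localai/localai:latest

# Or collect them in an env file and pass it in one flag
docker run -p 8080:8080 --env-file .env localai/localai:latest
```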
|
Loading…
Add table
Add a link
Reference in a new issue
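Since this commit documents distributed inferencing, here is a hedged sketch of how `LLAMACPP_GRPC_SERVERS` fits together. It assumes each worker machine already runs llama.cpp's RPC server (the `rpc-server` binary from llama.cpp's RPC example, built with RPC support); the addresses, port, and image tag below are illustrative only.

```bash
# On each worker machine: start a llama.cpp RPC server listening on a port
# (binary and flags come from llama.cpp's RPC example; your build may differ)
./rpc-server -H 0.0.0.0 -p 50052

# On the machine serving the API: point LocalAI at the workers so the
# llama.cpp backend distributes the workload across them
docker run -p 8080:8080 \
  -e LLAMACPP_GRPC_SERVERS="192.168.1.10:50052,192.168.1.11:50052" \
  localai/localai:latest
```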