Mirror of https://github.com/mudler/LocalAI.git (synced 2025-06-24 19:54:59 +00:00)
Commit f995f23042: Add examples/config defaults
Parent: 8254df3f4c
4 changed files with 58 additions and 1 deletion
.env (5 changes)
@@ -66,4 +66,7 @@ MODELS_PATH=/models
 ### Python backends GRPC max workers
 ### Default number of workers for GRPC Python backends.
 ### This actually controls whether a backend can process multiple requests or not.
-# PYTHON_GRPC_MAX_WORKERS=1
+# PYTHON_GRPC_MAX_WORKERS=1
+
+### Define the number of parallel LLAMA.cpp workers (Defaults to 1)
+# LLAMACPP_PARALLEL=1
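As a usage note, the same defaults can be passed to the container directly instead of editing .env. A minimal sketch, assuming the quay.io/go-skynet/local-ai image and the default 8080 port (neither comes from this commit); only PYTHON_GRPC_MAX_WORKERS, LLAMACPP_PARALLEL, and the /models path appear in the diff above:

# assumed image tag and port mapping; the env vars mirror the new .env defaults
docker run -p 8080:8080 \
  -e PYTHON_GRPC_MAX_WORKERS=1 \
  -e LLAMACPP_PARALLEL=1 \
  -v $PWD/models:/models \
  quay.io/go-skynet/local-ai:latest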