LocalAI/backend

Latest commit 11d960b2a6 by Ettore Di Giacinto, 2024-08-30 00:10:17 +02:00:

chore(cli): be consistent between workers and expose ExtraLLamaCPPArgs to both (#3428)

* chore(cli): be consistent between workers and expose ExtraLLamaCPPArgs to both

  Fixes: https://github.com/mudler/LocalAI/issues/3427

* bump grpcio

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Path           Last commit                                                                              Date
cpp            chore(deps): bump llama.cpp, rename llama_add_bos_token (#3253)                          2024-08-16 01:20:21 +02:00
go             chore: drop gpt4all.cpp (#3106)                                                          2024-08-07 23:35:55 +02:00
python         chore(cli): be consistent between workers and expose ExtraLLamaCPPArgs to both (#3428)   2024-08-30 00:10:17 +02:00
backend.proto  feat: elevenlabs sound-generation api (#3355)                                            2024-08-24 00:20:28 +00:00