LocalAI/backend
Brandon Beiler 6a6e1a0ea9
feat(vllm): Additional vLLM config options (Disable logging, dtype, and Per-Prompt media limits) (#4855)
* Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt

Signed-off-by: TheDropZone <brandonbeiler@gmail.com>

* using " marks in the config.yaml file

Signed-off-by: TheDropZone <brandonbeiler@gmail.com>

* adding in missing colon

Signed-off-by: TheDropZone <brandonbeiler@gmail.com>

---------

Signed-off-by: TheDropZone <brandonbeiler@gmail.com>
2025-02-18 19:27:58 +01:00
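The three options added by this commit are set per-model in `config.yaml`. A minimal sketch of how they might look, assuming a typical LocalAI model config layout; only `disable_log_status`, `dtype`, and `limit_mm_per_prompt` come from this commit, and the surrounding field names and values are illustrative assumptions:

```yaml
# Hypothetical LocalAI model config using the new vLLM options.
name: "my-vllm-model"
backend: "vllm"
parameters:
  model: "some-org/some-multimodal-model"   # placeholder model id
dtype: "bfloat16"            # weight/activation precision handed to vLLM
disable_log_status: true     # suppress vLLM's periodic engine status logging
limit_mm_per_prompt:         # cap multimodal items accepted per prompt
  image: 2
  video: 0
```

The commit's second change note ("using \" marks in the config.yaml file") suggests string values are double-quoted, as shown above.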
cpp fix(llama.cpp): improve context shift handling (#4820) 2025-02-14 14:55:03 +01:00
go chore(llama-ggml): drop deprecated backend (#4775) 2025-02-06 18:36:23 +01:00
python feat(vllm): Additional vLLM config options (Disable logging, dtype, and Per-Prompt media limits) (#4855) 2025-02-18 19:27:58 +01:00
backend.proto feat(vllm): Additional vLLM config options (Disable logging, dtype, and Per-Prompt media limits) (#4855) 2025-02-18 19:27:58 +01:00