LocalAI/backend
Latest commit f0f2c87553 by TheDropZone (2025-02-18 08:21:26 -05:00):
Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt
Signed-off-by: TheDropZone <brandonbeiler@gmail.com>
Name           Last commit                                                                          Date
cpp            fix(llama.cpp): improve context shift handling (#4820)                               2025-02-14 14:55:03 +01:00
go             chore(llama-ggml): drop deprecated backend (#4775)                                   2025-02-06 18:36:23 +01:00
python         Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt  2025-02-18 08:21:26 -05:00
backend.proto  Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt  2025-02-18 08:21:26 -05:00
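The latest commit adds three vLLM-related options to the backend configuration. A minimal sketch of how such options might appear in a LocalAI model config file is shown below; the exact field names come from the commit message, but their placement in the YAML, the model name, and the surrounding fields are assumptions for illustration only.

```yaml
# Hypothetical LocalAI model config exercising the new vLLM options.
# Field placement and surrounding values are illustrative assumptions.
name: my-vision-model
backend: vllm
parameters:
  model: some-org/some-multimodal-model   # placeholder model id
dtype: float16              # precision for vLLM weights/activations
disable_log_status: true    # suppress periodic vLLM status logging
limit_mm_per_prompt:        # cap multimodal items allowed per prompt
  image: 2
```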