LocalAI/backend
Ettore Di Giacinto adb24214c6
chore(deps): bump llama.cpp to b34c859146630dff136943abc9852ca173a7c9d6 (#5323)

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-05-06 11:21:25 +02:00
cpp            chore(deps): bump llama.cpp to b34c859146630dff136943abc9852ca173a7c9d6 (#5323)    2025-05-06 11:21:25 +02:00
go             fix(stablediffusion-ggml): Build with DSD CUDA, HIP and Metal flags (#5236)           2025-04-24 10:27:17 +02:00
python         fix: vllm missing logprobs (#5279)                                                     2025-04-30 12:55:07 +00:00
backend.proto  feat(video-gen): add endpoint for video generation (#5247)                            2025-04-26 18:05:01 +02:00