LocalAI/backend
Ettore Di Giacinto 1fc6d469ac
chore(deps): bump llama.cpp to '1d36b3670b285e69e58b9d687c770a2a0a192194' (#5307)
chore(deps): bump llama.cpp to '1d36b3670b285e69e58b9d687c770a2a0a192194'

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-05-03 18:44:40 +02:00
cpp chore(deps): bump llama.cpp to '1d36b3670b285e69e58b9d687c770a2a0a192194' (#5307) 2025-05-03 18:44:40 +02:00
go fix(stablediffusion-ggml): Build with DSD CUDA, HIP and Metal flags (#5236) 2025-04-24 10:27:17 +02:00
python fix: vllm missing logprobs (#5279) 2025-04-30 12:55:07 +00:00
backend.proto feat(video-gen): add endpoint for video generation (#5247) 2025-04-26 18:05:01 +02:00