LocalAI/backend
cpp            chore(deps): bump llama.cpp to b34c859146630dff136943abc9852ca173a7c9d6 (#5323)   2025-05-06 11:21:25 +02:00
go             fix: typos                                                                        2025-05-16 08:19:13 +02:00
python         fix: vllm missing logprobs (#5279)                                                2025-04-30 12:55:07 +00:00
backend.proto  feat(video-gen): add endpoint for video generation (#5247)                        2025-04-26 18:05:01 +02:00