LocalAI/backend
Last commit: 0f365ac204 "fix: typos (#5376)" by omahs, 2025-05-16 12:45:48 +02:00
Signed-off-by: omahs <73983677+omahs@users.noreply.github.com>
Name           Last commit                                                                      Date
cpp            chore(deps): bump llama.cpp to b34c859146630dff136943abc9852ca173a7c9d6 (#5323)  2025-05-06 11:21:25 +02:00
go             fix: typos (#5376)                                                               2025-05-16 12:45:48 +02:00
python         fix: vllm missing logprobs (#5279)                                               2025-04-30 12:55:07 +00:00
backend.proto  feat(video-gen): add endpoint for video generation (#5247)                       2025-04-26 18:05:01 +02:00