LocalAI/backend
Last updated: 2025-06-01 06:28:35 +00:00
Name           Last commit                                                                          Last commit date
cpp            chore(deps): bump llama.cpp to 'e562eece7cb476276bfc4cbb18deb7c0369b2233' (#5552)    2025-05-31 12:46:32 +02:00
go             feat: Realtime API support reboot (#5392)                                            2025-05-25 22:25:05 +02:00
python         chore(deps): bump torch in /backend/python/exllama2 in the pip group                 2025-05-30 20:15:58 +00:00
backend.proto  feat: Realtime API support reboot (#5392)                                            2025-05-25 22:25:05 +02:00