LocalAI/backend/python/diffusers
Latest commit: Ettore Di Giacinto d2934dd69f feat(elevenlabs): map elevenlabs API support to TTS (2024-03-14 18:12:47 +01:00)

This allows ElevenLabs clients to work with LocalAI automatically by
supporting the ElevenLabs API. The ElevenLabs server endpoint is implemented
so that it is wired to LocalAI's TTS endpoints.

Fixes: https://github.com/mudler/LocalAI/issues/1809
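As a rough illustration of what this mapping enables, the sketch below sends an ElevenLabs-style text-to-speech request to a locally running LocalAI instance. The base URL, port, route shape, voice/model names, and request fields are assumptions for the example, not values taken from this directory.

```python
# Hedged sketch: an ElevenLabs-shaped TTS request pointed at a local LocalAI
# instance instead of api.elevenlabs.io. Endpoint path, port, voice id and
# body fields are assumptions for illustration only.
import requests

LOCALAI = "http://localhost:8080"        # assumed LocalAI address
VOICE_ID = "voice-en-us-amy-low"         # assumed voice/model name

resp = requests.post(
    f"{LOCALAI}/v1/text-to-speech/{VOICE_ID}",   # ElevenLabs-style route
    json={"text": "Hello from LocalAI", "model_id": "tts-1"},
    timeout=60,
)
resp.raise_for_status()

# The response body is expected to be raw audio bytes, as with ElevenLabs.
with open("out.wav", "wb") as f:
    f.write(resp.content)
```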
| File | Last commit | Date |
|------|-------------|------|
| backend_diffusers.py | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 14:37:45 +01:00 |
| backend_pb2.py | feat(elevenlabs): map elevenlabs API support to TTS | 2024-03-14 18:12:47 +01:00 |
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| diffusers-rocm.yml | Build docker container for ROCm (#1595) | 2024-02-16 15:08:50 +01:00 |
| diffusers.yml | Build docker container for ROCm (#1595) | 2024-02-16 15:08:50 +01:00 |
| install.sh | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 14:37:45 +01:00 |
| Makefile | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 14:37:45 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 14:37:45 +01:00 |
| test.py | feat(diffusers): update, add autopipeline, controlnet (#1432) | 2023-12-13 19:20:22 +01:00 |
| test.sh | tests: add diffusers tests (#1419) | 2023-12-11 08:20:34 +01:00 |
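For context, test.py's latest commit references autopipeline and controlnet support; the snippet below is a minimal, self-contained sketch of the upstream diffusers AutoPipeline API that backend_diffusers.py builds on, not the backend's own code. The model id and prompt are placeholders.

```python
# Minimal sketch of the upstream diffusers AutoPipeline API this backend wraps;
# not the backend's own code. Model id and prompt are examples only.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example model id
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")                   # or "cpu" / "xpu" depending on hardware

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=25,
).images[0]
image.save("out.png")
```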

Creating a separate environment for the diffusers project:

make diffusers
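After building the environment and starting LocalAI with this backend, image generation can be exercised end to end through LocalAI's OpenAI-compatible images endpoint. This is a hedged usage sketch: the address, model name, size, and response handling are illustrative assumptions, not values from this directory.

```python
# Hedged sketch: exercising the diffusers backend through LocalAI's
# OpenAI-compatible image generation endpoint. Address, model name, size and
# response handling are assumptions for illustration only.
import requests

resp = requests.post(
    "http://localhost:8080/v1/images/generations",   # assumed LocalAI address
    json={
        "model": "stablediffusion",                  # example model name
        "prompt": "a cute baby sea otter",
        "size": "512x512",
    },
    timeout=300,
)
resp.raise_for_status()

# OpenAI-style response: a "data" list whose entries carry "url" or "b64_json".
print(resp.json())
```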