LocalAI/backend/python/vllm
Ettore Di Giacinto d2934dd69f feat(elevenlabs): map elevenlabs API support to TTS
This allows ElevenLabs clients to work with LocalAI automatically, by
supporting the ElevenLabs API.

The ElevenLabs server endpoint is implemented so that it is wired to the
existing TTS endpoints.

Fixes: https://github.com/mudler/LocalAI/issues/1809
2024-03-14 18:12:47 +01:00
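The mapping the commit describes can be sketched roughly as follows. This is a minimal, hypothetical illustration: the ElevenLabs API takes the voice as a path parameter (`POST /v1/text-to-speech/{voice_id}`) with the text in the JSON body, while the payload field names on the LocalAI side (`model`, `input`) are assumptions for illustration, not the actual implementation.

```python
# Hypothetical sketch: translate an ElevenLabs-style TTS request into the
# payload shape a LocalAI-style TTS endpoint might expect. Field names
# are illustrative assumptions, not LocalAI's real schema.

def elevenlabs_to_tts(voice_id: str, body: dict) -> dict:
    """Map an ElevenLabs-style request onto a generic TTS request.

    The voice_id path parameter selects the voice/model, and the
    "text" field of the request body becomes the TTS input.
    """
    return {
        "model": voice_id,      # voice selection -> model name (assumed field)
        "input": body["text"],  # ElevenLabs "text" -> TTS input (assumed field)
    }

# Example translation:
payload = elevenlabs_to_tts("my-voice", {"text": "Hello from LocalAI"})
# payload == {"model": "my-voice", "input": "Hello from LocalAI"}
```

With a mapping like this in front of the TTS handler, an unmodified ElevenLabs client can talk to the server without knowing about the native TTS API.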
File                  Last commit                                                           Date
backend_pb2.py        feat(elevenlabs): map elevenlabs API support to TTS                   2024-03-14 18:12:47 +01:00
backend_pb2_grpc.py   refactor: move backends into the backends directory (#1279)           2023-11-13 22:40:16 +01:00
backend_vllm.py       Bump vLLM version + more options when loading models in vLLM (#1782)  2024-03-01 22:48:53 +01:00
Makefile              deps(conda): use transformers-env with vllm,exllama(2) (#1554)        2024-01-06 13:32:28 +01:00
README.md             refactor: move backends into the backends directory (#1279)           2023-11-13 22:40:16 +01:00
run.sh                deps(conda): use transformers-env with vllm,exllama(2) (#1554)        2024-01-06 13:32:28 +01:00
test.sh               deps(conda): use transformers-env with vllm,exllama(2) (#1554)        2024-01-06 13:32:28 +01:00
test_backend_vllm.py  feat(conda): share envs with transformer-based backends (#1465)       2023-12-21 08:35:15 +01:00

Creating a separate conda environment for the vllm backend:

make vllm