# LocalAI/backend/python/autogptq
Latest commit: Ludovic Leroux · 939411300a · 2024-03-01 22:48:53 +01:00

**Bump vLLM version + more options when loading models in vLLM (#1782)**

* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix install exllama
| File | Last commit | Date |
| --- | --- | --- |
| autogptq.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| autogptq.yml | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 22:48:53 +01:00 |
| backend_pb2.py | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 22:48:53 +01:00 |
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| Makefile | deps(conda): use transformers environment with autogptq (#1555) | 2024-01-06 15:30:53 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | deps(conda): use transformers environment with autogptq (#1555) | 2024-01-06 15:30:53 +01:00 |
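`backend_pb2.py` and `backend_pb2_grpc.py` are stubs generated from LocalAI's gRPC backend protocol, and `autogptq.py` implements the server side against them. For orientation, a LocalAI Python backend roughly follows the shape below. This is a minimal sketch, not the actual implementation: the `BackendServicer` base class, the `Reply`/`Result` message fields, and the `--addr` flag are assumptions about the generated stubs.

```python
# Minimal sketch of a LocalAI Python gRPC backend (illustrative only).
# Assumes backend_pb2 / backend_pb2_grpc were generated from LocalAI's
# backend.proto; message and method names below are assumptions.
import argparse
from concurrent import futures

import grpc

import backend_pb2
import backend_pb2_grpc


class BackendServicer(backend_pb2_grpc.BackendServicer):
    def Health(self, request, context):
        # Liveness probe LocalAI uses to check the backend is up.
        return backend_pb2.Reply(message=bytes("OK", "utf-8"))

    def LoadModel(self, request, context):
        # A real backend would load the quantized model here, e.g. via
        # AutoGPTQForCausalLM.from_quantized(request.Model, ...).
        return backend_pb2.Result(success=True, message="Model loaded")


def serve(address):
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=1))
    backend_pb2_grpc.add_BackendServicer_to_server(BackendServicer(), server)
    server.add_insecure_port(address)
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--addr", default="localhost:50051")
    serve(parser.parse_args().addr)
```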

## Creating a separate environment for the autogptq project

```
make autogptq
```
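Once the environment exists, `run.sh` launches the gRPC server inside it. To sanity-check that the backend is reachable, a small client can call its health endpoint. This is a hypothetical smoke test: the `BackendStub` and `HealthMessage` names and the default port are assumptions consistent with the sketch above.

```python
# Hypothetical smoke test for the backend's Health RPC (assumed names).
import grpc

import backend_pb2
import backend_pb2_grpc

channel = grpc.insecure_channel("localhost:50051")
stub = backend_pb2_grpc.BackendStub(channel)
reply = stub.Health(backend_pb2.HealthMessage())
print(reply.message)  # expected: b"OK"
```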