LocalAI/core
Latest commit: 939411300a by Ludovic Leroux, 2024-03-01 22:48:53 +01:00
Bump vLLM version + more options when loading models in vLLM (#1782)

* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix install exllama
Directory   Last commit                                                             Date
backend     Bump vLLM version + more options when loading models in vLLM (#1782)   2024-03-01 22:48:53 +01:00
config      Bump vLLM version + more options when loading models in vLLM (#1782)   2024-03-01 22:48:53 +01:00
http        refactor: move remaining api packages to core (#1731)                  2024-03-01 16:19:53 +01:00
schema      refactor: move remaining api packages to core (#1731)                  2024-03-01 16:19:53 +01:00
services    refactor: move remaining api packages to core (#1731)                  2024-03-01 16:19:53 +01:00
startup     refactor: move remaining api packages to core (#1731)                  2024-03-01 16:19:53 +01:00