LocalAI/backend/python/vllm
commit eda36347a5 (dependabot[bot], 2025-02-03 18:58:30 +00:00)

chore(deps): Bump torch in /backend/python/vllm

Bumps torch from 2.3.1+cxx11.abi to 2.6.0+cu118.

---
updated-dependencies:
- dependency-name: torch
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
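The +cu118 suffix means the wheel comes from the PyTorch CUDA 11.8 package index rather than PyPI. As a rough sketch of what this bump amounts to (the exact pins live in the requirements-cublas11.txt listed below), installing it by hand with uv, which these backends have used since the #2215 migration, would look like:

    # assumption: the cu118 wheels come from the usual PyTorch extra index
    uv pip install "torch==2.6.0+cu118" --extra-index-url https://download.pytorch.org/whl/cu118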
File                              Last commit                                                                       Date
backend.py                        feat(vllm): expose 'load_format' (#3943)                                         2024-10-23
install.sh                        chore(deps): bump grpcio to 1.68.1 (#4301)                                        2024-12-02
Makefile                          feat: create bash library to handle install/run/test of python backends (#2286)  2024-05-11
README.md                         refactor: move backends into the backends directory (#1279)                      2023-11-13
requirements-after.txt            fix(python): move vllm to after deps, drop diffusers main deps                   2024-08-07
requirements-cpu.txt              chore(deps): Bump torch in /backend/python/vllm                                  2025-02-03
requirements-cublas11-after.txt   feat(venv): shared env (#3195)                                                   2024-08-07
requirements-cublas11.txt         chore(deps): Bump torch in /backend/python/vllm                                  2025-02-03
requirements-cublas12-after.txt   feat(venv): shared env (#3195)                                                   2024-08-07
requirements-cublas12.txt         chore(deps): Bump torch in /backend/python/vllm                                  2025-02-03
requirements-hipblas.txt          chore(deps): Bump torch in /backend/python/vllm                                  2025-02-03
requirements-install.txt          feat: migrate python backends from conda to uv (#2215)                           2024-05-10
requirements-intel.txt            chore(deps): Bump torch in /backend/python/vllm                                  2025-02-03
requirements.txt                  chore(deps): bump grpcio to 1.70.0 (#4682)                                        2025-01-24
run.sh                            feat: create bash library to handle install/run/test of python backends (#2286)  2024-05-11
test.py                           feat(vllm): add support for embeddings (#3440)                                   2024-09-02
test.sh                           feat: create bash library to handle install/run/test of python backends (#2286)  2024-05-11

Creating a separate environment for the vllm project:

    make vllm
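For a fuller picture, here is a hedged sketch of the install/run/test flow around that target, assuming the shared bash library from #2286 and the uv-based installs from #2215; the authoritative steps live in the Makefile, install.sh, run.sh, and test.sh listed above:

    # run from backend/python/vllm (working directory assumed)
    make vllm      # drives install.sh: create the uv-managed venv and install the requirements files
    bash run.sh    # start the vllm gRPC backend inside that environment
    bash test.sh   # exercise the backend with test.py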