Mirror of https://github.com/mudler/LocalAI.git
feat(extra-backends): Improvements
* vllm: add max_tokens, wire up stream event
* mamba: fixups, adding examples for mamba-chat
* examples(mamba-chat): add
* docs: update
Directory contents:
* backend_pb2.py
* backend_pb2_grpc.py
* backend_vllm.py
* Makefile
* README.md
* run.sh
* test.sh
* test_backend_vllm.py
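The generated stubs listed above (backend_pb2.py, backend_pb2_grpc.py) expose the vllm backend over gRPC. As an illustration only, a minimal client could look roughly like the sketch below; the stub class, message name, and port (BackendStub, HealthMessage, 50051) are assumptions drawn from typical LocalAI gRPC backends rather than from this directory, so check backend.proto for the real definitions.

```python
# Minimal client sketch (illustrative, not from the repo). Assumes the
# backend process is already running locally and that the generated code
# defines a BackendStub with a Health RPC taking a HealthMessage; both
# names are assumptions to verify against backend.proto.
import grpc

import backend_pb2
import backend_pb2_grpc


def health_check(address: str = "localhost:50051") -> None:
    # Plaintext channel to the locally running backend process.
    with grpc.insecure_channel(address) as channel:
        stub = backend_pb2_grpc.BackendStub(channel)
        # Assumed RPC: Health(HealthMessage) returns a reply message.
        reply = stub.Health(backend_pb2.HealthMessage())
        print(reply)


if __name__ == "__main__":
    health_check()
```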
Creating a separate environment for the vllm project:

make vllm
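As a hedged sanity check (not part of the repo), and assuming the Makefile target sets up a conda environment named "vllm" (both the use of conda and the environment name are assumptions; inspect the Makefile to confirm), you could verify the environment from Python like this:

```python
# Hedged sanity check, not from the repo: assumes `make vllm` created a
# conda environment named "vllm" (the tool and the name are assumptions).
import subprocess

result = subprocess.run(
    ["conda", "run", "-n", "vllm", "python", "-c",
     "import vllm; print(vllm.__version__)"],
    capture_output=True,
    text=True,
)
# Prints the installed vllm version on success, or the error output otherwise.
print(result.stdout or result.stderr)
```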