dearwolf / LocalAI
Mirror of https://github.com/mudler/LocalAI.git (last synced 2025-06-27 05:04:59 +00:00)
LocalAI / backend, at commit f0f2c87553
Latest commit f0f2c87553 by TheDropZone, 2025-02-18 08:21:26 -05:00:
    Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt
    Signed-off-by: TheDropZone <brandonbeiler@gmail.com>
cpp            fix(llama.cpp): improve context shift handling (#4820)                                     2025-02-14 14:55:03 +01:00
go             chore(llama-ggml): drop deprecated backend (#4775)                                         2025-02-06 18:36:23 +01:00
python         Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt   2025-02-18 08:21:26 -05:00
backend.proto  Adding the following vLLM config options: disable_log_status, dtype, limit_mm_per_prompt   2025-02-18 08:21:26 -05:00
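
For context, the three options named in the latest commit correspond to vLLM engine arguments of (nearly) the same names. The sketch below is a hypothetical illustration, not code from this repository: the model name is a placeholder, and note that vLLM itself spells the logging flag disable_log_stats, whereas the commit's LocalAI config key is disable_log_status.

    # Hypothetical sketch of the vLLM engine arguments that the new LocalAI
    # config keys presumably map onto. Not code from this repository.
    from vllm import LLM

    llm = LLM(
        model="llava-hf/llava-1.5-7b-hf",  # placeholder multimodal model
        dtype="bfloat16",                  # "dtype": weight/activation precision
        disable_log_stats=True,            # "disable_log_status": silence periodic stats logging
        limit_mm_per_prompt={"image": 2},  # "limit_mm_per_prompt": cap multimodal items
                                           # per prompt (only relevant for multimodal models)
    )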