dearwolf / LocalAI
Mirror of https://github.com/mudler/LocalAI.git, synced 2025-05-20 10:35:01 +00:00
LocalAI / backend (at d5d82ba344)
Latest commit 697c769b64 by Ettore Di Giacinto: fix(llama.cpp): enable cont batching when parallel is set (#1622)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-01-21 14:59:48 +01:00
cpp            fix(llama.cpp): enable cont batching when parallel is set (#1622)    2024-01-21 14:59:48 +01:00
go             Revert "[Refactor]: Core/API Split" (#1550)                           2024-01-05 18:04:46 +01:00
python         feat(extra-backends): Improvements, adding mamba example (#1618)     2024-01-20 17:56:08 +01:00
backend.proto  feat: 🐍 add mamba support (#1589)                                    2024-01-19 23:42:50 +01:00